deterministic behavior
Recently Published Documents


TOTAL DOCUMENTS: 51 (five years: 13)
H-INDEX: 11 (five years: 1)

Author(s):  
Lars Kegel ◽  
Claudio Hartmann ◽  
Maik Thiele ◽  
Wolfgang Lehner

Abstract
Processing and analyzing time series datasets has become a central issue in many domains, requiring data management systems to support time series as a native data type. A core access primitive for time series is matching, which requires efficient algorithms on top of appropriate representations such as the symbolic aggregate approximation (SAX), the current state of the art. This technique reduces a time series to a low-dimensional space by segmenting it and discretizing each segment into a small symbolic alphabet. Unfortunately, SAX ignores the deterministic behavior of time series, such as cyclically repeating patterns or a trend component affecting all segments, which may lead to sub-optimal representation accuracy. We therefore introduce a novel season-aware and trend-aware symbolic approximation and demonstrate improved representation accuracy without increasing the memory footprint. Most importantly, our techniques also enable more efficient time series matching, providing a match up to three orders of magnitude faster than SAX.
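
The baseline that the paper improves on is straightforward to sketch. The following minimal SAX implementation (a generic illustration, not the authors' season- and trend-aware variant) assumes a 4-symbol alphabet with the standard Gaussian breakpoints for equiprobable regions:

```python
import numpy as np

# Breakpoints that split the standard normal into 4 equiprobable regions,
# one per alphabet symbol (the usual SAX lookup values for alphabet size 4).
BREAKPOINTS = np.array([-0.6745, 0.0, 0.6745])
ALPHABET = "abcd"

def sax(series, n_segments):
    """Reduce a series to a SAX word: z-normalize, PAA-segment, discretize."""
    z = (series - series.mean()) / series.std()
    # Piecewise Aggregate Approximation: mean of each equal-width segment.
    paa = np.array([seg.mean() for seg in np.array_split(z, n_segments)])
    return "".join(ALPHABET[np.searchsorted(BREAKPOINTS, v)] for v in paa)
```

A pure upward trend consumes the whole alphabet (sax(np.arange(16.0), 4) yields "abcd"), which illustrates how a trend component dominates every segment's symbol, the weakness the season- and trend-aware variants address.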


Author(s):  
Saiful Bahri Hisamudin et al.

Uncontrollability is troublesome for unit testing. It causes non-deterministic behavior, where the same input can produce different results across executions. This non-deterministic characteristic makes it impossible to test the internal logic of a method: such code suffers from tight coupling, violates the single responsibility principle, and tends to be untestable, non-reusable, and hard to maintain. This paper describes a tool, the non-deterministic Code Detection Tool (nCODET), that aims to assist novice developers in writing testable code by avoiding non-deterministic characteristics in their code. Our research focuses on the unit testability of classes, particularly the effort involved in constructing unit test cases.
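
As a minimal illustration of the kind of code smell such a tool targets (the function names here are hypothetical, not nCODET's API), compare a method that reads the wall clock directly with a refactoring that injects the time source:

```python
from datetime import datetime

# Non-deterministic: the outcome depends on the wall clock at execution time,
# so a unit test cannot control the result.
def greeting_untestable():
    return "Good morning" if datetime.now().hour < 12 else "Good afternoon"

# Testable refactoring: the time source is injected as a parameter, so a test
# supplies a fixed timestamp and the result is deterministic.
def greeting(now):
    return "Good morning" if now.hour < 12 else "Good afternoon"
```

The same dependency-injection move applies to other sources of uncontrollability (random numbers, file systems, network calls): pass the collaborator in rather than reaching for it inside the method.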


Electronics ◽  
2021 ◽  
Vol 10 (6) ◽  
pp. 689
Author(s):  
Tom Springer ◽  
Elia Eiroa-Lledo ◽  
Elizabeth Stevens ◽  
Erik Linstead

As machine learning becomes ubiquitous, the need to deploy models on real-time, embedded systems will become increasingly critical. This is especially true for deep learning solutions, whose large models pose interesting challenges for target architectures at the “edge” that are resource-constrained. The realization of machine learning, and deep learning, is being driven by the availability of specialized hardware, such as system-on-chip solutions, which provide some alleviation of constraints. Equally important, however, are the operating systems that run on this hardware, and specifically the ability to leverage commercial real-time operating systems which, unlike general-purpose operating systems such as Linux, can provide the low-latency, deterministic execution required for embedded, and potentially safety-critical, applications at the edge. Despite this, studies considering the integration of real-time operating systems, specialized hardware, and machine learning/deep learning algorithms remain limited. In particular, better mechanisms for real-time scheduling in the context of machine learning applications will prove to be critical as these technologies move to the edge. In order to address some of these challenges, we present a resource management framework designed to provide a dynamic on-device approach to the allocation and scheduling of limited resources in a real-time processing environment. These types of mechanisms are necessary to support the deterministic behavior required by the control components contained in the edge nodes. To validate the effectiveness of our approach, we applied rigorous schedulability analysis to a large set of randomly generated simulated task sets and then verified the most time-critical applications, such as the control tasks, which maintained low-latency, deterministic behavior even during off-nominal conditions.
The practicality of our scheduling framework was demonstrated by integrating it into a commercial real-time operating system (VxWorks) and then running a typical deep learning image processing application to perform simple object detection. The results indicate that our proposed resource management framework can be leveraged to facilitate integration of machine learning algorithms with real-time operating systems and embedded platforms, including widely used, industry-standard real-time operating systems.
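
The paper does not spell out its schedulability test, but the classic Liu and Layland utilization bound for rate-monotonic scheduling gives a flavor of how such an analysis works. The sketch below is a generic sufficient test, not the authors' framework:

```python
def rm_schedulable(tasks):
    """Liu-Layland utilization test for rate-monotonic scheduling.

    tasks: list of (wcet, period) pairs for independent periodic tasks.
    Returns (utilization, bound, schedulable). The test is sufficient but
    not necessary: task sets above the bound may still be schedulable.
    """
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1.0 / n) - 1)
    return utilization, bound, utilization <= bound
```

For example, three tasks with WCET/period pairs (1, 4), (1, 5), (2, 10) have utilization 0.65, below the n = 3 bound of about 0.78, so they pass; an overloaded pair like (3, 4), (2, 5) fails.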


2021 ◽  
Author(s):  
Yan Yan ◽  
Wenxuan Xu ◽  
Sandip Kumar ◽  
Alexander Zhang ◽  
Fenfei Leng ◽  
...  

Protein-mediated DNA looping is a fundamental mechanism of gene regulation. Such loops occur stochastically, and a calibrated response to environmental stimuli would seem to require more deterministic behavior, so experiments were performed to determine whether additional proteins and/or DNA supercoiling might be decisive. In experiments on DNA looping mediated by the Escherichia coli lac repressor protein, increasing compaction by the nucleoid-associated protein HU progressively increased the average looping probability for an ensemble of single molecules. Despite this trend, the looping probabilities associated with individual molecules ranged from 0 to 100% throughout the titration, and observations of a single molecule for an hour or longer were required to observe the statistical looping behavior of the ensemble (ergodicity). Increased negative supercoiling also increased the looping probability for an ensemble of molecules, but the looping probabilities of individual molecules more closely resembled the ensemble average. Furthermore, supercoiling accelerated the loop dynamics such that in as little as twelve minutes of observation most molecules exhibited the looping probability of the ensemble. Notably, this is within the timescale of the doubling time of the bacterium. DNA supercoiling, an inherent feature of genomes across kingdoms, appears to be a fundamental determinant of time-constrained, emergent behavior in otherwise random molecular activity.
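
The ergodicity argument can be made concrete with a toy two-state (looped/unlooped) telegraph model, a deliberate simplification rather than the authors' analysis: the time-averaged loop fraction of one molecule approaches the ensemble looping probability only when the observation window spans many transitions, which is why faster dynamics (as under supercoiling) shorten the required observation time:

```python
import random

def simulate_loop_fraction(k_loop, k_unloop, t_total, seed=0):
    """Two-state telegraph process with exponential dwell times.

    Returns the fraction of observation time spent looped, which approaches
    the ensemble probability k_loop / (k_loop + k_unloop) as t_total grows
    relative to the mean dwell times.
    """
    rng = random.Random(seed)
    t, looped, t_looped = 0.0, False, 0.0
    while t < t_total:
        rate = k_unloop if looped else k_loop
        dwell = min(rng.expovariate(rate), t_total - t)
        if looped:
            t_looped += dwell
        t, looped = t + dwell, not looped
    return t_looped / t_total
```

Doubling both rates leaves the ensemble probability unchanged but halves the observation time needed for a single molecule's time average to converge to it.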


2021 ◽  
Vol 33 (5) ◽  
pp. 117-136
Author(s):  
Sergey Evgenievich Prokopev

Use of formal methods in the development of protocol implementations improves the quality of those implementations. The greatest benefit would result from formalizing the primary specifications usually contained in RFC documents. This paper proposes a formal language for primary specifications of cryptographic protocols, which aims to fulfill, to a higher degree than existing approaches, the conditions required of primary specifications: they have to be concise, declarative, expressive, unambiguous and executable; in addition, the tools supporting the language have to provide the possibility of automatically deriving high-quality test suites from the specifications. The proposed language is based on a machine (dubbed the C2-machine) specifically designed for the domain of cryptographic protocols. A protocol specification is defined as a program of the C2-machine. This program consists of two parts: the definition of the protocol packets and the definition of the non-deterministic behavior of the protocol parties. One execution of the program simulates one run of the protocol. All traces that can potentially be produced by such an execution comprise, by definition, the conformant traces of the protocol; in other words, the program of the C2-machine defines the operational contract of the protocol. In the paper, to make the design and operational principles of the C2-machine easier to understand, two abstractions of the C2-machine are presented: the C0-machine and the C1-machine. The C0-machine is used to explain the approach taken in expressing the non-determinism of protocols. The abstraction level of the C1-machine (a refinement of the C0-machine) is sufficient to define the semantics of the basic C2-machine instructions. To enhance the readability of programs and to reach the level of declarativeness and conciseness of the formalized notations used in some conventional RFCs (e.g., the TLS RFCs), the C2-machine implements some syntactic tricks on top of the basic instructions. To use C2-specifications in black-box testing, a special form of the C2-machine (the C2-machine with an excluded role) is presented. Throughout the paper, the proposed concepts are illustrated with examples from the TLS protocol.
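
The idea that one non-deterministic program defines the set of all conformant traces can be illustrated with a generic sketch (this is not the C2-machine's instruction set; the states and messages below are hypothetical):

```python
def traces(state, transitions, trace=()):
    """Enumerate every message trace a non-deterministic machine can produce.

    transitions: dict mapping a state to a list of (message, next_state)
    alternatives; a state with no outgoing transitions ends a trace.
    """
    outgoing = transitions.get(state, [])
    if not outgoing:
        yield trace
        return
    for message, next_state in outgoing:
        yield from traces(next_state, transitions, trace + (message,))

# Hypothetical, heavily simplified handshake: after negotiation the server
# may or may not request a client certificate, giving two conformant traces.
handshake = {
    "start":      [("ClientHello", "hello_sent")],
    "hello_sent": [("ServerHello", "negotiated")],
    "negotiated": [("CertificateRequest", "cert_phase"), ("Finished", "done")],
    "cert_phase": [("Finished", "done")],
}
```

Executing the machine once simulates one protocol run; the union of all possible executions is the operational contract, and a black-box tester checks that an implementation's observed traces fall inside that set.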


2020 ◽  
Vol 14 ◽  
Author(s):  
Pierre Bonzon

Living organisms have either innate or acquired mechanisms for reacting to percepts with an appropriate behavior, e.g., by escaping from the source of a perception detected as a threat, or conversely by approaching a target perceived as potential food. In the case of artifacts, such capabilities must be built in through either wired connections or software. The problem addressed here is to define a neural basis for such behaviors to be learned by bio-inspired artifacts. Toward this end, a thought experiment involving an autonomous vehicle is first simulated as a random search. The stochastic decision tree that drives this behavior is then transformed into a plastic neuronal circuit. This leads the vehicle to adopt a deterministic behavior by learning and applying a causality rule, just as a conscious human driver would do. From there, a principle of using synchronized multimodal perceptions in association with the Hebbian principle of wiring together co-active neuronal cells is induced. This overall framework is implemented as a virtual machine, i.e., a concept widely used in software engineering. It is argued that such an interface, situated at a meso-scale level between abstracted micro-circuits representing synaptic plasticity on one hand and the emergence of behaviors on the other, allows for a strict delineation of successive levels of complexity. More specifically, isolating levels allows for simulating as-yet-unknown processes of cognition independently of their underlying neurological grounding.
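
The Hebbian wiring step can be sketched with the textbook rule (a generic illustration, not the paper's virtual-machine implementation): weights grow only between cells that are repeatedly co-active, so two synchronized percepts become jointly associated with a response:

```python
import numpy as np

def hebb_update(w, pre, post, lr=0.1):
    """Hebb rule: delta w_ij = lr * post_i * pre_j (co-active cells wire together)."""
    return w + lr * np.outer(post, pre)

# Two synchronized percepts (e.g., a visual threat cue and a proximity cue)
# repeatedly co-occur with an escape response; their weights grow together,
# while the weight of a silent third input stays at zero.
w = np.zeros((1, 3))
pre = np.array([1.0, 1.0, 0.0])   # third input never fires
post = np.array([1.0])            # escape behavior active
for _ in range(5):
    w = hebb_update(w, pre, post)
```

After the five paired presentations, both synchronized inputs alone can drive the response through their strengthened weights, which is the associative mechanism the abstract appeals to.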


Author(s):  
Dmitri V. Alexandrov ◽  
Irina A. Bashkirtseva ◽  
Lev B. Ryashko ◽  
Michel Crucifix

2020 ◽  
Vol 9 (9) ◽  
pp. e637997737 ◽  
Author(s):  
Leika Irabele Tenório de Santana ◽  
Antonio Samuel Alves da Silva ◽  
Rômulo Simões Cezar Menezes ◽  
Tatijana Stosic

Precipitation is the main climatic variable used for modeling risk indices for natural disasters. We investigated the nonlinear dynamics of monthly rainfall time series recorded from 1962 to 2012 at three stations in Pernambuco state, Brazil, located in regions with different rainfall regimes (Zona da Mata, Agreste and Sertão), provided by the Meteorological Laboratory of the Institute of Technology of Pernambuco (Laboratório de Meteorologia do Instituto de Tecnologia de Pernambuco – LAMEP/ITEP). The objective of this work is to contribute to a better understanding of the spatiotemporal distribution of rainfall in the state of Pernambuco. We use a methodology from nonlinear dynamics theory, the recurrence plot (RP), which makes it possible to distinguish between different types of underlying processes. The results showed that the rainfall regime in the deep-inland semiarid Sertão region is characterized by weaker and less complex deterministic behavior compared to Zona da Mata and Agreste, where we identified transitions between chaotic and nonstationary types of dynamics. For the transitional Agreste region, rainfall dynamics showed stronger memory with a longer mean prediction time, while for the sub-humid Zona da Mata, rainfall dynamics are characterized by laminar (slowly changing) states.
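
In its simplest (unembedded) form, a recurrence plot is just a thresholded distance matrix; a minimal sketch, assuming a scalar series and a fixed threshold eps:

```python
import numpy as np

def recurrence_matrix(series, eps):
    """Binary recurrence plot: R[i, j] = 1 when |x_i - x_j| < eps."""
    x = np.asarray(series, dtype=float)
    return (np.abs(x[:, None] - x[None, :]) < eps).astype(int)

def recurrence_rate(r):
    """Density of recurrence points, the simplest RP-based measure."""
    return r.mean()
```

Deterministic dynamics show up as diagonal line structures in R (states recurring and evolving similarly), laminar states as vertical/horizontal lines, and nonstationarity as fading away from the main diagonal; full recurrence quantification analysis builds its measures from these structures.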


2020 ◽  
Author(s):  
Michel Crucifix ◽  
Dmitri Alexandrov ◽  
Irina Bashkirtseva ◽  
Lev Ryashko

Glacial-interglacial cycles are global climatic changes which have characterised the last 3 million years. The eight latest glacial-interglacial cycles represent changes in sea level of over 100 m, and their average duration was around 100,000 years. There is a long tradition of modelling glacial-interglacial cycles with low-order dynamical systems. In one view, the cyclic phenomenon is caused by non-linear interactions between components of the climate system: the dynamical system model which represents Earth dynamics has a limit cycle. In another view, the variations in ice volume and ice sheet extent are caused by changes in Earth's orbit, possibly amplified by feedbacks. This response and the internal feedbacks need to be non-linear to explain the asymmetric character of glacial-interglacial cycles and their duration. A third view sees glacial-interglacial cycles as a limit cycle synchronised to the orbital forcing.

The purpose of the present contribution is to pay specific attention to the effects of stochastic forcing. Indeed, the trajectories obtained in the presence of noise are not necessarily noised-up versions of the deterministic trajectories. They may follow pathways which have no analogue in the deterministic version of the model. Our purpose is to demonstrate the mechanisms by which stochastic excitation may generate such large-scale oscillations and induce intermittency. To this end, we consider a series of models previously introduced in the literature, starting with autonomous models with two variables, and then three variables. The properties of stochastic trajectories are understood by reference to the bifurcation diagram, the vector field, and a method called stochastic sensitivity analysis. We then introduce models accounting for the orbital forcing, distinguish forced and synchronised ice-age scenarios, and show again how noise may generate trajectories which have no immediate analogue in the deterministic model.
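
The point that noisy trajectories may follow pathways with no deterministic analogue can be sketched with Euler-Maruyama integration of a double-well system (a generic toy model, not the glacial-cycle models discussed above):

```python
import math
import random

def euler_maruyama(drift, x0, sigma, dt, n_steps, seed=1):
    """Integrate dx = drift(x) dt + sigma dW with the Euler-Maruyama scheme."""
    rng = random.Random(seed)
    x, path = x0, [x0]
    for _ in range(n_steps):
        x = x + drift(x) * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        path.append(x)
    return path

# Double-well drift x - x**3: without noise, a trajectory started in the left
# well settles at x = -1 and never crosses the barrier at x = 0; with noise,
# it can visit regions the deterministic model never reaches.
drift = lambda x: x - x ** 3
quiet = euler_maruyama(drift, -0.5, sigma=0.0, dt=0.01, n_steps=2000)
noisy = euler_maruyama(drift, -0.5, sigma=0.8, dt=0.01, n_steps=2000)
```

With strong enough noise, rare barrier crossings produce large, quasi-cyclic excursions between the wells, the kind of noise-induced large-scale oscillation and intermittency the abstract refers to.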

