Abstracting timing information in UML state charts via temporal ordering and LOTOS

Author(s):
Valentine Chimisliu ◽
Franz Wotawa


2018 ◽
Vol 26 (2) ◽
Author(s):
Dean A. Forbes

In a recent essay published in this journal, I illustrated the limitations one may encounter when sequencing texts temporally using s-curve analysis. I also introduced seriation, a more reliable method for temporal ordering much used in both archaeology and computational biology. Lacking independently ordered Biblical Hebrew (BH) data to assess the potential power of seriation in the context of diachronic studies, I used classic Middle English data originally compiled by Ellegård. In this addendum, I reintroduce and extend s-curve analysis, applying it to one rather noisy feature of Middle English. My results support Holmstedt’s assertion that s-curve analysis can be a useful diagnostic tool in diachronic studies. Upon quantitative comparison, however, the five-feature seriation results derived in my former paper are found to be seven times more accurate than the single-feature s-curve results presented here. 
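For readers unfamiliar with the technique, s-curve analysis fits a logistic curve to the relative frequency of an incoming linguistic variant over time; the fitted midpoint estimates when the variant reached 50% adoption, which is what anchors a temporal ordering of texts. A minimal sketch, using invented illustrative frequencies rather than Ellegård's Middle English counts:

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, k, t0):
    """S-curve: fraction of texts using the newer variant at time t."""
    return 1.0 / (1.0 + np.exp(-k * (t - t0)))

# Hypothetical usage fractions for an incoming feature across dated texts.
years = np.array([1400, 1425, 1450, 1475, 1500, 1525, 1550], dtype=float)
frac = np.array([0.05, 0.10, 0.25, 0.50, 0.75, 0.90, 0.97])

# Fit growth rate k and midpoint t0; t0 estimates when the feature
# reached 50% adoption. Undated texts can then be placed on the curve
# by inverting the fit at their observed usage fraction.
(k, t0), _ = curve_fit(logistic, years, frac, p0=[0.05, 1475.0])
print(f"midpoint ~ {t0:.0f}, growth rate ~ {k:.3f} per year")
```

Noisy single-feature counts make both parameters unstable, which is consistent with the accuracy gap reported above between one s-curve feature and a five-feature seriation.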


Electronics ◽  
2021 ◽  
Vol 10 (13) ◽  
pp. 1587
Author(s):  
Duo Sheng ◽  
Hsueh-Ru Lin ◽  
Li Tai

High-performance, complex system-on-chip (SoC) designs require a high-throughput, stable timing monitor to reduce the impact of timing uncertainty and to implement a dynamic voltage and frequency scaling (DVFS) scheme for overall power reduction. This paper presents a multi-stage timing monitor that combines three timing-monitoring stages to achieve both high timing-monitoring resolution and a wide timing-monitoring range. Additionally, because the proposed timing monitor has high immunity to process–voltage–temperature (PVT) variation, it delivers more stable timing-monitoring results. The timing-monitoring resolution and range of the proposed monitor are 47 ps and 2.2 µs, respectively, and the maximum measurement error is 0.06%. The proposed multi-stage timing monitor therefore not only provides the timing information of the specified signals to maintain the functionality and performance of the SoC, but also makes the operation of the DVFS scheme more efficient and accurate in SoC design.
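As a back-of-envelope check on the figures above (a sketch, not the paper's circuit): a monitor with 47 ps resolution spanning a 2.2 µs range must distinguish roughly 47,000 time steps, i.e. it needs about a 16-bit measurement word:

```python
import math

resolution = 47e-12   # timing-monitoring resolution: 47 ps
full_range = 2.2e-6   # timing-monitoring range: 2.2 us

steps = full_range / resolution      # distinguishable time steps across the range
bits = math.ceil(math.log2(steps))   # counter width needed to span the range

print(f"{steps:.0f} steps -> {bits}-bit measurement")
```

This large range-to-resolution ratio is presumably what motivates combining three monitoring stages rather than relying on a single delay line or counter.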


2020 ◽  
Vol 91 (3) ◽  
pp. 1531-1541
Author(s):  
Paul G. Richards ◽  
Margaret Hellweg

Abstract Quantitative seismology is based firmly on the analysis of actual ground motions, and the transition to digital recording in the 1980s enabled sophisticated new capabilities to extract useful results from waveforms. With some effort, these tools can also be applied to analog records. Focusing on assets available within U.S. institutions, we review the necessary steps and the challenges in enabling “data rescue”—that is, preserving the scientific information latent in large analog seismogram archives and making it usable. They include: determining what assets are available (the analog seismogram archives held by various institutions, with associated metadata on instrument responses, station locations, and timing information); developing a consensus on the top level of a triage process (which analog records most definitely should be rescued?); deciding the level of quality needed in copying original seismograms to media suitable for digitizing; assessing the relative merits of scanning and digitizing; and the need for a community service for distributing scans and digital records as they accumulate. The necessary level of effort can benefit from practical experience. For example, specific studies have used digitized versions of analog recordings to model earthquake sources and assess seismic hazard. Other studies have used them to gain experience with nuclear explosion signals recorded at regional distances, noting that regional signals enable explosions to be monitored down to levels much lower than those attainable teleseismically.
The opportunities presented by large archives of analog seismograms include the insights they offer current and future seismologists studying earthquakes and explosions: into the practical areas of assessing seismic hazard and monitoring for test-ban compliance down to low explosion yields (including prompt characterization of actual explosions should they occur), as well as into the traditional academic pursuit of a better understanding of earthquake physics.


2014 ◽  
Vol 4 (1) ◽  
Author(s):  
V. Uma ◽  
G. Aghila

Abstract OWL (Web Ontology Language) is the standard language for the Semantic Web and is used to define ontologies for the Web. Temporal event data are ubiquitous in nature. Temporal data can be represented qualitatively in OWL using temporal relations, enabling the temporal ordering of events, which plays a vital role in task planners. Allen’s interval algebra is a well-known formalism for representing and reasoning about temporal knowledge, and the basic Allen temporal interval relations can be used to describe relations in OWL. In this work, Allen’s interval algebra is extended with Reference Event based Temporal (REseT) relations to reduce the ambiguity of the before relation. The extended formalism is used to represent relations between time intervals, and the viability of ordering events in an ontology is elucidated. This paper proposes an event ordering system based on temporal knowledge representation and reasoning that supports the temporal ordering of events. The advantage of this method is that it introduces no additional constructs into OWL, so existing reasoning tools and DL-based query languages can generate the linear order of events. The system is investigated experimentally on the COW (Correlates of War) dataset and has been evaluated using the Percent_Similarity measure.
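The thirteen basic Allen relations the abstract refers to can be computed directly from interval endpoints. A minimal sketch of the classification (a hypothetical helper for illustration, not the paper's REseT extension or its OWL encoding):

```python
def allen_relation(a, b):
    """Classify interval a against interval b using Allen's 13 basic
    relations. Intervals are (start, end) pairs with start < end."""
    (a1, a2), (b1, b2) = a, b
    if a2 < b1:
        return "before"          # a ends strictly before b starts
    if a2 == b1:
        return "meets"           # a ends exactly where b starts
    if b2 < a1:
        return "after"
    if b2 == a1:
        return "met-by"
    if a1 == b1 and a2 == b2:
        return "equals"
    if a1 == b1:
        return "starts" if a2 < b2 else "started-by"
    if a2 == b2:
        return "finishes" if a1 > b1 else "finished-by"
    if b1 < a1 and a2 < b2:
        return "during"          # a lies strictly inside b
    if a1 < b1 and b2 < a2:
        return "contains"
    # Only the two strict-overlap cases remain.
    return "overlaps" if a1 < b1 else "overlapped-by"

print(allen_relation((1, 3), (2, 4)))  # overlaps: a starts first, they share (2, 3)
```

The REseT extension described above refines before by relating both intervals to a reference event; the basic classification here is the unextended algebra it builds on.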

