sampling frequency
Recently Published Documents


TOTAL DOCUMENTS

1008
(FIVE YEARS 281)

H-INDEX

39
(FIVE YEARS 5)

2022 ◽  
Author(s):  
Matthew Bailey ◽  
Mark Wilson

One of the critical tools of persistent homology is the persistence diagram. We demonstrate the applicability of a persistence diagram showing the existence of topological features (here, rings in a 2D network) generated over time instead of space as a tool for analysing trajectories of biological networks. We show how the time persistence diagram is useful for identifying critical phenomena such as rupturing and for visualising important features in 2D biological networks; it is particularly useful for highlighting patterns of damage and for identifying whether particular patterns are significant or ephemeral. Persistence diagrams are also used to analyse repair phenomena, and we explore how the measured properties of a dynamical phenomenon change with the sampling frequency. This shows that persistence diagrams are robust and still provide useful information even for data of low temporal resolution. Finally, we combine persistence diagrams across many trajectories to show how the technique highlights the existence of sharp transitions at critical points in the rupturing process.
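A minimal sketch of the idea, under the assumption that each network snapshot yields a set of labelled ring features: a "time persistence diagram" pairs each ring by the frame where it first appears (birth) and the frame where it vanishes (death), so ephemeral features sit near the diagonal and significant ones far from it. The per-frame labelling is a hypothetical input; in practice it would come from a topological analysis of each snapshot.

```python
def time_persistence_diagram(frames):
    """frames: list of sets of ring labels, one set per sampled time step.
    Returns a list of (birth_frame, death_frame) pairs."""
    birth = {}                               # label -> frame where the ring appeared
    pairs = []
    alive = set()
    for t, rings in enumerate(frames):
        for label in rings - alive:          # newly born rings
            birth[label] = t
        for label in alive - rings:          # rings that just vanished
            pairs.append((birth.pop(label), t))
        alive = set(rings)
    for label in alive:                      # rings still alive at the end
        pairs.append((birth[label], len(frames)))
    return pairs

# Toy trajectory: ring "a" persists, ring "b" is ephemeral.
frames = [{"a"}, {"a", "b"}, {"a"}, {"a"}, set()]
print(time_persistence_diagram(frames))      # [(1, 2), (0, 4)]
```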


Sensors ◽  
2022 ◽  
Vol 22 (1) ◽  
pp. 332
Author(s):  
Emilio García ◽  
Neisser Ponluisa ◽  
Eduardo Quiles ◽  
Ranko Zotovic-Stanisic ◽  
Santiago C. Gutiérrez

This work proposes a method for real-time supervision and predictive fault diagnosis applicable to solar panel strings in real-world installations. It is focused on the detection and parametric isolation of fault symptoms through the analysis of the Voc-Isc curves. The method performs early, systematic, online, automatic, and permanent predictive supervision and diagnosis at a high sampling frequency. It is based on the supervision of predictive electrical parameters made easily accessible by the design of its architecture; detection and isolation occur with an adequate margin of maneuver, so that the system can raise an alert and, by automatic disconnection, stop the degradation phenomenon whose cumulative effect would otherwise develop into a future irrecoverable failure. The architecture is scalable and can be integrated into conventional photovoltaic installations. It emphasizes the use of low-cost technology such as the ESP8266 module, ACS712-5A and FZ0430 sensors, and relay modules. Data are acquired with the ESP8266 module and sent over the internet to a computer running a SCADA system (iFIX V6.5), using the Modbus TCP/IP and OPC communication protocols. Detection thresholds are initially obtained experimentally by applying inductive shading methods to specific solar panels.
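A hedged sketch of the threshold-based symptom detection and automatic disconnection described above: per-string open-circuit voltage (Voc) and short-circuit current (Isc) readings are compared against experimentally obtained thresholds, and a string showing a fault symptom is disconnected before degradation accumulates. The threshold values and the helper functions are hypothetical stand-ins for the ESP8266/Modbus data path and the relay modules of the paper.

```python
VOC_MIN = 19.0   # V, hypothetical threshold from inductive shading tests
ISC_MIN = 4.2    # A, hypothetical threshold (the ACS712-5A reads up to 5 A)

# Simulated readings standing in for the ESP8266 -> Modbus TCP/IP path.
MEASUREMENTS = {"S1": (21.3, 4.8), "S2": (17.9, 4.7), "S3": (21.1, 3.1)}

def read_string(sid):
    """Hypothetical: fetch (Voc, Isc) for one panel string."""
    return MEASUREMENTS[sid]

def disconnect_string(sid):
    """Hypothetical: open the string's relay before damage accumulates."""
    print(f"relay opened for string {sid}")

def supervise(string_ids):
    for sid in string_ids:
        voc, isc = read_string(sid)
        symptoms = []
        if voc < VOC_MIN:
            symptoms.append(f"low Voc ({voc:.1f} V)")  # e.g. cell degradation
        if isc < ISC_MIN:
            symptoms.append(f"low Isc ({isc:.1f} A)")  # e.g. soiling/shading
        if symptoms:
            print(f"string {sid}: {', '.join(symptoms)}")
            disconnect_string(sid)

supervise(["S1", "S2", "S3"])
```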


2021 ◽  
Vol 20 ◽  
pp. 320-323
Author(s):  
Vaclav Skala

Cubic parametric curves are used in many applications, including CAD/CAM systems. In particular, the Hermite, Bezier, and Coons formulations of a cubic parametric curve are used in E2 and E3 space. This paper presents an efficient algorithm for computing the intersection of a cubic parametric curve with an axis-aligned bounding box (AABB). The usual solution is to represent the cubic curve by a polyline, i.e., by sampled points of the given curve. However, this approach depends on the sampling frequency and can lead to problems, especially in CAD/CAM systems and in the use of numerically controlled machines.
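A sketch of an exact (non-sampled) 2D curve/AABB intersection, as an alternative to the polyline approach: for each box face, solve a cubic for the parameter values where that coordinate of the curve equals the face value, then keep real roots in [0, 1] whose other coordinate lies inside the box. The coefficients are assumed to be in power basis (the Hermite, Bezier, and Coons forms convert to it); this is an illustration of the problem, not the paper's specific algorithm.

```python
import numpy as np

def curve_aabb_intersections(cx, cy, box_min, box_max, eps=1e-12):
    """cx, cy: power-basis coefficients [c3, c2, c1, c0] of x(t), y(t).
    Returns sorted parameter values t in [0, 1] where the curve crosses the box."""
    coeffs = {0: np.asarray(cx, float), 1: np.asarray(cy, float)}
    hits = []
    for axis in (0, 1):
        other = 1 - axis
        for face in (box_min[axis], box_max[axis]):
            c = coeffs[axis].copy()
            c[3] -= face                     # solve axis(t) - face = 0
            for t in np.roots(c):
                if abs(t.imag) > eps or not (0.0 <= t.real <= 1.0):
                    continue
                v = np.polyval(coeffs[other], t.real)
                if box_min[other] - eps <= v <= box_max[other] + eps:
                    hits.append(t.real)      # root lies on a box face
    return sorted(hits)

# Cubic degenerated to the line x(t) = t, y(t) = 0.5, against a half-size box.
print(curve_aabb_intersections([0, 0, 1, 0], [0, 0, 0, 0.5],
                               (0.25, 0.25), (0.75, 0.75)))   # [0.25, 0.75]
```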


Mathematics ◽  
2021 ◽  
Vol 10 (1) ◽  
pp. 5
Author(s):  
Mao Chen ◽  
Guanqi Liu ◽  
Yuwen Wang

At present, the study of pricing variance swaps under the CIR (Cox–Ingersoll–Ross)–Heston hybrid model has achieved many results; however, because the instantaneous interest rate and instantaneous volatility in the model follow the Feller square-root process, only a semi-closed-form solution can be obtained by solving PDEs. This paper presents a simplified approach to pricing log-return variance swaps under the CIR–Heston hybrid model. Compared with Cao's work, an important feature of our approach is that there is no need to solve complex PDEs; a closed-form solution is obtained by applying martingale theory and Itô's lemma. The closed-form solution is significant because it achieves accurate pricing and no longer requires time-consuming parameter adjustment by numerical methods. Another significant feature of this paper is that the impact of the sampling frequency on the pricing formula is analyzed, allowing the closed-form solution to be extended to an approximate formula. The price curves of the closed-form solution and the approximate solution are presented by numerical simulation. When the sampling frequency is large enough, the two curves almost coincide, which means that our approximate formula is simple and reliable.
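An illustrative Monte Carlo check (not the paper's closed-form result): simulate Heston dynamics with a full-truncation Euler scheme, compute the fair strike of a discretely sampled log-return variance swap at several sampling frequencies, and observe the convergence as the frequency grows, which is the effect the approximate formula exploits. The interest rate is held constant here for brevity; the paper also makes it a CIR process. All parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
r, kappa, theta, sigma, rho = 0.03, 2.0, 0.04, 0.3, -0.7
v0, T, n_paths = 0.04, 1.0, 20_000

def fair_strike(n_obs, n_sub=10):
    """Annualized fair strike of the sum of squared log returns, n_obs samples."""
    steps = n_obs * n_sub          # fine grid for the SDE, coarse for sampling
    dt = T / steps
    logS = np.zeros(n_paths)
    v = np.full(n_paths, v0)
    samples = [np.zeros(n_paths)]
    for i in range(1, steps + 1):
        z1 = rng.standard_normal(n_paths)
        z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal(n_paths)
        vp = np.maximum(v, 0.0)    # full truncation keeps the variance usable
        logS = logS + (r - 0.5 * vp) * dt + np.sqrt(vp * dt) * z1
        v = v + kappa * (theta - vp) * dt + sigma * np.sqrt(vp * dt) * z2
        if i % n_sub == 0:
            samples.append(logS.copy())
    rets = np.diff(np.array(samples), axis=0)
    return np.mean(np.sum(rets**2, axis=0)) / T   # realized variance, annualized

for n in (12, 52, 252):            # monthly, weekly, daily sampling
    print(n, fair_strike(n))
```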


2021 ◽  
Vol 2021 ◽  
pp. 1-12
Author(s):  
Jiantao Mu ◽  
Yin Han ◽  
Cheng Zhang ◽  
Jiao Yao ◽  
Jing Zhao

On-board data from detected vehicles play a critical role in the management of urban road traffic operation and the estimation of traffic status. Unfortunately, due to technological limitations and privacy issues, the sampling frequency of the detected vehicle data is low and the coverage is limited, so continuous vehicle trajectories cannot be obtained. To overcome these problems, this paper proposes an unscented Kalman filter (UKF)-based method to reconstruct trajectories at signalized intersections using sparse probe data of vehicles. We first divide the intersection into multiple road sections and solve a quadratic programming problem to estimate the travel time of each section. The weight of each initial candidate trajectory is calculated separately, the trajectory is updated using the UKF, and the trajectory between two updates is then obtained accordingly. Finally, the method is applied to an actual scenario provided by the NGSIM data and compared with the real trajectories. The mean absolute error (MAE) is adopted to evaluate the accuracy of the proposed trajectory reconstruction. A sensitivity analysis is conducted to determine the sampling frequency required to obtain highly accurate reconstructed vehicle trajectories with this method. The results demonstrate the applicability of the technique to signalized intersections. The method therefore enables us to obtain richer and more accurate trajectory data, providing a strong prior basis for urban road traffic management and for researchers using trajectory data in various studies.
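A minimal sketch of the UKF step at the heart of such a reconstruction, using the filterpy library with a constant-velocity model along the road: the state is [position, speed] and sparse probe positions are the measurements. The section travel times from the paper's quadratic program are not reproduced; the noise levels and toy measurements are assumptions.

```python
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

dt = 5.0                                  # seconds between sparse probe samples

def fx(x, dt):                            # constant-velocity motion model
    return np.array([x[0] + x[1] * dt, x[1]])

def hx(x):                                # we only observe position
    return x[:1]

points = MerweScaledSigmaPoints(n=2, alpha=0.1, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=2, dim_z=1, dt=dt, fx=fx, hx=hx, points=points)
ukf.x = np.array([0.0, 10.0])             # start: 0 m, 10 m/s
ukf.P *= 10.0
ukf.R = np.diag([4.0])                    # probe position noise (assumed)
ukf.Q = np.diag([0.5, 0.5])               # process noise (assumed)

for z in [48.0, 101.0, 143.0, 196.0]:     # sparse probe positions (toy data)
    ukf.predict()
    ukf.update(np.array([z]))
    print(f"pos {ukf.x[0]:7.1f} m, speed {ukf.x[1]:5.2f} m/s")
```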


2021 ◽  
Vol 13 (12) ◽  
pp. 5819-5830
Author(s):  
Xuebo Li ◽  
Yongxiang Huang ◽  
Guohua Wang ◽  
Xiaojing Zheng

Abstract. Partially due to global climate change, sand and dust storms (SDSs) have been occurring more and more frequently, yet detailed measurements of SDS events at different heights are still lacking. Here we provide a high-frequency observation from the Qingtu Lake Observation Array (QLOA), China. Wind and dust information were measured simultaneously at different wall-normal heights during the SDS process. The datasets span the period from 17 March to 9 June 2016. The wind speed and direction are recorded by sonic anemometers with a sampling frequency of 50 Hz, while particulate matter with a diameter of 10 µm or less (PM10) is sampled simultaneously by dust monitors with a sampling frequency of 1 Hz. The wall-normal array had 11 sonic anemometers and dust monitors spaced logarithmically from z=0.9 to 30 m, with a spacing of about 2 m between the sonic anemometer and the dust monitor at the same height. Based on its nonstationary features, an SDS event can be divided into three stages, i.e., ascending, stabilizing, and descending stages, in which the dynamic mechanisms of the wind and dust fields might differ. This is preliminarily characterized by classical Fourier power analysis. The temporal evolution of the scaling exponent from the Fourier power analysis suggests a value slightly below the classical Kolmogorov value of -5/3 for three-dimensional homogeneous and isotropic turbulence. During the stabilizing stage, the collected PM10 shows a very intermittent pattern, which can be further linked with burst events in the turbulent atmospheric boundary layer. This dataset is valuable for a better understanding of SDS dynamics and is publicly available in a Zenodo repository at https://doi.org/10.5281/zenodo.5034196 (Li et al., 2021a).
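A hedged sketch of the Fourier power analysis the abstract refers to: estimate the power spectral density of a wind-speed record sampled at 50 Hz with Welch's method, fit the scaling exponent over an assumed inertial subrange, and compare it with the Kolmogorov value of -5/3. The synthetic red-noise signal (exponent near -2) merely stands in for a QLOA anemometer record.

```python
import numpy as np
from scipy.signal import welch

fs = 50.0                                    # sonic anemometer sampling rate, Hz
t = np.arange(0, 600, 1 / fs)                # ten minutes of data
rng = np.random.default_rng(1)
u = np.cumsum(rng.standard_normal(t.size))   # red-noise stand-in for wind speed

f, pxx = welch(u, fs=fs, nperseg=4096)       # Welch power spectral density
band = (f > 0.1) & (f < 5.0)                 # assumed inertial subrange
slope, _ = np.polyfit(np.log(f[band]), np.log(pxx[band]), 1)
print(f"fitted scaling exponent: {slope:.2f} (Kolmogorov: {-5/3:.2f})")
```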


2021 ◽  
Author(s):  
A. Thorseth ◽  
J. Lindén ◽  
C.A. Bouroussis

Temporal light modulation (TLM) and the resulting temporal light artefacts (TLA) can cause problems with health and wellbeing for users of lighting products. TLM therefore has to be measured accurately and repeatably. This study investigates important factors influencing the measurement uncertainty of TLM measurements. The study shows how measurement uncertainty in central TLM parameters can have a significant effect on the calculation of TLA. Specifically, we show a linear relationship between the DC offset and the expected error. Further, we show severe effects of random noise on PstLM (the short-term flicker indicator for light) for certain waveforms. Lastly, we show a curious effect relating the stroboscopic visibility measure (SVM) of pulse-width-modulated signals to the measurement sampling frequency.
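An illustrative calculation (not the paper's analysis) of how a DC offset error propagates into a basic TLM parameter, the modulation depth (Michelson contrast): adding an offset e to a waveform with mean m scales the measured depth by m/(m+e), so the error grows approximately linearly in e, consistent with the linear relationship reported in the abstract. The sampling frequency and waveform are assumptions.

```python
import numpy as np

fs = 20_000.0                               # measurement sampling rate, Hz (assumed)
t = np.arange(0, 0.1, 1 / fs)
light = 1.0 + 0.2 * np.sin(2 * np.pi * 100 * t)   # 100 Hz TLM, 20 % true depth

def modulation_depth(x):
    """Michelson contrast: (max - min) / (max + min)."""
    return (x.max() - x.min()) / (x.max() + x.min())

for dc_error in (0.0, 0.05, 0.10):
    measured = light + dc_error             # offset introduced by the instrument
    print(f"offset {dc_error:.2f}: depth {100 * modulation_depth(measured):.1f} %")
```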


2021 ◽  
Vol 8 ◽  
Author(s):  
Philipp Fischer ◽  
Peter Dietrich ◽  
Eric P. Achterberg ◽  
Norbert Anselm ◽  
Holger Brix ◽  
...  

A thorough and reliable assessment of changes in sea surface water temperatures (SSWTs) is essential for understanding the effects of global warming on long-term trends in marine ecosystems and their communities. The first long-term temperature measurements were established almost a century ago, especially in coastal areas, and some of them are still in operation. However, while in earlier times these measurements were made by hand every day, current environmental long-term observation stations (ELTOS) are often fully automated and integrated into cabled underwater observatories (UWOs). With this new technology, year-round measurements became feasible even in remote or difficult-to-access areas, such as coastal areas of the Arctic Ocean in winter, where measurements were almost impossible just a decade ago. In this context, the question arises to what extent the sampling frequency and accuracy influence the results of long-term monitoring approaches. In this paper, we address this question with a combination of laboratory experiments on sensor accuracy and precision and a simulated sampling program with different sampling frequencies, based on a continuous water temperature dataset from Svalbard, Arctic, from 2012 to 2017. Our laboratory experiments showed that 12 different temperature sensor types, at different price ranges, all provided measurements accurate enough to resolve temperature changes over years at the level discussed in the literature on climate change effects in coastal waters. However, the experiments also revealed that some sensors are more suitable for measuring absolute temperature changes over time, while others are more suitable for determining relative temperature changes. Our simulated sampling program in Svalbard coastal waters over 5 years revealed that the selection of a proper sampling frequency is most relevant for discriminating significant long-term temperature changes from random daily, seasonal, or interannual fluctuations. While hourly and daily sampling delivered reliable, stable, and comparable results concerning temperature increases over time, weekly sampling was less able to reliably detect overall significant trends. With even lower sampling frequencies (monthly sampling), no significant temperature trend over time could be detected. Although the results were obtained for a specific site, they are transferable to other aquatic research questions and non-polar regions.
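A hedged sketch of the simulated sampling program: subsample a continuous temperature record at different frequencies and test whether the linear warming trend remains statistically significant. The synthetic series below (a weak trend plus a seasonal cycle and noise) stands in for the 2012-2017 Svalbard dataset; trend size, seasonal amplitude, and noise level are assumptions.

```python
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(2)
hours = np.arange(5 * 365 * 24)                        # five years, hourly
trend = 0.2 / (365 * 24)                               # 0.2 degC per year, per hour
temp = trend * hours \
     + 3.0 * np.sin(2 * np.pi * hours / (365 * 24)) \
     + rng.normal(0.0, 1.0, hours.size)                # trend + season + noise

for label, step in [("hourly", 1), ("daily", 24),
                    ("weekly", 24 * 7), ("monthly", 24 * 30)]:
    res = linregress(hours[::step], temp[::step])      # linear trend test
    print(f"{label:8s} n={hours[::step].size:6d} "
          f"slope={res.slope * 365 * 24:.3f} degC/yr  p={res.pvalue:.3g}")
```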


Author(s):  
Simon Lykkeboe ◽  
Stine Linding Andersen ◽  
Claus Gyrup Nielsen ◽  
Peter Vestergaard ◽  
Peter Astrup Christensen

Abstract Objectives Indirect data mining methods have been proposed for the review of published reference intervals (RIs), but methods for identifying patients with a low likelihood of disease are needed. Many indirect methods extract test results from patients with a low-frequency blood sampling history to identify putatively healthy individuals. Although this is implied, there has been no attempt to validate whether patients with a low-frequency blood sampling history are healthy and whether test results from these patients are suitable for RI review. Methods Danish nationwide health registers were linked with a blood sample database, recording a population of 316,337 adults over a ten-year period. Comorbidity indexes were defined from registrations of hospital diagnoses and redeemed drug prescriptions. Test results from patients identified as having a low disease burden were used for a review of RIs from the Nordic Reference Interval Project (NORIP). Results Blood sampling frequency correlated with the comorbidity indexes, and the proportion of patients without disease conditions was enriched among patients with a low number of blood samples. For many analytes, RIs based on test results from patients with only 1–3 blood samples per decade were identical to the NORIP RIs. Some analytes showed expected incongruences and gave conclusive insights into how well RIs from a more than ten-year-old multi-center study (NORIP) perform with current pre-analytical and analytical methods. Conclusions Blood sampling frequency enhances the selection of healthy individuals for reviewing reference intervals, providing a simple method based solely on laboratory data without the addition of clinical information.
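A minimal sketch of the indirect approach the abstract validates: keep test results only from patients with few blood samples in the period (assumed to enrich for healthy individuals) and compute a nonparametric 95 % reference interval from the 2.5th and 97.5th percentiles. The data frame layout, analyte name, and toy values are hypothetical.

```python
import numpy as np
import pandas as pd

def reference_interval(df, analyte, max_samples=3):
    """df columns (assumed): patient_id, analyte, result."""
    counts = df.groupby("patient_id").size()
    low_freq = counts[counts <= max_samples].index        # putatively healthy
    values = df.loc[df["patient_id"].isin(low_freq) &
                    (df["analyte"] == analyte), "result"]
    lo, hi = np.percentile(values, [2.5, 97.5])           # nonparametric 95 % RI
    return lo, hi, len(values)

# Toy data: one analyte, most patients sampled only a few times.
rng = np.random.default_rng(3)
df = pd.DataFrame({
    "patient_id": rng.integers(0, 2000, 5000),
    "analyte": "sodium",
    "result": rng.normal(140, 2.0, 5000),                 # mmol/L
})
print(reference_interval(df, "sodium"))
```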

