Correlation Process
Recently Published Documents

TOTAL DOCUMENTS: 53 (five years: 11)
H-INDEX: 6 (five years: 2)

Heat Transfer ◽  
2021 ◽  
Author(s):  
Chaitanya D. Moholkar ◽  
Shivam V. Vala ◽  
Channamallikarjun S. Mathpati ◽  
Aniruddha J. Joshi ◽  
Vivek S. Vitankar ◽  
...  

Sensors ◽  
2021 ◽  
Vol 21 (21) ◽  
pp. 7128
Author(s):  
Kazimierz Krosman ◽  
Janusz Sosnowski

In many embedded systems, we face the problem of correlating signals characterising device operation (e.g., performance parameters, anomalies) with events describing internal device activities. This leads to the investigation of two types of data: time series, representing periodic signal samples in a background of noise, and sporadic event logs. The correlation process must take into account clock inconsistencies between the data acquisition and monitored devices, which provide the time series signals and event logs, respectively. The idea of the presented solution is to classify event logs using an introduced similarity metric and to derive their distribution in time. The identified event log sequences are matched with time intervals corresponding to specified sample patterns (objects) in the registered signal time series. The matching (correlation) process involves iterative time offset adjustment. The paper presents original algorithms for investigating correlation problems using object-oriented data models corresponding to the two monitoring sources. The effectiveness of this approach has been verified in power consumption analysis using real data collected from the developed Holter device. The approach is quite universal and can easily be adapted to other device optimisation problems.
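
A minimal sketch of the matching step described above: scan candidate clock offsets and keep the one that aligns the most log events with detected signal patterns. The function names, tolerance parameter, and toy data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def best_clock_offset(event_times, pattern_times, offsets, tol=0.05):
    """Return the candidate offset that matches the most events to patterns.

    event_times   -- timestamps (s) from the device's event log
    pattern_times -- start times (s) of detected signal patterns
    offsets       -- candidate clock offsets (s) to evaluate
    tol           -- maximum |event - pattern| gap to count as a match
    """
    event_times = np.asarray(event_times)
    pattern_times = np.sort(np.asarray(pattern_times))
    best = (0.0, -1)
    for off in offsets:
        shifted = event_times + off
        # distance from each shifted event to its nearest pattern start
        idx = np.searchsorted(pattern_times, shifted)
        idx = np.clip(idx, 1, len(pattern_times) - 1)
        nearest = np.minimum(np.abs(shifted - pattern_times[idx - 1]),
                             np.abs(shifted - pattern_times[idx]))
        matches = int(np.sum(nearest <= tol))
        if matches > best[1]:
            best = (off, matches)
    return best

# Example: the event log lags the signal clock by roughly 0.8 s
events = [1.0, 4.1, 9.5]
patterns = [1.8, 4.9, 10.3]
print(best_clock_offset(events, patterns, np.arange(-2, 2, 0.01)))
```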


2021 ◽  
Vol 154 ◽  
pp. 108115
Author(s):  
Yongwei Chen ◽  
Yongjing Xie ◽  
Xinxing Zhou ◽  
Yonggang Li

2020 ◽  
Vol 223 (3) ◽  
pp. 1548-1564
Author(s):  
Andreas Fichtner ◽  
Daniel Bowden ◽  
Laura Ermert

SUMMARY
A wide spectrum of processing schemes is commonly applied during the calculation of seismic noise correlations. This is intended to suppress large-amplitude transient and monochromatic signals, to accelerate convergence of the correlation process or to modify raw correlations into more plausible approximations of interstation Green’s functions. Many processing schemes, such as one-bit normalization or various other nonlinear normalizations, clearly break the linear physics of seismic wave propagation. This naturally raises the question: To what extent are the resulting noise correlations physically meaningful quantities? In this contribution, we demonstrate that commonly applied processing methods may indeed introduce an unphysical component into noise correlations. This affects not only noise correlation amplitudes but also, to a lesser extent, time-dependent phase information. The profound consequences are that most processed correlations cannot be entirely explained by any combination of Earth structure and noise sources, and that inversion results may thus be polluted. The positive component of our analysis is a new and easily applicable method that allows us to modify any existing processing such that it becomes optimal in the sense of (1) completely avoiding the unphysical component while (2) approximating the result of the original processing as closely as possible. The resulting optimal schemes can be derived purely on the basis of observed noise, without any knowledge of or assumptions on the nature of noise sources. In addition to the theoretical analysis, we present illustrative real-data examples from the Irish National Seismic Network and the Lost Hills array in Central California. We anticipate that optimal processing schemes may be most useful in applications that exploit complete correlation waveforms, amplitudes and weak arrivals, or small (time-dependent) phase shifts.
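
One-bit normalization, cited above as an example of nonlinear processing, keeps only the sign of each sample, so it visibly breaks superposition. A minimal sketch of this point (illustrative, not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(0)
noise = rng.standard_normal(10_000)
transient = np.zeros_like(noise)
transient[5000:5050] = 50.0          # large-amplitude transient (e.g., an earthquake)

raw = noise + transient
one_bit = np.sign(raw)               # one-bit normalization: keep only the sign

# Linearity check: processing the sum is not the sum of the processed parts
lhs = np.sign(noise + transient)
rhs = np.sign(noise) + np.sign(transient)
print(np.allclose(lhs, rhs))         # False: the operation breaks superposition
```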


2020 ◽  
Author(s):  
Jie E. Yang ◽  
Matthew R. Larson ◽  
Bryan S. Sibert ◽  
Samantha Shrum ◽  
Elizabeth R. Wright

Cryo-correlative light and electron microscopy (CLEM) is a technique that uses the spatiotemporal cues from fluorescence light microscopy (FLM) to investigate the high-resolution ultrastructure of biological samples by cryo-electron microscopy (cryo-EM). Cryo-CLEM provides advantages for identifying and distinguishing fluorescently labeled proteins, macromolecular complexes, and organelles from the cellular environment. Challenges remain in how correlation workflows and software tools are implemented on different microscope platforms to support microscopy-driven structural studies. Here, we present an open-source desktop application tool, CorRelator, to bridge cryo-FLM and cryo-EM/ET data collection instruments. CorRelator was designed to be flexible for both on-the-fly and post-acquisition correlation schemes. The CorRelator workflow is easily adapted to any fluorescence and transmission electron microscope (TEM) system configuration. CorRelator was benchmarked under cryogenic and ambient temperature conditions using several FLM and TEM instruments, demonstrating that CorRelator is a rapid and efficient application for image and position registration in CLEM studies. CorRelator is cross-platform software featuring an intuitive Graphical User Interface (GUI) that guides the user through the correlation process. CorRelator source code is available at: https://github.com/wright-cemrc-projects/corr.
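
The position-registration step at the heart of such tools can be sketched as a least-squares affine fit between matched fiducial coordinates in the two imaging modalities. This is a generic illustration under that assumption, not CorRelator's actual implementation; all coordinates below are made up.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares 2D affine transform mapping src points onto dst points.

    src, dst -- (N, 2) arrays of matched fiducial coordinates (N >= 3).
    Returns a 2x3 matrix A such that dst is approximately [x, y, 1] @ A.T
    """
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    ones = np.ones((len(src), 1))
    X = np.hstack([src, ones])               # homogeneous source coordinates
    A, *_ = np.linalg.lstsq(X, dst, rcond=None)
    return A.T

def apply_affine(A, pts):
    pts = np.asarray(pts, float)
    return np.hstack([pts, np.ones((len(pts), 1))]) @ A.T

# FLM pixel positions of fiducial beads and their TEM counterparts
flm = [[10, 12], [200, 15], [105, 180], [30, 160]]
tem = [[1020, 998], [2910, 1060], [1985, 2705], [1230, 2480]]
A = fit_affine(flm, tem)
print(apply_affine(A, [[100, 100]]))         # predicted TEM position of an FLM target
```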


2020 ◽  
pp. 135481662092262
Author(s):  
Naji Jalkh ◽  
Elie Bouri ◽  
Xuan Vinh Vo ◽  
Anupam Dutta

Unlike previous studies, we examine which of the implied volatilities of the US stock and crude oil markets is the more suitable and effective hedge against the downside risk of US travel and leisure (T&L) stocks. Using the corrected dynamic conditional correlation process, we find that the T&L stock index is more negatively and more consistently correlated with the implied volatility of crude oil prices, suggesting that oil implied volatility is the more suitable hedging asset. Similar results are reported for France, the United Kingdom, and developed markets, and they are robust to the data frequency and model specification. Furthermore, the hedge ratios vary over time, requiring regular updating of hedged positions. Importantly, the highest hedge effectiveness is associated with the oil implied volatility.
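
The time-varying hedge ratio mentioned above is conventionally the conditional covariance of the asset and hedge returns divided by the conditional variance of the hedge. The sketch below uses a rolling window as a simple stand-in for the corrected DCC model's conditional moments; names and data are illustrative assumptions, not the paper's estimation procedure.

```python
import numpy as np
import pandas as pd

def rolling_hedge_ratio(spot_ret, hedge_ret, window=60):
    """Time-varying hedge ratio h_t = Cov_t(spot, hedge) / Var_t(hedge).

    A rolling-window estimate stands in for the corrected DCC model's
    conditional covariance and variance.
    """
    cov = spot_ret.rolling(window).cov(hedge_ret)
    var = hedge_ret.rolling(window).var()
    return cov / var

# Toy data: T&L stock returns and changes in oil implied volatility
rng = np.random.default_rng(1)
tnl = pd.Series(rng.standard_normal(500) * 0.01)
ovx = pd.Series(-0.4 * tnl + rng.standard_normal(500) * 0.008)
print(rolling_hedge_ratio(tnl, ovx).dropna().tail())
```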


2020 ◽  
Author(s):  
Andreas Fichtner ◽  
Daniel Bowden ◽  
Laura Ermert

A wide spectrum of processing schemes is commonly applied during the calculation of seismic noise correlations. This is intended to suppress large-amplitude transient and monochromatic signals, to accelerate convergence of the correlation process, or to modify raw correlations into more plausible approximations of inter-station Green's functions. Many processing schemes, such as one-bit normalization or various non-linear normalizations, clearly break the linear physics of seismic wave propagation. This naturally raises the question: To what extent are the resulting noise correlations physically meaningful quantities?

In this contribution, we rigorously demonstrate that most commonly applied processing methods introduce an unphysical component into noise correlations. This affects noise correlation amplitudes but also, to a lesser extent, time-dependent phase information. The profound consequences are that most processed correlations cannot be entirely explained by any combination of Earth structure and noise sources, and that inversion results may thus be polluted.

The positive component of our analysis is a new class of processing schemes that are optimal in the sense of (1) completely avoiding the unphysical component, while (2) closely approximating the desirable effects of conventional processing schemes. The optimal schemes can be derived purely on the basis of observed noise, without any knowledge of or assumptions on the nature of noise sources.

In addition to the theoretical analysis, we present illustrative real-data examples from the Irish National Seismic Network and the Lost Hills array in Central California. This includes a quantification of potential artifacts that arise when mapping unphysical traveltime and amplitude variations into images of seismic velocities or attenuation.


Author(s):  
Rasha O. Mahmoud ◽  
Mazen M. Selim ◽  
Omar A. Muhi

In the present study, a multimodal biometric authentication method is presented to confirm the identity of a person based on face and iris features. The method combines face and iris (left and right) features for recognition. The authors have designed and applied an identification system that extracts face features using the Rectangle Histogram of Oriented Gradient (R-HOG). The study applies feature-level fusion using a novel fusion method that employs both the canonical correlation process and the proposed serial concatenation. A deep belief network is used for the recognition process. The performance of the proposed system was validated and evaluated through a set of experiments on the SDUMLA-HMT database. Compared with existing methods, the results show that fusion time is reduced by about 34.5%. The proposed system also achieves a lower equal error rate (EER) and a recognition accuracy of up to 99%.
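
Canonical correlation-based feature fusion can be sketched as projecting the two feature sets into a shared space where their correlation is maximized, then concatenating the projections. The sketch below uses scikit-learn's CCA as a generic stand-in; the feature dimensions, random data, and fusion-by-concatenation choice are illustrative assumptions, not the paper's exact pipeline.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(7)
n_samples = 200
face_feats = rng.standard_normal((n_samples, 128))   # e.g., R-HOG face features
iris_feats = rng.standard_normal((n_samples, 64))    # e.g., iris features

# Project both modalities into a shared 32-dimensional correlated space
cca = CCA(n_components=32)
face_c, iris_c = cca.fit_transform(face_feats, iris_feats)

# Feature-level fusion: concatenate the canonical projections
fused = np.hstack([face_c, iris_c])                  # shape (200, 64)
print(fused.shape)
```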


2019 ◽  
Vol 9 (1) ◽  
Author(s):  
Xiaofeng Fu ◽  
Jiying Ning ◽  
Zhou Zhong ◽  
Zandrea Ambrose ◽  
Simon Charles Watkins ◽  
...  

Correlative light and electron microscopy (CLEM) combines the strengths of both light and electron imaging modalities and enables linking of biological spatiotemporal information from live-cell fluorescence light microscopy (fLM) to high-resolution cellular ultrastructure from cryo-electron microscopy and tomography (cryoEM/ET). This has previously been achieved by using fLM signals to localize the regions of interest under cryogenic conditions. The correlation process, however, is often tedious and time-consuming, with low throughput and limited accuracy, because multiple correlation steps at different length scales are largely carried out manually. Here, we present AutoCLEM, an experimental workflow and associated software that overcomes these limitations and improves the performance and throughput of CLEM methods. The AutoCLEM system encompasses a high-speed confocal live-cell imaging module to acquire an automated fLM grid atlas that is linked to the cryoEM grid atlas, followed by cryofLM imaging after freezing. The fLM coordinates of the targeted areas are automatically converted to cryoEM/ET coordinates and refined using fluorescent fiducial beads. This AutoCLEM workflow significantly improves the efficiency of correlation between live-cell fluorescence imaging and cryoEM/ET structural analysis, as demonstrated by visualizing human immunodeficiency virus type 1 (HIV-1) interacting with host cells.

