random source
Recently Published Documents


TOTAL DOCUMENTS: 76 (FIVE YEARS: 19)
H-INDEX: 13 (FIVE YEARS: 2)

Entropy ◽  
2021 ◽  
Vol 23 (9) ◽  
pp. 1182
Author(s):  
Maciej Stankiewicz ◽  
Karol Horodecki ◽  
Omer Sakarya ◽  
Danuta Makowiec

We investigate whether the heart rate can be treated as a semi-random source suitable for amplification by quantum devices. We use a semi-random source model called the ε-Santha–Vazirani source, which can be amplified via quantum protocols to obtain a fully private random sequence. We analyze time intervals between consecutive heartbeats obtained from Holter electrocardiogram (ECG) recordings of people of different sexes and ages. We propose several transformations of the original time series into binary sequences, perform various statistical randomness tests, and estimate quality parameters. We find that the heart can be treated as a good enough, and inherently private, source of randomness that every human possesses. As such, it can in principle be used as input to quantum device-independent randomness amplification protocols. The properly interpreted ε parameter can potentially serve as a new characteristic of the human heart from the perspective of medicine.
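A minimal sketch of the kind of pipeline the abstract describes: turn a series of inter-beat (RR) intervals into bits and estimate a Santha–Vazirani bias ε. The binarization rule (bit = 1 when the interval lengthens) and the order-1 conditional estimator are illustrative assumptions, not the transformations or estimators used in the paper.

```python
# Toy sketch: binarize RR intervals and estimate an SV-source bias epsilon.
# An epsilon-SV source guarantees |P(x_i = 1 | history) - 1/2| <= epsilon.

def rr_to_bits(rr):
    """Bit = 1 if the interval lengthened relative to the previous one."""
    return [1 if b > a else 0 for a, b in zip(rr, rr[1:])]

def estimate_epsilon(bits):
    """Max deviation of P(next bit | previous bit) from 1/2 (order-1 only)."""
    counts = {0: [0, 0], 1: [0, 0]}
    for prev, cur in zip(bits, bits[1:]):
        counts[prev][cur] += 1
    eps = 0.0
    for prev in (0, 1):
        n = sum(counts[prev])
        if n:
            eps = max(eps, abs(counts[prev][1] / n - 0.5))
    return eps

rr = [812, 805, 820, 798, 810, 815, 801, 808, 819, 803]  # ms, made-up values
bits = rr_to_bits(rr)
eps = estimate_epsilon(bits)
```

A real analysis would condition on longer histories and much longer recordings; this only shows the shape of the ε estimate.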


2021 ◽  
Author(s):  
Rodrigo Cifuentes-Lobos ◽  
Ignacia Calisto ◽  
Cristian Saavedra ◽  
Franchesca Ormeño ◽  
Javiera San Martín ◽  
...  

Probabilistic Tsunami Hazard Assessment (PTHA) provides a variety of mathematical and numerical tools for evaluating the long-term exposure of coastal communities to tsunami-related hazards. Among these, the logic tree method stands out for its usefulness in generating random slip models and in handling epistemic and aleatory uncertainties, both key elements of the stochastic study of tsunami scenarios. By combining the parameters that define a source model (such as magnitude and rupture limits), this method can create a vast number of random source models that can be used not only to assess future, long-term hazard, but also in conjunction with data and observations from past tsunamis and earthquakes.

This study proposes a numerical methodology, based on the logic tree method, for generating random tsunami source models for the study of paleotsunamis and historical tsunamis. Here, the methodology is tested with data from the great 1960 Mw 9.5 Valdivia earthquake and tsunami. The random source models are first filtered using empirical relations between magnitude and rupture dimensions, or rupture aspect ratios. Models that pass this filter are used to compute surface deformation with the Okada (1985) method. The resulting deformation fields are then filtered against geodetic data and observations associated with the event of interest, eliminating all models that do not satisfy these observations. The models that remain are used as inputs for tsunami modelling in a staggered scheme: first with low-resolution topobathymetry grids, to check whether tsunami waves register at locations known to have been inundated and to eliminate models that do not reproduce this behaviour; and second, using the deformation models that pass this filter as input, with high-resolution grids to estimate inundation run-up and compare it with reliable historical accounts and sedimentological observations. Models that pass all of the above filters are subjected to statistical analysis and compared with existing source models of the 1960 Valdivia earthquake.

As stated above, owing to the substantial number of published studies, data, historical accounts, and source models available, the 1960 Mw 9.5 Valdivia earthquake is used as a benchmark to test this methodology and to appraise the convergence of the random models that pass every filter toward the existing source models. It should be emphasized that, although this methodology was designed to study historical tsunamis and paleotsunamis, it is here tested only with a modern event such as Valdivia 1960.
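The first filtering stage described above, generate random rupture geometries and discard those inconsistent with empirical scaling, can be sketched as below. The scaling relation, tolerance, and sampling ranges are placeholders for illustration only, not the relations used in the study.

```python
import random

# Minimal "generate then filter" sketch: draw candidate rupture geometries
# for a fixed magnitude and keep only those consistent with an assumed
# empirical magnitude-area scaling and a plausible aspect ratio.

def rupture_area_from_mw(mw):
    """Assumed placeholder scaling: log10(A [km^2]) = Mw - 4.0."""
    return 10 ** (mw - 4.0)

def generate_candidates(mw, n, rng):
    """Random (length, width) pairs in km, drawn uniformly."""
    return [(rng.uniform(100.0, 1500.0), rng.uniform(20.0, 300.0))
            for _ in range(n)]

def passes_filters(length, width, mw, area_tol=0.3, max_aspect=10.0):
    """Keep models whose area is within area_tol of the scaling relation
    and whose aspect ratio L/W stays physically plausible."""
    target = rupture_area_from_mw(mw)
    area_ok = abs(length * width - target) / target <= area_tol
    aspect_ok = 1.0 <= length / width <= max_aspect
    return area_ok and aspect_ok

rng = random.Random(0)
candidates = generate_candidates(9.5, 5000, rng)
accepted = [c for c in candidates if passes_filters(*c, mw=9.5)]
```

In the actual methodology the accepted geometries would go on to Okada deformation modelling and the geodetic and tsunami filters; this shows only the stochastic front end.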


2021 ◽  
Vol 26 (1) ◽  
pp. 13-24
Author(s):  
Shenglin Li ◽  
Pingsong Zhang ◽  
Chaoqiang Xi

The boom-type roadheader is the main equipment for mechanized coal drifting in coal mines and an indispensable piece of production equipment in the major coal-producing countries. Substantial vibrations are generated during the operation of a roadheader; these vibrations carry substantial energy and can therefore be regarded as a potential source for advance seismic detection in mine drifts. Compared with a conventional exploration source, a roadheader source produces a complex, continuous random signal: the shape of the seismic wavelet is uncertain and its duration relatively long, so the signal must be processed into a conventional pulse signal before it can be used for subsequent processing and analysis. Therefore, exploiting the advantages of seismic interferometry in random signal processing, two interferometric techniques, deconvolution and cross-correlation, are combined to construct a compound interference algorithm. On the basis of a theoretically derived formula, a random-signal impulse processing experiment is conducted using field-acquired source signals from a roadheader; this approach resolves the problem that cross-correlation alone cannot yield ideal results. The resulting compound interference algorithm first deconvolves each seismic trace, compressing the wavelet to obtain the reference trace and the other receiver traces. The reference trace is then cross-correlated with each receiver trace, yielding the time delay of each correlated wavelet pulse, that is, the delay of the receiver trace relative to the reference trace. From these delays, the direct wave and the reflected waves are recognized.

To evaluate the performance of the algorithm, an application experiment is conducted on another group of random source signals collected from a roadheader under different coal drift conditions. Again, the processing results are consistent with the single-shot record characteristics of an explosive source. Consequently, the proposed algorithm can satisfy the requirements of engineering exploration and analysis. A comprehensive analysis further demonstrates that the compound interference algorithm is both feasible and effective and that the processed seismic signals can be used for subsequent processing and interpretation.
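The deconvolve-then-cross-correlate idea can be sketched on synthetic data as below. The water-level regularization constant and the synthetic traces are illustrative choices, not the paper's formula or field data.

```python
import numpy as np

# Sketch of the compound interference idea: deconvolve each trace by the
# reference trace in the frequency domain (water-level regularization
# compresses the long random wavelet into a pulse), then cross-correlate
# to read off the time delay of each receiver relative to the reference.

def deconvolve(trace, ref, eps=1e-3):
    """Water-level spectral division trace / ref."""
    T, R = np.fft.rfft(trace), np.fft.rfft(ref)
    denom = (R * np.conj(R)).real
    denom = np.maximum(denom, eps * denom.max())
    return np.fft.irfft(T * np.conj(R) / denom, n=len(trace))

def delay_by_xcorr(a, b):
    """Lag (in samples) of b relative to a via full cross-correlation."""
    xc = np.correlate(b, a, mode="full")
    return int(np.argmax(xc)) - (len(a) - 1)

# Synthetic continuous random source recorded at two receivers; the second
# receiver sees the wavefield 15 samples later (circular shift for brevity).
rng = np.random.default_rng(1)
src = rng.standard_normal(1024)
rec0 = src.copy()
rec1 = np.roll(src, 15)

d0 = deconvolve(rec0, rec0)  # near-delta at lag 0 (reference pulse)
d1 = deconvolve(rec1, rec0)  # near-delta at the relative delay
lag = delay_by_xcorr(d0, d1)  # recovered delay of receiver 1 vs reference
```

With a real roadheader signal the wavelet is unknown and noisy, which is why the combination of both steps, rather than cross-correlation alone, is needed; this toy only shows the mechanics.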


2020 ◽  
Vol 63 (12) ◽  
pp. 1826-1834
Author(s):  
Yiming Li ◽  
Shengli Liu ◽  
Dawu Gu ◽  
Kefei Chen

A fuzzy extractor derives uniformly random strings from noisy sources that are neither reliably reproducible nor uniformly random. The notion of a fuzzy extractor was first formally introduced by Dodis et al. and has found various applications in cryptographic systems. However, it has been proved that a fuzzy extractor can become totally insecure when the same noisy random source is extracted multiple times. To solve this problem, the reusable fuzzy extractor was proposed. In this paper, we propose the first reusable fuzzy extractor based on the LPN assumption, which is efficient and resilient to a linear fraction of errors. Furthermore, our construction serves as an alternative post-quantum reusable fuzzy extractor.

