error rate
Recently Published Documents

2022 ◽  
Vol 16 (1) ◽  
pp. 1-62
Nampoina Andriamilanto ◽  
Tristan Allard ◽  
Gaëtan Le Guelvouit ◽  
Alexandre Garel

Modern browsers give access to several attributes that can be collected to form a browser fingerprint. Although browser fingerprints have primarily been studied as a web tracking tool, they can help improve the current state of web security by augmenting web authentication mechanisms. In this article, we investigate the adequacy of browser fingerprints for web authentication. We draw the link between the digital fingerprints that distinguish browsers and the biological fingerprints that distinguish humans, and evaluate browser fingerprints according to properties inspired by biometric authentication factors. These properties include their distinctiveness, their stability through time, their collection time, their size, and the accuracy of a simple verification mechanism. We assess these properties on a large-scale dataset of 4,145,408 fingerprints composed of 216 attributes and collected from 1,989,365 browsers. We show that, by time-partitioning our dataset, more than 81.3% of our fingerprints are shared by a single browser. Although browser fingerprints are known to evolve, an average of 91% of the attributes of a fingerprint stay identical between two observations, even when separated by nearly six months. Regarding performance, our fingerprints weigh about a dozen kilobytes and take a few seconds to collect. Finally, a simple verification mechanism achieves an equal error rate of 0.61%. We enrich our results with an analysis of the correlation between the attributes and their contribution to the evaluated properties. We conclude that browser fingerprints carry the promise to strengthen web authentication mechanisms.
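The equal error rate (EER) quoted above is the operating point at which the false acceptance rate equals the false rejection rate. As a minimal sketch, assuming a verifier that outputs similarity scores (the score distributions below are made up for illustration, not the paper's data), the EER can be estimated like this:

```python
import numpy as np

def equal_error_rate(genuine, impostor):
    """Estimate the EER: sweep candidate thresholds and find where the
    false rejection rate (genuine scores below the threshold) meets the
    false acceptance rate (impostor scores at or above it)."""
    thresholds = np.sort(np.concatenate([genuine, impostor]))
    best_gap, best_eer = 1.0, 1.0
    for t in thresholds:
        frr = np.mean(genuine < t)       # genuine comparisons rejected
        far = np.mean(impostor >= t)     # impostor comparisons accepted
        if abs(far - frr) < best_gap:
            best_gap, best_eer = abs(far - frr), (far + frr) / 2
    return best_eer

# Hypothetical similarity scores: genuine comparisons score high,
# impostor comparisons score low.
rng = np.random.default_rng(0)
genuine = rng.normal(0.8, 0.05, 1000)
impostor = rng.normal(0.3, 0.05, 1000)
eer = equal_error_rate(genuine, impostor)
```

The better the score distributions are separated, the lower the EER; a value such as the 0.61% reported above indicates strongly overlapping tails only.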

Hasan Aldiabat ◽  
Nedal Al-ababneh

In this paper, the bandwidth density of a misaligned free-space optical interconnect (FSOI) system with and without coding under a fixed bit error rate is considered. In particular, we study the effect of using error-correction codes of various codeword lengths on the bandwidth density and misalignment tolerance of the FSOI system in the presence of higher-order modes. Moreover, the paper demonstrates the use of the fill factor of the detector array as a design parameter to optimize the bandwidth density of the communication link. The numerical results demonstrate that the bandwidth density improves significantly with coding, and that the improvement depends strongly on the codeword length and code rate used. In addition, the results clearly show the optimum fill factor values that achieve the maximum bandwidth density and misalignment tolerance of the system.

Vo Trung Dung Huynh ◽  
Linh Mai ◽  
Hung Ngoc Do ◽  
Minh Ngoc Truong Nguyen ◽  
Trung Kien Pham

High-speed terahertz communication systems have recently employed the orthogonal frequency division multiplexing (OFDM) approach, as it provides high spectral efficiency and avoids inter-symbol interference caused by dispersive channels. Such high-speed systems require extremely high-sampling-rate time-interleaved analog-to-digital converters at the receiver. However, timing mismatch in time-interleaved analog-to-digital converters causes significant system performance degradation. In this paper, to avoid such degradation, we theoretically determine the maximum tolerable mismatch levels for OFDM communication systems. To obtain these levels, we first propose an analytical method to derive the bit error rate formula for quadrature and pulse amplitude modulations in Rayleigh fading channels, assuming binary reflected Gray code (BRGC) mapping. Further, from the derived bit error rate (BER) expressions, we reveal a threshold of timing mismatch below which the error floors produced by the mismatch remain smaller than a given BER. Simulation results demonstrate that if we keep the mismatch level smaller than 25% of this threshold, the BER performance degradation is smaller than 0.5 dB compared to the case without timing mismatch.
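The derivation above concerns QAM/PAM with BRGC mapping under timing mismatch; as a much simpler hedged illustration of how a Rayleigh-fading BER expression is validated against simulation, the sketch below Monte Carlo simulates coherent BPSK over flat Rayleigh fading and compares it with the textbook closed form P_b = 0.5 * (1 - sqrt(g / (1 + g))) at average SNR g. All parameters are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

def bpsk_rayleigh_ber_mc(snr_db, n_bits=200_000):
    """Monte Carlo BER of coherent BPSK over flat Rayleigh fading,
    with perfect channel state information at the receiver."""
    snr = 10 ** (snr_db / 10)
    bits = rng.integers(0, 2, n_bits)
    symbols = 1 - 2 * bits                 # bit 0 -> +1, bit 1 -> -1
    h = (rng.standard_normal(n_bits) + 1j * rng.standard_normal(n_bits)) / np.sqrt(2)
    noise = (rng.standard_normal(n_bits) + 1j * rng.standard_normal(n_bits)) / np.sqrt(2 * snr)
    received = h * symbols + noise
    detected = (np.real(np.conj(h) * received) < 0).astype(int)
    return np.mean(detected != bits)

def bpsk_rayleigh_ber_theory(snr_db):
    """Closed-form average BER of coherent BPSK over Rayleigh fading."""
    g = 10 ** (snr_db / 10)
    return 0.5 * (1 - np.sqrt(g / (1 + g)))

sim = bpsk_rayleigh_ber_mc(10)
theory = bpsk_rayleigh_ber_theory(10)
```

The same simulate-versus-formula comparison, with the mismatch-induced error floor added, is the kind of check the simulation results in the abstract perform.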

Hamza Abbad ◽  
Shengwu Xiong

Automatic diacritization is an Arabic natural language processing task based on sequence labeling, where the labels are the diacritics and the letters are the sequence elements. A letter can have from zero to two diacritics. The dataset used was a subset of the preprocessed version of the Tashkeela corpus. We developed a deep learning model composed of a stack of four bidirectional long short-term memory hidden layers of the same size, with an output layer at every level. The levels correspond to the groups into which we classified the diacritics (short vowels, double case endings, Shadda, and Sukoon). Before training, the data were divided into input vectors containing letter indexes and output vectors containing the indexes of the diacritics with respect to their groups. Both input and output vectors are concatenated, then a sliding window operation with overlapping is performed to generate continuous, fixed-size data used for both training and evaluation. Finally, we run tests using the standard metrics with all of their variations and compare our results with two recent state-of-the-art works. Our model achieved a 3% diacritization error rate and an 8.99% word error rate when including all letters. We also generated the confusion matrix to show the performance per output and analyzed the mismatches in the first 500 lines to classify the model errors according to their linguistic nature.
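The overlapping sliding-window step described above can be sketched as follows; the window size, step, and padding index are hypothetical illustration values, not the paper's actual settings:

```python
def sliding_windows(indices, size, step, pad=0):
    """Cut a sequence of letter indexes into fixed-size windows that
    overlap by (size - step) elements; the tail window is padded so
    every window has the same length."""
    windows = []
    for start in range(0, max(len(indices) - size + step, 1), step):
        window = indices[start:start + size]
        window += [pad] * (size - len(window))   # pad the final window
        windows.append(window)
    return windows

# Toy example: letter indexes of a short sentence
seq = [5, 12, 7, 9, 3, 14, 8]
w = sliding_windows(seq, size=4, step=2)
# each window shares (size - step) = 2 elements with its neighbor
```

The same windowing is applied in parallel to the input (letter) and output (diacritic) index vectors so that the aligned pairs stay synchronized.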

2022 ◽  
Vol 35 (2) ◽  
pp. 025017
Quentin Herr ◽  
Alex Braun ◽  
Andrew Brownfield ◽  
Ed Rudman ◽  
Dan Dosch ◽  

Abstract A circuit-simulation-based method is used to determine the thermally induced bit error rate of superconducting Single Flux Quantum logic circuits. Simulations are used to evaluate the multidimensional Gaussian integral across noise current sources attached to the active devices. The method is data-assisted and has predictive power: measurement determines the value of a single parameter, the effective noise bandwidth, for each error mechanism. The errors in the distributed networks of comparator-free Reciprocal Quantum Logic nucleate across multiple Josephson junctions, so the effective critical current is about three times that of the individual devices. The effective noise bandwidth is only 6%–23% of the junction plasma frequency at a modest clock rate of 3.4 GHz, which is 1% of the plasma frequency. This analysis shows why the measured bit error rate comes out so much lower than simplistic estimates based on isolated devices.
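In one dimension, the underlying quantity (the probability that Gaussian thermal noise pushes the current past an effective switching threshold) reduces to a Gaussian tail integral. The toy sketch below checks a Monte Carlo estimate of that tail against the closed form Q(x) = 0.5 * erfc(x / sqrt(2)); the 4-sigma threshold is a made-up number, not one of the paper's device parameters:

```python
import math
import numpy as np

rng = np.random.default_rng(7)

threshold_sigmas = 4.0   # hypothetical threshold, in units of noise sigma

# Monte Carlo estimate of the one-sided Gaussian tail probability
samples = rng.standard_normal(5_000_000)
mc_estimate = np.mean(samples > threshold_sigmas)

# Closed form: Q(x) = 0.5 * erfc(x / sqrt(2))
exact = 0.5 * math.erfc(threshold_sigmas / math.sqrt(2))
```

A per-attempt probability of this kind, multiplied by an effective noise bandwidth (attempts per second), gives an error rate over time, which is where the single measured parameter of the method above enters.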

2022 ◽  
Vol 12 (1) ◽  
Vani Rajasekar ◽  
Bratislav Predić ◽  
Muzafer Saracevic ◽  
Mohamed Elhoseny ◽  
Darjan Karabasevic ◽  

Abstract Biometric security is a major emerging concern in the field of data security. In recent years, research initiatives in the field of biometrics have grown at an exponential rate. Achieving a multimodal biometric technique with enhanced accuracy and recognition rate for smart cities remains a challenging issue. This paper proposes an enhanced multimodal biometric technique for a smart city based on score-level fusion. Specifically, the proposed approach addresses the existing challenges with a multimodal fusion technique built on an optimized fuzzy genetic algorithm. Experiments in different biometric environments reveal significant improvements over existing strategies. The result analysis shows that the proposed approach performs better in terms of false acceptance rate, false rejection rate, equal error rate, precision, recall, and accuracy, achieving a higher accuracy of 99.88% and a lower equal error rate of 0.18%. The vital part of this approach is the inclusion of a fuzzy strategy with soft computing techniques, namely the optimized fuzzy genetic algorithm.
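Score-level fusion combines the matcher scores of the individual modalities before the accept/reject decision is made. The paper optimizes this combination with a fuzzy genetic algorithm; the sketch below only shows the plain weighted-sum baseline with min-max normalization, using made-up score ranges, weights, and threshold:

```python
def min_max_normalize(score, lo, hi):
    """Map a raw matcher score into [0, 1] given its observed range."""
    return (score - lo) / (hi - lo)

def fused_score(scores, ranges, weights):
    """Weighted-sum score-level fusion of several modality scores.
    Scores are normalized to a common [0, 1] scale first, because
    different matchers report on incompatible raw scales."""
    assert abs(sum(weights) - 1.0) < 1e-9
    normalized = [min_max_normalize(s, lo, hi)
                  for s, (lo, hi) in zip(scores, ranges)]
    return sum(w * s for w, s in zip(weights, normalized))

# Hypothetical example: an iris matcher scoring 0..100 and a
# fingerprint matcher scoring 0..1, fused with fixed weights.
score = fused_score(scores=[82.0, 0.9],
                    ranges=[(0, 100), (0, 1)],
                    weights=[0.6, 0.4])
accepted = score >= 0.7   # threshold, e.g. at the equal-error operating point
```

An optimizer such as the paper's fuzzy genetic algorithm would, in effect, search over the weights (and fuzzy membership parameters) to minimize the equal error rate instead of fixing them by hand.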

2022 ◽  
Vol 13 ◽  
Yoshihiro Itaguchi ◽  
Susana A. Castro-Chavira ◽  
Knut Waterloo ◽  
Stein Harald Johnsen ◽  
Claudia Rodríguez-Aranda

Semantic verbal fluency (VF), assessed by the animal category, is a task widely used for early detection of dementia. A feature not regularly assessed is the occurrence of errors such as perseverations and intrusions. So far, no investigation has analyzed how and when errors occur during semantic VF in aging populations, together with their possible neural correlates. The present study addresses the issue using a combined methodology based on latent Dirichlet allocation (LDA) analysis for word classification, together with a time-course analysis identifying the exact time of error occurrence. LDA is a modeling technique that discloses hidden semantic structures in a given corpus of documents. We evaluated a sample of 66 participants divided into a healthy young group (n = 24), a healthy older adult group (n = 23), and a group of patients with mild Alzheimer's disease (AD) (n = 19). We performed DTI analyses to evaluate the white matter integrity of three frontal tracts purportedly underlying error commission: the anterior thalamic radiation, the frontal aslant tract, and the uncinate fasciculus. Contrasts of DTI metrics were performed on the older groups, who were further classified into high-error-rate and low-error-rate subgroups. Results demonstrated a unique deployment of error commission in the patient group, characterized by a high incidence of intrusions in the first 15 s and a higher rate of perseverations toward the end of the trial. The healthy groups predominantly showed a very low incidence of perseverations. The DTI analyses revealed that the patients with AD committing errors at a high rate presented significantly more degenerated frontal tracts in the left hemisphere. Thus, our findings demonstrate that the appearance of intrusions, together with left-hemisphere degeneration of frontal tracts, is a pathognomonic trait of mild AD. Furthermore, our data suggest that the error commission of patients with AD arises from executive and working memory impairments related partly to deteriorated left frontal tracts.

2022 ◽  
Vol 14 (2) ◽  
pp. 790
Jaroslava Kubáňová ◽  
Iveta Kubasáková ◽  
Kristián Čulík ◽  
Lukáš Štítik

The article focuses on expanding the use of barcodes in selected logistics activities in a company. Our study discusses the application of barcode technology to selected logistics activities in order to address the error rate in these activities, and examines the company's ability to apply this technology to other logistics activities during the COVID-19 pandemic. In the test phase, the 10 products with the highest turnover in the company were used to demonstrate the elimination of errors in the newly proposed solution compared with the original one across various logistics activities, and especially the time saved compared to manual work by company personnel. The company already has this technology at its disposal, in the parent company as well as in the subsidiary; it was only a matter of expanding its use and applicability, as well as pursuing the further research hypotheses we outline at the end of the article. We focus on RFID and barcode technologies, since the company initially considered using RFID but chose barcodes as an already familiar working technology. The current situation, affected by COVID-19, highlights the advantages and disadvantages of both technologies.

Trials ◽  
2022 ◽  
Vol 23 (1) ◽  
Richard A. Parker ◽  
Christopher J. Weir

Abstract Analysis of multiple secondary outcomes in a clinical trial leads to an increased probability of at least one false significant result among all secondary outcomes studied. In this paper, we question the notion that if no multiplicity adjustment has been applied to multiple secondary outcome analyses in a clinical trial, then they must necessarily be regarded as exploratory. Instead, we argue that if individual secondary outcome results are interpreted carefully and precisely, there is no need to downgrade our interpretation to exploratory. This is because the probability of a false significant result for each comparison, the per-comparison-wise error rate, does not increase with multiple testing. Strong effects on secondary outcomes should always be taken seriously and must not be dismissed purely on the basis of multiplicity concerns.
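The distinction drawn above can be seen in a small simulation: with m independent tests of true null hypotheses at level alpha, the per-comparison error rate stays at alpha no matter how large m is, while the family-wise probability of at least one false significant result grows as 1 - (1 - alpha)^m. A sketch with arbitrary illustration values for m and alpha:

```python
import numpy as np

rng = np.random.default_rng(42)
alpha, m, trials = 0.05, 10, 20_000

# Under a true null hypothesis, p-values are uniform on [0, 1]
p_values = rng.uniform(size=(trials, m))
rejections = p_values < alpha

per_comparison_rate = rejections.mean()            # stays near alpha
family_wise_rate = rejections.any(axis=1).mean()   # near 1 - (1 - alpha)**m
```

With m = 10 and alpha = 0.05, the family-wise rate approaches 1 - 0.95**10, roughly 0.40, while each individual comparison still runs at its nominal 5% error rate, which is the paper's central point.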

2022 ◽  
Vol 2022 ◽  
pp. 1-15
Chengdong Zhu ◽  
Ruizhi Shao ◽  
Xinmiao Zhang ◽  
Shan Gao ◽  
Bowen Li

In recent years, with the rise of virtual reality and sports, people are often injured due to irregular movements during exercise. On this basis, this article takes the basic concepts of computer virtual reality as its starting point and analyzes the adoption of computer virtual reality in sports movement correction. The choice of corrective exercises depends on the errors in the daily exercises: if any movement is performed incorrectly, the corresponding corrective training mode is selected for it, and if there are multiple errors at the same time, we compare them and decide which erroneous movement needs to be resolved first and which later. The squat movement pattern of the Tai Chi half-squat ball stance is closely related to daily life, and this movement pattern can fully reflect the core stability of the subject. In this study, 42 students from a sports college in Dalian were selected and divided into six groups of seven. The first three groups used movement correction to assist the traditional teaching method (the experimental group). The performance error rate of the first three groups reached 65%, while the last three groups used the virtual-reality-assisted teaching method and their error rate was only 4%. Therefore, teaching with virtual reality can reduce the error rate of movement posture. We trained the students for half a year, assessed and scored them once a month, and collected and analyzed the data to reach a conclusion. The experimental results show that the experimental group differs significantly before and after the squat test (P < 0.05), and that after the experiment there is a significant difference between the experimental group and the comparison group (P < 0.05). Therefore, squat correction training is effective in improving wrong squats, and it is very effective for problems such as posture, movement pattern, and joint limitation. It is therefore worthwhile to explore movement correction training based on the virtual reality of computer vision.
