The method of vicinity minutiae decomposition with higher level graphs for fingerprint verification

2021, pp. 92-102
Author(s): Sergiy Rassomakhin, Olha Melkozerova, Oleksii Nariezhnii

The subject matter of the paper is the development of fingerprint local structures based on a new method of minutia vicinity decomposition (MVD) for the task of fingerprint verification. The task is important because biometric technologies are being introduced in many areas of social and state life: criminology, access control systems, mobile device applications, and banking. The goal is to develop vectors of real numbers that satisfy the criteria for biometric template protection schemes, in particular irreversibility, while maintaining an acceptable equal error rate (EER). The main problem to be solved is verification accuracy, because fingerprints contain false minutiae, true minutiae may disappear, and linear and angular deformations occur. The method is a new variant of MVD that uses graph levels with from seven down to three points. This decomposition scheme is presented in the paper; such a variant has not previously appeared in the literature. The following results were obtained: a description of the new fingerprint verification method; a new metric for building the vectors of real numbers, namely the minimal path through the points of a graph; and an algorithm for finding these minimal paths, since the classical algorithm fails in some cases with six or more points, where arcs cross or required arcs are excluded from the path. A way of resolving these problems is proposed, and examples are given for 20 points. Values of the false rejection rate (FRR), false acceptance rate (FAR), and EER are reported: with a full search the EER is 33 %, based on 78400 false tests and 1400 true tests. The method does not use metrics such as distances and angles, which are used in the classical MVD method and will be considered in future papers. This result counts exact coincidences of real numbers rather than the similarity scores usually used in verification, which makes it a good outcome: the comparable index-of-max method yields 40 %.
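As an illustration of the minimal-path metric over a small vicinity graph, the sketch below brute-forces the shortest open path through a handful of points. It is only a sketch under assumptions not taken from the paper: the points and the Euclidean edge weights are synthetic, and the helper name minimal_path_length is hypothetical; the paper's own metric deliberately avoids raw distances and angles.

```python
# Minimal sketch: brute-force minimal path over a small minutia vicinity graph.
# Assumptions (not from the paper): points are (x, y) tuples and edge weights are
# Euclidean distances; this only illustrates the "minimal path over k points"
# idea for small vicinities (roughly 3 to 7 points), where brute force is cheap.
from itertools import permutations
from math import dist

def minimal_path_length(points):
    """Return the length of the shortest open path visiting all points once."""
    best = float("inf")
    for order in permutations(range(len(points))):
        length = sum(dist(points[a], points[b]) for a, b in zip(order, order[1:]))
        best = min(best, length)
    return best

# Example vicinity of 5 neighbouring minutiae around a central one (synthetic data).
vicinity = [(0.0, 0.0), (3.0, 1.0), (1.5, 4.0), (-2.0, 2.5), (4.0, -1.0)]
print(round(minimal_path_length(vicinity), 3))
```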

The article presents an example of verifying a fingerprint database by solving the travelling salesman problem over the decomposition of the vicinity of the nearest minutiae. The resulting solution is resistant to linear and angular deformations and to the mixing of points. The method yields a correct solution for a small number of points; for a large number of points the contours begin to cross and the solution is no longer optimal. Therefore, to reduce processing time and to compute the metric, a modified branch and bound algorithm is used, namely the alignment and exclusion of arcs at each cycle of constructing the optimal route. Verification is based on creating local structures for each minutia of the fingerprint, because it is the local structures that are resistant to deformation. Building global structures very often does not lead to good quality indicators, since there is a problem with centering the entire sample. A complete set of tests of the fingerprint database templates was carried out during their verification by this method. The use of decomposition of characteristic features provides greater stability when false minutiae are added and true minutiae are erased. The results show the values of pairwise comparisons of two templates for true and false tests. The false rejection rate (FRR), false acceptance rate (FAR), and equal error rate (EER) were studied.
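The error rates mentioned above can be illustrated with a small, assumed computation: given genuine (true-test) and impostor (false-test) comparison scores, FRR and FAR are estimated at each threshold and the EER is read off where they roughly coincide. This is not the paper's code; the score model and helper names are hypothetical, and a higher score is assumed to mean a closer match.

```python
# Minimal sketch (not the paper's code): estimating FRR, FAR and the equal error
# rate from pairwise comparison scores, assuming higher score = closer match.
import numpy as np

def error_rates(genuine, impostor, threshold):
    frr = float(np.mean(np.asarray(genuine) < threshold))    # true pairs rejected
    far = float(np.mean(np.asarray(impostor) >= threshold))  # false pairs accepted
    return frr, far

def equal_error_rate(genuine, impostor, n_thresholds=501):
    """Sweep thresholds and return the rate where FRR and FAR (nearly) coincide."""
    scores = np.concatenate([genuine, impostor])
    best_gap, best_eer = float("inf"), 1.0
    for t in np.quantile(scores, np.linspace(0.0, 1.0, n_thresholds)):
        frr, far = error_rates(genuine, impostor, t)
        if abs(frr - far) < best_gap:
            best_gap, best_eer = abs(frr - far), (frr + far) / 2.0
    return best_eer

# Synthetic scores standing in for the reported 1400 true and 78400 false tests.
rng = np.random.default_rng(0)
genuine_scores = rng.normal(0.7, 0.15, 1400)
impostor_scores = rng.normal(0.4, 0.15, 78400)
print(round(equal_error_rate(genuine_scores, impostor_scores), 3))
```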


Currently, attempts are being made to introduce biometric technologies into various spheres of public and state life: forensics, access control systems, mobile device applications, banking, and so on. The problem of accuracy remains open for discussion, because when verifying biometric samples there are problems of the addition or disappearance of reference points, deformation of the distances between them, and linear and angular displacements of the whole sample. In addition, existing biometric systems do not meet all information security requirements, namely integrity, availability, authenticity, non-repudiation, observability, and confidentiality. The article presents an analysis of the minutia vicinity decomposition method for fingerprint verification and describes its advantages and disadvantages in comparison with other methods. The method is based on creating local structures for each minutia of the fingerprint, because it is the local structures that are resistant to the mixing of points and to angular and linear displacement. Building global structures often does not lead to good accuracy, since there is a problem of centering the entire sample. A complete set of tests of fingerprint database samples was carried out during their verification by this method. An algorithm for constructing a code for an arbitrary minutia and an algorithm for comparing two sample templates are described. The results show the values of pairwise comparisons of two templates for true and false tests. The false rejection rate (FRR), false acceptance rate (FAR), and equal error rate (EER) were studied.
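To make the idea of a per-minutia code and template comparison concrete, here is a hedged sketch: each minutia gets a code built from its nearest neighbours, and two templates are scored by the fraction of coinciding codes. The distance-based code, the tolerance, and the helper names are assumptions for illustration, not the algorithm described in the article.

```python
# Minimal sketch (assumptions, not the article's algorithm): build a local-structure
# "code" for every minutia from its k nearest neighbours, then score two templates
# by the fraction of codes that coincide within a tolerance.
import numpy as np

def local_codes(minutiae, k=4):
    """minutiae: (n, 2) array of (x, y); returns one sorted-distance code per minutia."""
    pts = np.asarray(minutiae, dtype=float)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    return np.sort(d, axis=1)[:, 1:k + 1]           # drop the zero self-distance

def compare_templates(codes_a, codes_b, tol=2.0):
    """Count minutiae in A whose code matches some code in B componentwise within tol."""
    hits = 0
    for ca in codes_a:
        if np.any(np.all(np.abs(codes_b - ca) <= tol, axis=1)):
            hits += 1
    return hits / max(len(codes_a), 1)

template_a = [(10, 12), (40, 18), (25, 44), (60, 30), (35, 70), (55, 60)]
template_b = [(11, 13), (41, 17), (24, 45), (61, 31), (34, 71), (80, 80)]
ca, cb = local_codes(template_a), local_codes(template_b)
print(round(compare_templates(ca, cb), 2))
```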


Symmetry, 2021, Vol. 13 (5), p. 910
Author(s): Tong-Yuen Chai, Bok-Min Goi, Wun-She Yap

In recent years, biometric template protection (BTP) schemes have been introduced to increase public confidence in biometric systems with regard to data privacy and security. The introduction of BTP naturally incurs a loss of information in exchange for security, which leads to performance degradation at the matching stage. Although extended work on some iris BTP schemes attempts to improve their recognition performance, a generalized solution to this problem is still lacking. In this paper, a trainable approach that requires no further modification of the protected iris biometric templates is proposed. The approach consists of two strategies for generating a confidence matrix that reduces the performance degradation of iris BTP schemes. The proposed binary confidence matrix performed better on noisy iris data, whereas the probability confidence matrix performed better on iris databases with higher image quality. In addition, the proposed scheme takes into account the potential effects on recognition performance caused by database-associated noise masks and by the variation in biometric data types produced by different iris BTP schemes. The scheme showed remarkable improvement in our experiments on various publicly available iris research databases.
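One plausible, assumed way to realise a confidence matrix for protected binary iris templates is sketched below: per-bit confidences are trained from enrollment samples and then used to weight a masked Hamming distance. The formulation and function names are illustrative, not the paper's actual construction; a binary confidence matrix could be obtained by thresholding the trained values.

```python
# Minimal sketch (assumed formulation, not the paper's exact one): a confidence
# matrix that weights each bit of a protected binary iris template during matching.
import numpy as np

def train_confidence(templates):
    """templates: (m, n) binary codes from one subject.
    Probability-style confidence: how consistently each bit agrees with the majority."""
    t = np.asarray(templates)
    consensus = (t.mean(axis=0) >= 0.5).astype(int)
    return (t == consensus).mean(axis=0)            # one value in [0, 1] per bit

def weighted_hamming(a, b, confidence, mask=None):
    """Confidence-weighted normalised Hamming distance; mask marks usable bits."""
    w = np.asarray(confidence, dtype=float)
    if mask is not None:
        w = w * mask                                # ignore bits flagged as noise
    disagree = (np.asarray(a) != np.asarray(b)).astype(float)
    return float((disagree * w).sum() / max(w.sum(), 1e-9))

rng = np.random.default_rng(1)
enrolled = rng.integers(0, 2, (5, 64))              # 5 noisy samples of one subject
conf = train_confidence(enrolled)                   # thresholding conf would give a binary matrix
probe = enrolled[0].copy()
probe[:8] ^= 1                                      # flip a few bits to simulate noise
print(round(weighted_hamming(enrolled[0], probe, conf), 3))
```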


2014, Vol. 2014, pp. 1-8
Author(s): Hailun Liu, Dongmei Sun, Ke Xiong, Zhengding Qiu

The fuzzy vault scheme (FVS) is one of the most popular biometric cryptosystems for biometric template protection. However, the error-correcting code (ECC) used in FVS is not well suited to handling real-valued intraclass variances of biometric data. In this paper, we propose a multidimensional fuzzy vault scheme (MDFVS) in which a general subspace error-tolerant mechanism is designed and embedded into FVS to handle intraclass variances. Palmprint is one of the most important biometrics; to protect palmprint templates, a palmprint-based MDFVS implementation is also presented. Experimental results show that the proposed scheme not only deals with intraclass variances effectively but also maintains accuracy while enhancing security.
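As a toy illustration of the subspace error-tolerance idea only (not the MDFVS construction itself, which also binds a secret), the sketch below fits a low-dimensional subspace to real-valued enrollment feature vectors and accepts a query whose residual outside that subspace is small. All names, dimensions, and tolerances are assumptions.

```python
# Toy illustration (assumptions, not MDFVS): tolerate real-valued intraclass
# variation by projecting features into a low-dimensional subspace and accepting
# a query whose projection residual stays within a tolerance.
import numpy as np

def fit_subspace(samples, dim=3):
    """PCA-style subspace from enrollment samples: mean plus top principal directions."""
    x = np.asarray(samples, dtype=float)
    mean = x.mean(axis=0)
    _, _, vt = np.linalg.svd(x - mean, full_matrices=False)
    return mean, vt[:dim]

def within_tolerance(query, mean, basis, tol):
    """Accept if the query's residual outside the subspace is small enough."""
    q = np.asarray(query, dtype=float) - mean
    residual = q - basis.T @ (basis @ q)
    return float(np.linalg.norm(residual)) <= tol

rng = np.random.default_rng(2)
enroll = rng.normal(0.0, 1.0, (10, 16))             # 10 enrollment feature vectors
mean, basis = fit_subspace(enroll, dim=3)
genuine_query = enroll[0] + rng.normal(0.0, 0.05, 16)
print(within_tolerance(genuine_query, mean, basis, tol=3.0))
```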


2020, Vol. 28 (4), pp. 247-252
Author(s): Alexander Lozhkin, Pavol Bozek, Konstantin Maiorov

The accuracy of the geometric model is crucial for product design. More complex surfaces are usually represented by approximation methods, but approximation reduces design quality. A new alternative calculation method is proposed. It can handle both conic sections and more complex curves, so the researcher obtains an analytical solution rather than a sequence of points that destroys the semantics of the object. The new method is based on permutation and other symmetries and appears to originate in internal properties of the space. The classical method consists of finding the transformation parameters of symmetric conic profiles; here, a new procedure for determining the parameters of linear transformations was obtained in a different way. The main steps of the new method are presented theoretically in the paper. Since most stages yield a double result, the new calculation method is easy to verify. Geometric modeling in the AutoCAD environment is shown briefly. The new calculation method can be applied to most complex curves and linear transformations. Additional theoretical and practical research is required.
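For contrast with the approximation route, the classical analytical computation can be sketched: an axis-aligned ellipse mapped by a linear transformation has its new semi-axes and rotation recovered from the eigen-decomposition of the transformed quadratic form, without generating a point sequence. This is the standard textbook calculation, not the paper's new symmetry-based method; the shear matrix in the example is arbitrary.

```python
# Illustration of the classical route only (not the paper's symmetry-based method):
# an axis-aligned ellipse x^2/a^2 + y^2/b^2 = 1 is mapped by a linear transform T,
# and the transformed ellipse's semi-axes and rotation are recovered analytically
# from the eigen-decomposition of its quadratic form.
import numpy as np

def transformed_ellipse(a, b, T):
    Q = np.diag([1.0 / a**2, 1.0 / b**2])           # x^T Q x = 1
    Ti = np.linalg.inv(np.asarray(T, dtype=float))
    Q2 = Ti.T @ Q @ Ti                              # quadratic form after y = T x
    eigvals, eigvecs = np.linalg.eigh(Q2)
    semi_axes = 1.0 / np.sqrt(eigvals)              # major axis from smallest eigenvalue
    angle = np.degrees(np.arctan2(eigvecs[1, 0], eigvecs[0, 0]))
    return semi_axes, angle

axes, angle = transformed_ellipse(3.0, 1.0, [[1.0, 0.8], [0.0, 1.0]])  # shear
print(np.round(axes, 3), round(angle, 1))
```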


2019, Vol. 63 (3), pp. 479-493
Author(s): Wadood Abdul, Ohoud Nafea, Sanaa Ghouzali

There are a number of issues related to the development of biometric authentication systems, such as privacy breaches, the consequent security risks, and biometric template storage. The current paper addresses these issues through a hybrid approach combining watermarking with biometric encryption. A multimodal biometric template protection approach with score-level fusion of fingerprint and face templates is proposed. The approach includes two basic stages, an enrollment stage and a verification stage. During enrollment, the discrete wavelet transform (DWT) is applied to the face images to embed the fingerprint features into different directional sub-bands. Watermark embedding and extraction are performed by quantizing the mean values of the wavelet coefficients, after which the inverse DWT is applied to obtain the watermarked image. A unique token is then assigned to each genuine user, and a hyper-chaotic map is used to produce a key stream that encrypts the watermarked image with a block cipher. The experimental results indicate the efficiency of the proposed approach, achieving a reasonable error rate of 3.87%.
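The embedding step described above (quantizing mean wavelet coefficients) can be sketched as follows, under assumptions: bits are hidden by quantization index modulation on the mean of small blocks of one DWT detail sub-band, using PyWavelets. The sub-band choice, block size, quantization step, and function names are illustrative; the paper's actual embedding, token assignment, and hyper-chaotic encryption are not reproduced here.

```python
# Minimal sketch (assumed details, not the paper's exact embedding): hide bits by
# quantising the mean of small blocks of a DWT detail sub-band, then invert the DWT.
import numpy as np
import pywt

def embed_bits(image, bits, step=8.0, block=8):
    cA, (cH, cV, cD) = pywt.dwt2(np.asarray(image, dtype=float), "haar")
    band = cH.copy()
    for i, bit in enumerate(bits):
        r = (i * block) % band.shape[0]
        c = ((i * block) // band.shape[0]) * block
        blk = band[r:r + block, c:c + block]
        # Shift the block so its mean lands on an even/odd multiple of step per bit.
        target = (2 * np.round(blk.mean() / (2 * step)) + bit) * step
        band[r:r + block, c:c + block] = blk + (target - blk.mean())
    return pywt.idwt2((cA, (band, cV, cD)), "haar")

def extract_bits(watermarked, n_bits, step=8.0, block=8):
    _, (cH, _, _) = pywt.dwt2(np.asarray(watermarked, dtype=float), "haar")
    bits = []
    for i in range(n_bits):
        r = (i * block) % cH.shape[0]
        c = ((i * block) // cH.shape[0]) * block
        bits.append(int(np.round(cH[r:r + block, c:c + block].mean() / step)) % 2)
    return bits

face = np.random.default_rng(3).integers(0, 256, (64, 64)).astype(float)
payload = [1, 0, 1, 1, 0, 0, 1, 0]                  # stand-in for fingerprint features
marked = embed_bits(face, payload)
print(extract_bits(marked, len(payload)) == payload)
```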

