Thinning Algorithm: Recently Published Documents

TOTAL DOCUMENTS: 288 (five years: 18)
H-INDEX: 32 (five years: 1)

2021 ◽  
Vol 20 (3) ◽  
pp. 15-25
Author(s):  
Saifullahi Sadi Shitu ◽  
Syed Abd Rahman Syed Abu Bakar ◽  
Nura Musa Tahir ◽  
Usman Isyaku Bature ◽  
Haliru Liman

The thinning algorithm is one approach to identifying each character printed on a car plate. Malaysian car plate characters appear in different sizes, styles, and customized printed forms, and these variations make it difficult to thin successfully segmented and extracted license plate characters for recognition. To address this problem, an improved thinning operation for Malaysian car plate character recognition is proposed. In this algorithm, samples of segmented and extracted license plate characters are first passed to the Zhang-Suen thinning algorithm, which cannot guarantee a one-pixel-thick result, and then to a single-pixelation algorithm that reduces each character to one pixel in width for recognition. The simulation results show the method to be well suited to character recognition systems, leaving the fewest white pixels (777) and only 0.26% redundant pixels in the medial curve.
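The Zhang-Suen step referenced in this abstract is a standard two-subiteration thinning pass. A minimal self-contained sketch (the function name and the list-of-lists image representation are illustrative, not from the paper):

```python
def zhang_suen_thin(img):
    """Zhang-Suen two-subiteration thinning on a binary image.

    img: list of lists with 1 = foreground, 0 = background.
    Returns a new thinned image; border pixels are left untouched.
    """
    img = [row[:] for row in img]
    rows, cols = len(img), len(img[0])

    def neighbours(y, x):
        # P2..P9, clockwise starting from the pixel above (north)
        return [img[y-1][x], img[y-1][x+1], img[y][x+1], img[y+1][x+1],
                img[y+1][x], img[y+1][x-1], img[y][x-1], img[y-1][x-1]]

    changed = True
    while changed:
        changed = False
        for step in (0, 1):
            to_delete = []
            for y in range(1, rows - 1):
                for x in range(1, cols - 1):
                    if img[y][x] != 1:
                        continue
                    p = neighbours(y, x)
                    b = sum(p)  # B(P1): number of foreground neighbours
                    # A(P1): 0 -> 1 transitions in the circular sequence P2..P9
                    a = sum(p[i] == 0 and p[(i + 1) % 8] == 1 for i in range(8))
                    p2, p4, p6, p8 = p[0], p[2], p[4], p[6]
                    if step == 0:
                        cond = p2 * p4 * p6 == 0 and p4 * p6 * p8 == 0
                    else:
                        cond = p2 * p4 * p8 == 0 and p2 * p6 * p8 == 0
                    if 2 <= b <= 6 and a == 1 and cond:
                        to_delete.append((y, x))
            for y, x in to_delete:
                img[y][x] = 0
            if to_delete:
                changed = True
    return img
```

As the abstract notes, this pass alone does not guarantee a strictly one-pixel-wide result in all configurations, which is why the paper chains a single-pixelation step after it.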


Author(s):  
A Sathesh ◽  
Edriss Eisa Babikir Adam

Image thinning is an essential pre-processing technique that plays a major role in image processing applications such as image analysis and pattern recognition. It is a process that reduces a thick binary image to a thin skeleton. In the present paper, a hybrid parallel thinning algorithm is used to obtain the skeleton of a binary image. The resulting skeleton is one pixel wide, preserves the topological properties of the image, and retains its connectivity.
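One way to verify the connectivity-preservation claim above is to count 8-connected foreground components before and after thinning; a topology-preserving pass must leave the count unchanged. A minimal sketch (function name and representation are illustrative):

```python
from collections import deque

def count_components(img):
    """Count 8-connected foreground components in a binary image
    (list of lists, 1 = foreground). A thinning pass that preserves
    topology must leave this count unchanged."""
    rows, cols = len(img), len(img[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for y in range(rows):
        for x in range(cols):
            if img[y][x] == 1 and not seen[y][x]:
                count += 1
                queue = deque([(y, x)])  # flood-fill this component
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            ny, nx = cy + dy, cx + dx
                            if (0 <= ny < rows and 0 <= nx < cols
                                    and img[ny][nx] == 1 and not seen[ny][nx]):
                                seen[ny][nx] = True
                                queue.append((ny, nx))
    return count
```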


2021 ◽  
Author(s):  
Gábor Karai ◽  
Péter Kardos

Strand proposed a distance-based thinning algorithm for computing surface skeletons on the body-centered cubic (BCC) grid. In this paper, we present two modified versions of this algorithm that are faster than the original and less sensitive to the visiting order of points in the sequential thinning phase. In addition, a novel algorithm capable of producing curve skeletons is also reported.


Author(s):  
Mariya Nazarkevych ◽  
Serhii Dmytruk ◽  
Volodymyr Hrytsyk ◽  
Olha Vozna ◽  
Anzhela Kuza ◽  
...  

Background: Internet of Things systems are actively adopting biometric subsystems. For fast, high-quality recognition in sensory biometric control and management systems, skeletonization methods are applied at the fingerprint recognition stage. The known skeletonization methods of Zhang-Suen and Hilditch are analyzed and compared with Ateb-Gabor filtration combined with a wave skeletonization method, which shows good running time and recognition quality.

Methods: The Zhang-Suen and Hilditch methods and a thinning algorithm based on Ateb-Gabor filtration, all of which form skeletons of biometric fingerprint images, are considered. The proposed thinning algorithm based on Ateb-Gabor filtration showed better efficiency because it builds on a more flexible form of filtering that combines the classical Gabor function with the harmonic Ateb function. This combination makes it possible to form the neighborhood in which the skeleton is built more accurately.

Results: Alongside the known methods, a new Ateb-Gabor filtering algorithm with a wave skeletonization method has been developed; its recognition results are of better quality, improving recognition accuracy by 3 to 10%.

Conclusion: The Zhang-Suen algorithm is a two-pass algorithm: in each iteration it performs two sets of checks during which pixels are removed from the image. It operates on black pixels that have eight neighbors, which means that pixels along the edges of the image are not analyzed. The Hilditch thinning algorithm proceeds in several passes, checking every pixel and deciding whether to change it from black to white when certain conditions are satisfied. Ateb-Gabor filtering provides better performance because it can produce a wider range of curves and more hollow shapes. Numerous experimental studies confirm the effectiveness of the proposed method.
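The Ateb-Gabor filter itself is the authors' construction, but for context, the classical Gabor kernel it generalizes (by replacing the cosine carrier with a periodic Ateb function) can be sketched as follows; all parameter names here are illustrative, not taken from the paper:

```python
import math

def gabor_kernel(size, sigma, theta, lam, psi=0.0, gamma=1.0):
    """Classical real-valued Gabor kernel of shape (size x size).

    sigma: Gaussian envelope width; theta: orientation (radians);
    lam: carrier wavelength; psi: phase offset; gamma: aspect ratio.
    The Ateb-Gabor variant in the paper swaps the cosine carrier
    for a periodic Ateb function; only the classical form is shown.
    """
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            # rotate coordinates into the filter's orientation
            xr = x * math.cos(theta) + y * math.sin(theta)
            yr = -x * math.sin(theta) + y * math.cos(theta)
            envelope = math.exp(-(xr ** 2 + (gamma * yr) ** 2)
                                / (2 * sigma ** 2))
            carrier = math.cos(2 * math.pi * xr / lam + psi)
            row.append(envelope * carrier)
        kernel.append(row)
    return kernel
```

Convolving a fingerprint image with a bank of such kernels at several orientations enhances the ridges before the skeletonization stage.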


Electronics ◽  
2020 ◽  
Vol 9 (4) ◽  
pp. 555
Author(s):  
Rongchun Hu ◽  
Zhenming Peng ◽  
Juan Ma ◽  
Wei Li

The contour thinning algorithm is an imaging algorithm for circular synthetic aperture radar (SAR) that can obtain clear target contours and has been used successfully for circular SAR (CSAR) target recognition. However, the contour thinning imaging algorithm loses some detail when thinning the contour, which needs to be improved. This paper presents an improved contour thinning imaging algorithm based on residual compensation. In this algorithm, the residual image is obtained by subtracting the contour thinning image from the traditional backprojection image. Then, the compensation information is extracted from the residual image by repeatedly applying a gravitation-based speckle reduction algorithm. Finally, the extracted compensation image is superimposed on the contour thinning image to obtain a compensated contour thinning image. The proposed algorithm is demonstrated on the Gotcha dataset, with a convolutional neural network (CNN) used to recognize the target image. The experimental results show that the compensated image yields higher target recognition accuracy than the uncompensated one.
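The subtract-extract-superimpose pipeline described above can be sketched in a few lines. Note the hedge: the paper's gravitation-based speckle reduction is replaced here by a simple repeated 3x3 mean filter as a hypothetical stand-in, and all names are illustrative:

```python
import numpy as np

def compensate_contour(bp_image, contour_image, n_iter=3):
    """Sketch of the residual-compensation pipeline.

    bp_image: traditional backprojection image (2-D array).
    contour_image: contour thinning image of the same shape.
    The paper extracts compensation information with a gravitation-based
    speckle reduction algorithm; a repeated 3x3 mean filter stands in
    for it here (placeholder, not the authors' method).
    """
    residual = bp_image - contour_image        # detail lost by thinning
    extracted = residual.copy()
    for _ in range(n_iter):                    # repeated smoothing passes
        padded = np.pad(extracted, 1, mode="edge")
        extracted = sum(
            padded[dy:dy + extracted.shape[0], dx:dx + extracted.shape[1]]
            for dy in range(3) for dx in range(3)
        ) / 9.0
    # superimpose the extracted compensation on the thinned contour
    return contour_image + extracted
```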


2020 ◽  
Vol 8 (4) ◽  
pp. 393
Author(s):  
I Made Pegi Kurnia Amerta ◽  
I Gede Arta Wibawa

The game of writing letters is an attractive learning medium. Each person's handwriting is different, so a data classification method is needed to match the test data against templates of the alphabet letters. In this journal, template matching with cross-correlation is used for data classification. Before classification, preprocessing is performed in the form of resizing and thresholding to produce a binary image, and a thinning step is also carried out to thin the letters. The thinning algorithm used is Stentiford's. Accuracy testing yielded an average of 70.38%, with the characters H, K, M, and Y consistently misrecognized.
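The cross-correlation matching step described above can be sketched as normalized cross-correlation between the preprocessed glyph and each letter template, picking the best-scoring letter; the function names and image representation are illustrative:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equal-size images.

    Returns a score in [-1, 1]; 1 means a perfect match.
    """
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def classify(glyph, templates):
    """Return the template letter with the highest correlation score.

    templates: dict mapping letter name -> template image of glyph's shape.
    """
    return max(templates, key=lambda letter: ncc(glyph, templates[letter]))
```

Letters with visually similar skeletons (the abstract singles out H, K, M, and Y) are exactly where such a per-pixel score struggles, since small thinning differences shift the correlation peak.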

