Effective offline handwritten text recognition model based on a sequence-to-sequence approach with CNN–RNN networks

Author(s):  
R. Geetha ◽  
T. Thilagam ◽  
T. Padmavathy
2021 ◽  
pp. 1-12
Author(s):  
Fei Long

The difficulty of English text recognition lies in classifying blurred image text and resolving part-of-speech ambiguity, and traditional models show a high error rate on this task. To improve recognition performance, this paper, guided by machine learning ideas, combines the ant colony algorithm with a genetic algorithm to construct an English text recognition model. Exploiting the optimization characteristics of the ant colony intelligent algorithm, a method that uses the ant colony algorithm to solve for central nodes is proposed. Specifically, the ant colony algorithm is used to obtain the feature points in the study area and to determine a reasonable number of them; a uniform grid is then used to select some non-feature points as central nodes of the kernel function; finally, the reasonably distributed central nodes are used for modeling. Experiments are designed to verify the performance of the constructed model, and the results are presented with mathematical statistics in tables and graphs. The research results show that the constructed model performs well.
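The abstract does not give implementation details for the ant-colony selection of central nodes. A minimal sketch under stated assumptions: ants repeatedly sample candidate sets of k central nodes with probability proportional to pheromone, the set minimizing the total point-to-nearest-center distance is kept, and pheromone is evaporated and reinforced on that best set. All function and parameter names here are hypothetical, not the authors'.

```python
import math
import random

def aco_select_centers(points, k, n_ants=10, n_iters=30, rho=0.1, seed=0):
    """Pick k central nodes from candidate points with a basic ant-colony search.

    Each ant samples k distinct candidates weighted by pheromone; the set with
    the lowest total distance from every point to its nearest chosen center is
    remembered, and pheromone evaporates (rate rho) then reinforces that set.
    """
    rng = random.Random(seed)
    tau = [1.0] * len(points)  # one pheromone value per candidate point

    def cost(centers):
        # Total distance from every point to its nearest chosen center.
        return sum(min(math.dist(p, points[c]) for c in centers) for p in points)

    best_set, best_cost = None, float("inf")
    for _ in range(n_iters):
        for _ in range(n_ants):
            weights = tau[:]
            chosen = []
            for _ in range(k):
                # Roulette-wheel selection proportional to pheromone.
                r = rng.random() * sum(weights)
                acc = 0.0
                for i, w in enumerate(weights):
                    acc += w
                    if acc >= r:
                        chosen.append(i)
                        weights[i] = 0.0  # forbid picking the same point twice
                        break
            c = cost(chosen)
            if c < best_cost:
                best_set, best_cost = chosen, c
        # Evaporate, then deposit pheromone on the best set found so far.
        tau = [t * (1.0 - rho) for t in tau]
        for i in best_set:
            tau[i] += 1.0
    return sorted(best_set), best_cost
```

On two well-separated clusters, the search converges to one center per cluster, which is the behavior the paper relies on when it asks for "reasonably distributed" central nodes.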


Author(s):  
Sri. Yugandhar Manchala ◽  
Jayaram Kinthali ◽  
Kowshik Kotha ◽  
Kanithi Santosh Kumar ◽  
Jagilinki Jayalaxmi

2021 ◽  
Author(s):  
Ayan Kumar Bhunia ◽  
Shuvozit Ghose ◽  
Amandeep Kumar ◽  
Pinaki Nath Chowdhury ◽  
Aneeshan Sain ◽  
...  

2020 ◽  
Vol 6 (12) ◽  
pp. 141
Author(s):  
Abdelrahman Abdallah ◽  
Mohamed Hamada ◽  
Daniyar Nurseitov

This article considers the task of handwritten text recognition using attention-based encoder–decoder networks trained on the Kazakh and Russian languages. We have developed a novel deep neural network model based on a fully gated CNN, supported by multiple bidirectional gated recurrent unit (BGRU) layers and attention mechanisms, to manipulate sophisticated features; it achieves a 0.045 Character Error Rate (CER), 0.192 Word Error Rate (WER), and 0.253 Sequence Error Rate (SER) on the first test dataset, and 0.064 CER, 0.240 WER, and 0.361 SER on the second test dataset. Our proposed model is the first to handle handwriting recognition in the Kazakh and Russian languages. Our results confirm the importance of the proposed Attention-Gated-CNN-BGRU approach for handwritten text recognition and indicate that it can lead to statistically significant improvements (p-value < 0.05) in sensitivity (recall) on the test dataset. The proposed method's performance was evaluated on handwritten text databases in three languages: English, Russian, and Kazakh. It demonstrates better results on the Handwritten Kazakh and Russian (HKR) dataset than other well-known models.
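The reported CER and WER figures are edit-distance ratios: the Levenshtein distance between the predicted and reference sequence, normalized by the reference length, computed over characters for CER and over whitespace-split words for WER. A minimal sketch of how these metrics are typically computed (illustrative only, not the authors' evaluation code):

```python
def levenshtein(ref, hyp):
    """Edit distance between two sequences, counting substitutions,
    insertions, and deletions, via the standard two-row DP."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        curr = [i]
        for j, h in enumerate(hyp, 1):
            curr.append(min(prev[j] + 1,              # deletion
                            curr[j - 1] + 1,          # insertion
                            prev[j - 1] + (r != h)))  # substitution
        prev = curr
    return prev[-1]

def cer(ref, hyp):
    """Character Error Rate: character edits / reference length."""
    return levenshtein(ref, hyp) / max(len(ref), 1)

def wer(ref, hyp):
    """Word Error Rate: word edits / reference word count."""
    ref_words, hyp_words = ref.split(), hyp.split()
    return levenshtein(ref_words, hyp_words) / max(len(ref_words), 1)

def ser(refs, hyps):
    """Sequence Error Rate: fraction of sequences with any error at all."""
    return sum(r != h for r, h in zip(refs, hyps)) / max(len(refs), 1)
```

So a reported CER of 0.045 means roughly 4.5 character edits per 100 reference characters, while SER counts a whole line as wrong if even one character differs, which is why SER is always the largest of the three numbers.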
