Fast writer adaptation with style extractor network for handwritten text recognition

2021 ◽  
Author(s):  
Zi-Rui Wang ◽  
Jun Du

Author(s):  
Sri. Yugandhar Manchala ◽  
Jayaram Kinthali ◽  
Kowshik Kotha ◽  
Kanithi Santosh Kumar ◽  
Jagilinki Jayalaxmi

2021 ◽  
Author(s):  
Ayan Kumar Bhunia ◽  
Shuvozit Ghose ◽  
Amandeep Kumar ◽  
Pinaki Nath Chowdhury ◽  
Aneeshan Sain ◽  
...  

2020 ◽  
Vol 6 (12) ◽  
pp. 141
Author(s):  
Abdelrahman Abdallah ◽  
Mohamed Hamada ◽  
Daniyar Nurseitov

This article considers the task of handwritten text recognition using attention-based encoder–decoder networks trained on the Kazakh and Russian languages. We have developed a novel deep neural network model based on a fully gated CNN, supported by multiple bidirectional gated recurrent unit (BGRU) layers and attention mechanisms to extract sophisticated features. The model achieves a 0.045 Character Error Rate (CER), 0.192 Word Error Rate (WER), and 0.253 Sequence Error Rate (SER) on the first test dataset, and 0.064 CER, 0.240 WER, and 0.361 SER on the second test dataset. Our proposed model is the first to address handwriting recognition in the Kazakh and Russian languages. Our results confirm the importance of the proposed Attention-Gated-CNN-BGRU approach for handwritten text recognition and indicate that it can lead to statistically significant improvements (p-value < 0.05) in sensitivity (recall) on the test datasets. The proposed method’s performance was evaluated on handwritten text databases in three languages: English, Russian, and Kazakh. It demonstrates better results on the Handwritten Kazakh and Russian (HKR) dataset than other well-known models.
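
To illustrate the kind of architecture the abstract describes, below is a minimal sketch, not the authors' released code, of an attention-based encoder–decoder for text-line recognition: a gated-CNN feature extractor, a bidirectional GRU (BGRU) encoder, and an additive-attention decoder. All layer sizes, the vocabulary size, and the module names are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch of an Attention-Gated-CNN-BGRU recognizer (PyTorch).
# Shapes and hyperparameters are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GatedConvBlock(nn.Module):
    """Gated convolution: output = conv(x) * sigmoid(gate(x))."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, 3, padding=1)
        self.gate = nn.Conv2d(in_ch, out_ch, 3, padding=1)

    def forward(self, x):
        return self.conv(x) * torch.sigmoid(self.gate(x))


class BGRUEncoder(nn.Module):
    """Collapses the image height and encodes the width axis with a BGRU."""
    def __init__(self, feat_ch, hidden):
        super().__init__()
        self.rnn = nn.GRU(feat_ch, hidden, num_layers=2,
                          bidirectional=True, batch_first=True)

    def forward(self, feats):                 # feats: (B, C, H, W)
        feats = feats.mean(dim=2)             # pool over height -> (B, C, W)
        feats = feats.permute(0, 2, 1)        # -> (B, W, C)
        out, _ = self.rnn(feats)              # -> (B, W, 2*hidden)
        return out


class AttentionDecoder(nn.Module):
    """One decoding step with additive (Bahdanau-style) attention."""
    def __init__(self, enc_dim, hidden, vocab_size):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.attn = nn.Linear(enc_dim + hidden, 1)
        self.rnn = nn.GRUCell(enc_dim + hidden, hidden)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, prev_token, state, enc_out):
        emb = self.embed(prev_token)                        # (B, hidden)
        # score each encoder position against the current decoder state
        scores = self.attn(torch.cat(
            [enc_out, state.unsqueeze(1).expand(-1, enc_out.size(1), -1)],
            dim=-1)).squeeze(-1)                            # (B, W)
        weights = F.softmax(scores, dim=-1)
        context = (weights.unsqueeze(-1) * enc_out).sum(1)  # (B, enc_dim)
        state = self.rnn(torch.cat([context, emb], dim=-1), state)
        return self.out(state), state


# Usage with illustrative shapes: a grayscale text-line image of height 64.
cnn = nn.Sequential(GatedConvBlock(1, 32), nn.MaxPool2d(2),
                    GatedConvBlock(32, 64), nn.MaxPool2d(2))
enc = BGRUEncoder(feat_ch=64, hidden=128)
dec = AttentionDecoder(enc_dim=256, hidden=128, vocab_size=80)

img = torch.randn(2, 1, 64, 256)
enc_out = enc(cnn(img))
state = torch.zeros(2, 128)
logits, state = dec(torch.zeros(2, dtype=torch.long), state, enc_out)
```

At inference, the decoder step would be repeated, feeding back the argmax (or beam-search) token until an end-of-sequence symbol is produced; CER, WER, and SER are then computed by edit distance between the decoded and reference transcriptions at the character, word, and full-sequence level respectively.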


2020 ◽  
Vol 14 (11) ◽  
pp. 2291-2300
Author(s):  
Mujtaba Husnain ◽  
Malik Muhammad Saad Missen ◽  
Shahzad Mumtaz ◽  
Mickaël Coustaty ◽  
Muzzamil Luqman ◽  
...  
