Pen-Based Input for On-Line Handwritten Notation

Author(s):  
Susan E. George

This chapter is concerned with a novel pen-based interface for handwritten music notation. It surveys the current scope of on-line (or dynamic) handwritten input of music notation and presents the outstanding problems in recognition. A solution using the multi-layer perceptron artificial neural network is then described, with experiments in music symbol recognition from a study in which some 25 people wrote notation using a pressure-sensitive digitiser for input. Results suggest that a voting system among networks, each trained to recognize an individual symbol, produces the best recognition rate: on the order of 92% for correctly recognizing a positive example of a symbol and 98% for correctly rejecting a negative example. The chapter then discusses how this approach can be used in an interface for a pen-based music editor. The motivation for this chapter includes (i) the practical need for a pen-based interface capable of recognizing unconstrained handwritten music notation, (ii) the theoretical challenges that such a task presents for pattern recognition, and (iii) the outstanding neglect of this topic in both academic and commercial respects.
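The per-symbol voting scheme described above can be sketched as follows. This is a minimal illustration, not the chapter's actual system: the symbol classes, feature dimensionality, threshold, and the stand-in scoring networks are all assumptions for demonstration; the chapter's real MLPs would be trained on digitiser stroke features.

```python
import numpy as np

rng = np.random.default_rng(0)

SYMBOLS = ["treble_clef", "quarter_note", "sharp"]  # hypothetical classes

class SymbolNet:
    """Tiny stand-in for one per-symbol MLP: scores how likely a feature
    vector is a positive example of its single symbol."""
    def __init__(self, n_features):
        self.w = rng.normal(size=n_features)  # untrained weights, for illustration
        self.b = 0.0
    def score(self, x):
        return 1.0 / (1.0 + np.exp(-(x @ self.w + self.b)))  # sigmoid output

def vote(nets, x, threshold=0.5):
    """Each network votes only on its own symbol; the highest-scoring
    network wins, but only if it clears the acceptance threshold,
    otherwise the input is rejected as none of the known symbols."""
    scores = {sym: net.score(x) for sym, net in nets.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None

nets = {sym: SymbolNet(8) for sym in SYMBOLS}
x = rng.normal(size=8)   # stand-in feature vector for one handwritten stroke
result = vote(nets, x)   # one of SYMBOLS, or None if every network rejects
```

One design point this structure captures: because each network answers only "is this my symbol?", the ensemble can both accept positives (the 92% figure) and reject negatives (the 98% figure) independently per symbol.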

2020 ◽  
pp. 002029402096482
Author(s):  
Sulaiman Khan ◽  
Abdul Hafeez ◽  
Hazrat Ali ◽  
Shah Nazir ◽  
Anwar Hussain

This paper presents an efficient OCR system for the recognition of offline isolated Pashto characters. The lack of an appropriate dataset makes it challenging to match against a reference and perform recognition. This work addresses the problem by developing a medium-sized database comprising 4488 samples of handwritten Pashto characters, which can be used for further experimental purposes. In the proposed OCR system, the recognition task is performed using a convolutional neural network (CNN). The performance of the proposed system is validated by comparing its results against an artificial neural network and a support vector machine based on the zoning feature extraction technique. The experiments show an accuracy of 56% for the support vector machine, 78% for the artificial neural network, and 80.7% for the proposed OCR system. The higher recognition rate shows that the CNN-based OCR system performs best among the techniques considered.
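The zoning feature extraction used for the SVM and ANN baselines can be sketched as below: the character image is divided into a grid of zones and the foreground-pixel density of each zone becomes one feature. The 4×4 grid size and the toy image here are assumptions for illustration; the paper does not specify its grid dimensions in this abstract.

```python
import numpy as np

def zoning_features(img, zones=(4, 4)):
    """Split a binary character image into a zones[0] x zones[1] grid and
    return the mean foreground-pixel density of each zone as a feature."""
    h, w = img.shape
    zh, zw = zones
    feats = []
    for i in range(zh):
        for j in range(zw):
            zone = img[i * h // zh:(i + 1) * h // zh,
                       j * w // zw:(j + 1) * w // zw]
            feats.append(zone.mean())
    return np.array(feats)

# Toy 8x8 "character" with only the left half inked:
img = np.zeros((8, 8))
img[:, :4] = 1.0
f = zoning_features(img)   # 16-dimensional feature vector
```

Left-half zones have density 1.0 and right-half zones 0.0, so the feature vector cleanly encodes where ink falls, which is what the classifier then learns from.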


Sensors ◽  
2021 ◽  
Vol 21 (15) ◽  
pp. 5188
Author(s):  
Mitsugu Hasegawa ◽  
Daiki Kurihara ◽  
Yasuhiro Egami ◽  
Hirotaka Sakaue ◽  
Aleksandar Jemcov

An artificial neural network (ANN) was constructed and trained to predict pressure sensitivity from an experimental dataset consisting of luminophore content and paint thickness as the chemical and physical inputs. A data augmentation technique was used to increase the number of data points from the limited experimental observations. The prediction accuracy of the trained ANN was evaluated using the mean absolute percentage error (MAPE). The ANN predicted pressure sensitivity with respect to luminophore content and paint thickness within confidence intervals based on experimental errors. The present approach of applying an ANN with data augmentation has the potential to predict pressure-sensitive paint (PSP) characterizations and thereby improve the performance of PSP for global surface pressure measurements.
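The two supporting pieces of the pipeline above, augmenting a small experimental dataset and scoring predictions by MAPE, can be sketched as follows. The noise-injection scheme, noise scale, and the toy numbers are assumptions for illustration; the paper's exact augmentation method is not detailed in this abstract, and only MAPE as the metric is taken from it.

```python
import numpy as np

rng = np.random.default_rng(1)

def augment(X, y, copies=20, noise=0.02):
    """Noise-injection augmentation (a sketch): replicate each experimental
    point `copies` times with small Gaussian perturbations on the inputs,
    keeping the measured target unchanged."""
    Xa = np.repeat(X, copies, axis=0)
    ya = np.repeat(y, copies)
    Xa = Xa * (1.0 + rng.normal(scale=noise, size=Xa.shape))
    return Xa, ya

def mape(y_true, y_pred):
    """Mean absolute percentage error, the paper's accuracy metric."""
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

# Toy inputs: [luminophore content, paint thickness] -> pressure sensitivity
X = np.array([[1.0, 10.0], [2.0, 12.0], [3.0, 15.0]])
y = np.array([0.50, 0.62, 0.70])
Xa, ya = augment(X, y)        # 3 points become 60 jittered points
err = mape(y, y * 1.05)       # a uniformly 5%-high prediction -> MAPE of 5.0
```

Scaling the injected noise to the known experimental error is what lets the trained network's predictions be compared against confidence intervals, as the abstract describes.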


Author(s):  
Jung-eui Hong ◽  
Cihan H. Dagli ◽  
Kenneth M. Ragsdell

Abstract The primary function of the Wheatstone bridge is to measure an unknown resistance. The elements of this well-known measurement circuit take on different values depending upon the range and accuracy required for a particular application. The Taguchi approach to parameter design is used to select values for the measurement circuit elements so as to reduce measurement error. Next, we introduce the use of an artificial neural network to extrapolate limited experimental results and predict system response over a wide range of applications. This approach can be employed for on-line quality control in the manufacture of such devices.
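The bridge's primary function mentioned above rests on the standard balance condition: when the galvanometer current is zero, the arm ratios are equal, so the unknown resistance follows from the three known arms. The arm labeling below is the common textbook convention, an assumption not stated in the abstract.

```python
def unknown_resistance(r1, r2, r3):
    """At balance (zero galvanometer current) the Wheatstone bridge
    satisfies R1/R2 = R3/Rx, so the unknown arm is Rx = R2 * R3 / R1.
    Arm labeling follows the usual textbook convention."""
    return r2 * r3 / r1

# Example: R1 = 100 ohm, R2 = 200 ohm, R3 = 150 ohm at balance:
rx = unknown_resistance(100.0, 200.0, 150.0)  # 300.0 ohm
```

Because the result depends only on resistance ratios, the measurement error is driven by the tolerances of the three known arms, which is precisely what the Taguchi parameter design in the paper targets.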

