A Graphics-Recognition System for Interpretation of Line Drawings

2020, pp. 37-72
Author(s): Sing-tze Bow, Rangachar Kasturi
2020, Vol 2020, pp. 1-13
Author(s): Shouxia Wang, Shuxia Wang, Weiping He

Multistroke drawing occurs frequently in conceptual design sketches, yet it is largely unsupported by current sketch-based user interfaces. We propose a sketch recognition system based on a multistroke primitive grouping method: by grouping the strokes that lie within the mutual boundaries between adjacent regions, we create line drawings from online freehand axonometric sketches of mechanical models. First, the closed regions of the sketch and their boundary bands are extracted. Then, strokes that cross the boundary bands of two or more closed regions are segmented, and strokes that lie within the intersection of two adjacent boundary bands are grouped. Finally, each group of strokes is simplified into a single new stroke and fitted as a geometric primitive; thus, the input sketches are recognized as line drawings. We developed a prototype of the sketch recognition system to evaluate the proposed method. The results showed that input sketches are simplified into accurate line drawings efficiently. The proposed method can be applied to both overtraced and non-overtraced multistroke sketches.
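The group-then-fit pipeline the abstract describes can be sketched in a few lines of Python, under simplifying assumptions: point-wise proximity between strokes stands in for the paper's boundary-band intersection test, and only straight-line primitives are fitted (the function names and the `band_width` tolerance are illustrative, not from the paper).

```python
import math

def group_strokes(strokes, band_width=0.1):
    """Greedily group strokes: a stroke joins a group when every one of its
    points lies within `band_width` of some stroke already in that group.
    This is a crude stand-in for the boundary-band intersection test."""
    groups = []
    for s in strokes:
        placed = False
        for g in groups:
            if any(_close(s, t, band_width) for t in g):
                g.append(s)
                placed = True
                break
        if not placed:
            groups.append([s])
    return groups

def _close(s, t, tol):
    # True when every point of stroke s is within tol of some point of stroke t
    return all(min(math.dist(p, q) for q in t) <= tol for p in s)

def fit_primitive(group):
    """Merge a group's points into one point set and fit a straight-line
    primitive by PCA; returns (centroid, unit direction vector)."""
    pts = [p for stroke in group for p in stroke]
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    sxx = sum((x - mx) ** 2 for x, _ in pts)
    syy = sum((y - my) ** 2 for _, y in pts)
    sxy = sum((x - mx) * (y - my) for x, y in pts)
    # principal-axis angle of the 2x2 covariance matrix
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)
    return (mx, my), (math.cos(theta), math.sin(theta))

# Two overtraced near-horizontal strokes collapse into one line primitive;
# a distant stroke stays in its own group.
s1 = [(0, 0), (1, 0.02), (2, 0)]
s2 = [(0, 0.03), (1, 0), (2, 0.02)]
s3 = [(5, 5), (5, 6)]
groups = group_strokes([s1, s2, s3], band_width=0.2)
centroid, direction = fit_primitive(groups[0])
```

A real implementation would also segment strokes that cross several boundary bands before grouping, and fit arcs and ellipses as well as lines; this sketch only shows the grouping-and-fitting step.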


2019, Vol 4 (6), pp. 1482-1488
Author(s): Jennifer J. Thistle

Purpose: Previous research with children with and without disabilities has demonstrated that visual–perceptual factors can influence the speed of locating a target on an array. Adults without disabilities often facilitate the learning and use of a child's augmentative and alternative communication system. The current research examined, in 2 studies, how the presence of symbol background color influenced the speed with which adults without disabilities located target line drawings. Method: Both studies used a between-subjects design. In the 1st study, 30 adults (ages 18–29 years) located targets in a 16-symbol array. In the 2nd study, 30 adults (ages 18–34 years) located targets in a 60-symbol array. There were 3 conditions in each study: symbol background color, symbol background white with a black border, and symbol background white with a color border. Results: In the 1st study, reaction times across groups were not significantly different. In the 2nd study, participants in the symbol background color condition were significantly faster than participants in the other conditions, and participants in the symbol background white with black border condition were significantly slower than participants in the other conditions. Conclusion: Communication partners may benefit from the presence of background color, especially when supporting children using displays with many symbols.


Author(s): Robert J. Hartsuiker, Lies Notebaert

A picture naming experiment in Dutch tested whether disfluencies in speech can arise from difficulties in lexical access. Speakers described networks consisting of line drawings and paths connecting these drawings, and we manipulated picture name agreement. Consistent with our hypothesis, there were more pauses and more self-corrections in the low name agreement condition than the high name agreement condition, but there was no effect on repetitions. We also considered determiner frequency. There were more self-corrections and more repetitions when the picture name required the less frequent (neuter-gender) determiner “het” than the more frequent (common-gender) determiner “de”. These data suggest that difficulties in distinct stages of language production result in distinct patterns of disfluencies.


Author(s): Toby J. Lloyd-Jones, Juergen Gehrke, Jason Lauder

We assessed the importance of outline contour and individual features in mediating the recognition of animals by examining response times and eye movements in an animal-object decision task (i.e., deciding whether or not an object was an animal that may be encountered in real life). There were shorter latencies for animals as compared with nonanimals and performance was similar for shaded line drawings and silhouettes, suggesting that important information for recognition lies in the outline contour. The most salient information in the outline contour was around the head, followed by the lower torso and leg regions. We also observed effects of object orientation and argue that the usefulness of the head and lower torso/leg regions is consistent with a role for the object axis in recognition.


2018, Vol 1 (2), pp. 34-44
Author(s): Faris E. Mohammed, Eman M. ALdaidamony, A. M. Raid

Individual identification is a significant process that underlies many day-to-day activities. Identification is required in workplaces, private zones, banks, etc. Individuals have many characteristics that can be used for recognition, such as finger vein, iris, and face patterns. Finger vein and iris key-points are considered among the most promising biometric authentication techniques for their security and convenience. SIFT is a relatively new and promising technique for pattern recognition. However, many related techniques suffer from shortcomings such as feature loss, difficult feature key-point extraction, and the introduction of noise points. In this manuscript, a new technique, SIFT-based iris and SIFT-based finger vein identification with normalization and enhancement, is proposed to achieve better performance. Compared with other SIFT-based iris or finger vein recognition algorithms, the suggested technique overcomes the difficulties of key-point extraction and excludes noise points without feature loss. Experimental results demonstrate that the normalization and enhancement steps are critical for SIFT-based iris and finger vein recognition, and that the proposed technique achieves satisfactory recognition performance. Keywords: SIFT, Iris Recognition, Finger Vein Identification, Biometric Systems. © 2018 JASET, International Scholars and Researchers Association
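One standard way a SIFT-based matcher excludes ambiguous (noisy) key-points, of the kind the abstract alludes to, is Lowe's ratio test: a descriptor match is kept only when its nearest neighbour is clearly closer than its second-nearest. The sketch below applies the test to toy 2-D descriptors; real SIFT descriptors are 128-dimensional, and the specific normalization and enhancement steps of the proposed technique are not reproduced here.

```python
import math

def match_descriptors(desc_a, desc_b, ratio=0.75):
    """Lowe's ratio test: for each descriptor in desc_a, keep the match to
    its nearest neighbour in desc_b only when that neighbour is clearly
    closer than the second nearest, discarding ambiguous key-points."""
    matches = []
    for i, da in enumerate(desc_a):
        # distances from da to every candidate in desc_b, nearest first
        dists = sorted((math.dist(da, db), j) for j, db in enumerate(desc_b))
        if len(dists) >= 2 and dists[0][0] < ratio * dists[1][0]:
            matches.append((i, dists[0][1]))
    return matches

# Two distinctive descriptors match; the third sits between two candidates
# in desc_b and is rejected as ambiguous.
matches = match_descriptors(
    [(0, 0), (10, 10), (2.5, 2.5)],
    [(0.1, 0), (10, 10.1), (5, 5)],
)
```

The ratio threshold (0.75 here) trades recall for precision: lowering it rejects more borderline matches, which is how this style of matcher "excludes noise points" at the cost of discarding some genuine correspondences.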

