Design and Research of Human-Machine Interaction System of Embedded Precision CNC Internal Grinder

2011 ◽  
Vol 230-232 ◽  
pp. 136-139
Author(s):  
Ou Xie ◽  
Hua Li ◽  
Zhen Yin

A touch-screen-based human-machine interaction system for an embedded precision CNC internal grinder is proposed. The system adopts a master-slave two-stage control mode. Through purpose-built interface software, it achieves integrated interactive control of the internal grinding process. The human-machine interaction performance is improved, and machining efficiency and communication capability are increased.
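The abstract describes a master-slave split between a touch-screen HMI (master) and the grinder's motion controller (slave). The sketch below is purely illustrative: the frame layout, header byte, command codes, and parameters are assumptions rather than the protocol used in the paper; it only shows how an HMI command might be framed by the master and parsed on the slave side.

```python
# Minimal sketch of a master-slave command exchange for a touch-screen HMI.
# Frame layout, header byte, command codes, and checksum are illustrative
# assumptions, not the protocol described in the paper.

import struct

CMD_START_CYCLE = 0x01   # hypothetical command: start an internal-grinding cycle
CMD_JOG_AXIS    = 0x02   # hypothetical command: jog an axis from the HMI

def build_frame(cmd: int, payload: bytes) -> bytes:
    """Master (HMI) side: wrap a command and payload with length and checksum."""
    body = struct.pack("<BB", cmd, len(payload)) + payload
    checksum = sum(body) & 0xFF
    return b"\xAA" + body + bytes([checksum])      # 0xAA = assumed frame header

def parse_frame(frame: bytes):
    """Slave (motion controller) side: validate and unpack a frame."""
    if frame[0] != 0xAA or (sum(frame[1:-1]) & 0xFF) != frame[-1]:
        raise ValueError("bad header or checksum")
    cmd, length = struct.unpack_from("<BB", frame, 1)
    return cmd, frame[3:3 + length]

# Example: the HMI requests a grinding cycle with wheel speed (rpm) and infeed (um).
payload = struct.pack("<HH", 24000, 5)
cmd, data = parse_frame(build_frame(CMD_START_CYCLE, payload))
print(cmd, struct.unpack("<HH", data))             # -> 1 (24000, 5)
```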

Nano Energy ◽  
2019 ◽  
Vol 64 ◽  
pp. 103953 ◽  
Author(s):  
Baosen Zhang ◽  
Yingjie Tang ◽  
Ranran Dai ◽  
Hongyi Wang ◽  
Xiupeng Sun ◽  
...  

2020 ◽  
Vol 20 (24) ◽  
pp. 14950-14957
Author(s):  
Shuqin Yang ◽  
Li Xing ◽  
Wenhui Liu ◽  
Weixing Qian ◽  
Wenyan Qian ◽  
...  

2020 ◽  
Vol 12 (4) ◽  
pp. 1016-1046
Author(s):  
Hanif Fakhrurroja ◽  
Carmadi Machbub ◽  
Ary Setijadi Prihatmanto ◽  
Ayu Purwarianti ◽  
...  

Studies on human-machine interaction systems show positive results in terms of system accuracy. However, problems remain, especially when using input modalities such as speech, gesture, face detection, and skeleton tracking. These problems include how to design an interface that lets a machine contextualize an ongoing conversation, how to activate the system through various modalities, how to choose the right multimodal fusion method, how the machine can understand human intentions, and how to develop its knowledge. This study developed a method for a multimodal human-machine interaction system. It involved several stages: a multimodal activation system; recognition methods for speech, gestures, face detection, and skeleton tracking; multimodal fusion strategies; understanding human intent and an Indonesian dialogue system; and methods for developing the machine's knowledge and selecting the right response. The research contributes to an easier and more natural human-machine interaction system based on multimodal fusion. The average accuracy of multimodal activation, the Indonesian dialogue system, gesture-recognition interaction, and multimodal fusion was 87.42%, 92.11%, 93.54%, and 93%, respectively. User satisfaction with the developed multimodal recognition-based human-machine interaction system was 95%; 76.2% of users found the interaction natural, and 79.4% agreed that the machine responded well to their wishes.
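As an illustration of decision-level fusion of the kind described above, the following sketch combines per-modality recognition results by weighted confidence voting. The modality weights, intent labels, and fusion rule are illustrative assumptions and are not the fusion strategy reported in the study.

```python
# Minimal sketch of decision-level multimodal fusion, assuming each recognizer
# (speech, gesture, face, skeleton) returns an intent label with a confidence.
# Weights and labels are illustrative, not those used in the study.

from collections import defaultdict

MODALITY_WEIGHTS = {"speech": 0.4, "gesture": 0.3, "face": 0.15, "skeleton": 0.15}

def fuse(observations):
    """observations: list of (modality, intent_label, confidence in [0, 1])."""
    scores = defaultdict(float)
    for modality, label, conf in observations:
        scores[label] += MODALITY_WEIGHTS.get(modality, 0.0) * conf
    # Return the intent with the highest weighted score.
    return max(scores.items(), key=lambda kv: kv[1])

# Example: speech and gesture agree on "activate"; the face detector is unsure.
obs = [("speech", "activate", 0.92),
       ("gesture", "activate", 0.88),
       ("face", "idle", 0.40)]
print(fuse(obs))   # -> ('activate', 0.632)
```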


2016 ◽  
pp. 670-704
Author(s):  
Igor Bisio ◽  
Alessandro Delfino ◽  
Fabio Lavagetto ◽  
Mario Marchese

Human-machine interaction is performed through devices such as the keyboard, the touch screen, or speech-to-text applications; a speech-to-text application, for example, is software that allows the device to translate spoken words into text. These tools translate explicit messages but ignore implicit ones, such as the emotional state of the speaker, filtering out a portion of the information available in the interaction process. This chapter focuses on emotion detection. An emotion-aware device can interact more personally with its owner and react appropriately to the user's mood, making the user-machine interaction less stressful. The chapter gives guidelines for building emotion-aware smartphone applications in an opportunistic way (i.e., without the user's collaboration). Since smartphone applications may be employed in different contexts, the emotions to be detected may also differ.
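To make the opportunistic pipeline concrete, the sketch below frames an audio signal, extracts simple prosodic features (energy and zero-crossing rate), and assigns a coarse emotional state with a nearest-centroid rule. The feature set, centroid values, and labels are illustrative assumptions; the chapter's actual guidelines may use different features and classifiers.

```python
# Minimal sketch of an opportunistic speech-emotion pipeline: frame the audio,
# extract simple prosodic features, and classify with nearest-centroid.
# Features, centroids, and labels are illustrative assumptions.

import numpy as np

def prosodic_features(signal, sr=16000, frame_ms=25):
    """Per-utterance energy statistics and mean zero-crossing rate over short frames."""
    hop = int(sr * frame_ms / 1000)
    frames = [signal[i:i + hop] for i in range(0, len(signal) - hop, hop)]
    energy = np.array([np.mean(f ** 2) for f in frames])
    zcr = np.array([np.mean(np.abs(np.diff(np.sign(f)))) / 2 for f in frames])
    return np.array([energy.mean(), energy.std(), zcr.mean()])

# Hypothetical centroids, assumed to be learned offline, for two coarse states.
CENTROIDS = {"calm": np.array([0.01, 0.005, 0.05]),
             "aroused": np.array([0.08, 0.04, 0.20])}

def classify(signal):
    feats = prosodic_features(signal)
    return min(CENTROIDS, key=lambda label: np.linalg.norm(feats - CENTROIDS[label]))

# Example on synthetic audio: a louder, noisier signal maps to the "aroused" centroid.
rng = np.random.default_rng(0)
print(classify(0.3 * rng.standard_normal(16000)))   # -> likely "aroused"
```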

