Preliminary classification of cognitive load states in a human machine interaction scenario

Author(s):  
Andreas Oschlies-Strobel ◽  
Sascha Gruss ◽  
Lucia Jerg-Bretzke ◽  
Steffen Walter ◽  
Dilana Hazer-Rau
2021 ◽  
Vol 8 ◽  
Author(s):  
Franz A. Van-Horenbeke ◽  
Angelika Peer

Recognizing the actions, plans, and goals of a person in an unconstrained environment is a key capability that future robotic systems will need in order to achieve natural human-machine interaction. Indeed, we humans constantly understand and predict the actions and goals of others, which allows us to interact in intuitive and safe ways. While action and plan recognition are tasks that humans perform naturally and with little effort, they remain unresolved problems from the point of view of artificial intelligence. The immense variety of possible actions and plans that may be encountered in an unconstrained environment leaves current approaches far from human-like performance. In addition, while very different types of algorithms have been proposed to tackle the problem of activity, plan, and goal (intention) recognition, these tend to focus on only one part of the problem (e.g., action recognition), and techniques that address the problem as a whole have not been thoroughly explored. This review provides a general view of the problem of activity, plan, and goal recognition as a whole. It describes the problem from both the human and the computational perspective, and proposes a classification of the main types of approaches that have been proposed to address it (logic-based, classical machine learning, deep learning, and brain-inspired), together with a description and comparison of the classes. This general view of the problem can help identify research gaps, and may also inspire the development of new approaches that address the problem in a unified way.


2021 ◽  
pp. 1-9
Author(s):  
Harshadkumar B. Prajapati ◽  
Ankit S. Vyas ◽  
Vipul K. Dabhi

Facial expression recognition (FER) has attracted much attention from researchers in the field of computer vision because of its usefulness in security, robotics, and HMI (Human-Machine Interaction) systems. We propose a CNN (Convolutional Neural Network) architecture to address FER. To show the effectiveness of the proposed model, we evaluate its performance on the JAFFE dataset. We derive a concise CNN architecture to address the task of expression classification. The objective of our experiments is to achieve convincing performance while reducing computational overhead. The proposed CNN model is very compact compared to other state-of-the-art models. We achieved a highest accuracy of 97.10% and an average accuracy of 90.43% over the top 10 best runs without applying any pre-processing methods, which demonstrates the effectiveness of our model. Furthermore, we also include visualizations of the CNN layers to observe what the network learns.
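To illustrate why a compact convolutional stack keeps computational overhead low, the feature-map arithmetic below traces how spatial resolution shrinks through a hypothetical two-block conv/pool stack. The abstract does not specify the model's exact layers or input size, so the 64×64 input, 3×3 kernels, and 2×2 pooling here are assumptions for illustration only:

```python
def conv_out(size, kernel, stride=1, padding=0):
    """Output spatial size of a convolution (or pooling) layer."""
    return (size + 2 * padding - kernel) // stride + 1

size = 64  # assumed input resolution (JAFFE images resized for the sketch)
for kernel, pool in [(3, 2), (3, 2)]:  # two conv + max-pool blocks
    size = conv_out(size, kernel)              # 3x3 conv, no padding
    size = conv_out(size, pool, stride=pool)   # 2x2 max-pool, stride 2
print(size)  # spatial size fed to the dense classifier: 14
```

With, say, 32 channels at this point, the flattened feature vector has 14 × 14 × 32 = 6272 entries, so the first fully connected layer stays small compared to architectures that flatten larger feature maps, which is one common route to the compactness the authors claim.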


Author(s):  
Xiaochen Zhang ◽  
Lanxin Hui ◽  
Linchao Wei ◽  
Fuchuan Song ◽  
Fei Hu

Electric power wheelchairs (EPWs) enhance the mobility of the elderly and the disabled, while the human-machine interaction (HMI) determines how precisely human intention is delivered and how efficiently human-machine cooperation is conducted. A bibliometric quantitative analysis of 1154 publications related to this research field, published between 1998 and 2020, was conducted. We identified the development status, contributors, hot topics, and potential future research directions of the field. We believe that combining intelligence and humanization in an EPW HMI system based on human-machine collaboration is an emerging trend in EPW HMI methodology research. Particular attention should be paid to evaluating the applicability and benefits of EPW HMI methodology for users, as well as how much it contributes to society. This study offers researchers a comprehensive understanding of EPW HMI studies over the past 22 years and the latest trends, drawing on their evolutionary footprints, along with forward-looking insights for future research.

