Articulating Interaction and Task Models for the Design of Advanced Interactive Systems

Author(s): Syrine Charfi, Emmanuel Dubois, Remi Bastide

2019, Vol 9 (19), pp. 4106
Author(s): Ricardo Cruz, Luis A. Pineda

Optimal experience, or flow, is a theory with great impact on user experience. Promoting flow has become a competitive advantage for interactive systems, including rehabilitation systems, and can be achieved through an engaging interface that provides a rewarding experience and motivates the user to use the system again. The theory holds that promoting a state of flow and improving task performance depend heavily on the balance between the challenges posed by the system and the skills deployed by the user. We further claim that a balanced demand of mental and motor skills by the task improves flow and task performance. This paper presents an experiment supporting these claims. For this, we built two movement-interaction rehabilitation systems called SIBMER and Macoli (arm in Náhuatl). Each system has two versions, one with a balanced load of mental and motor skills and the other with an unbalanced one. Both versions are compared in terms of their potential to promote the state of flow and to improve task performance. Results show that a balanced demand of mental and motor skills promotes flow, independently of task complexity. Likewise, the experiment shows a correlation between flow and performance.


Author(s): Qiyang Chen, John Wang

To adapt to users' input and tasks, an interactive system must be able to establish a set of assumptions about users' profiles and task characteristics, often referred to as a user model. However, to develop a user model, an interactive system needs to analyze users' input and recognize the tasks and ultimate goals users are trying to achieve, which may involve a great deal of uncertainty. In this chapter, the approaches for handling uncertainty are reviewed and analyzed. The purpose is to provide an analytical overview and perspective on the major methods that have been proposed to cope with uncertainty.
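The chapter surveys uncertainty-handling approaches rather than prescribing one. As an illustration of one such approach, a Bayesian update over hypothesized user goals, the minimal Python sketch below keeps a probability distribution over candidate goals and revises it as user actions are observed. The goal names, actions, and likelihood values are hypothetical and chosen only for illustration; they are not taken from the chapter.

```python
# Minimal sketch of Bayesian uncertainty handling in a user model (illustrative only).
# The goals, actions, and likelihoods below are hypothetical placeholders.

# Prior belief over what the user is trying to achieve.
belief = {"search_document": 0.4, "edit_document": 0.4, "configure_settings": 0.2}

# P(action | goal): how likely each observable action is under each candidate goal.
likelihood = {
    "typed_query":   {"search_document": 0.7, "edit_document": 0.2, "configure_settings": 0.1},
    "opened_menu":   {"search_document": 0.1, "edit_document": 0.3, "configure_settings": 0.6},
    "selected_text": {"search_document": 0.2, "edit_document": 0.7, "configure_settings": 0.1},
}

def update_belief(belief, action):
    """Apply Bayes' rule: posterior(goal) is proportional to P(action | goal) * prior(goal)."""
    posterior = {goal: likelihood[action][goal] * p for goal, p in belief.items()}
    total = sum(posterior.values())
    return {goal: p / total for goal, p in posterior.items()}

# Revise the belief as actions are observed, then adapt to the most probable goal.
for action in ["typed_query", "selected_text"]:
    belief = update_belief(belief, action)

most_likely_goal = max(belief, key=belief.get)
print(belief, most_likely_goal)
```

A real system would estimate the likelihood table from interaction logs rather than hard-coding it; the fixed values here only keep the example self-contained.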


Author(s): Margreet Vogelzang, Christiane M. Thiel, Stephanie Rosemann, Jochem W. Rieger, Esther Ruigendijk

Purpose: Adults with mild-to-moderate age-related hearing loss typically exhibit issues with speech understanding, but their processing of syntactically complex sentences is not well understood. We test the hypothesis that the difficulties listeners with hearing loss have with the comprehension and processing of syntactically complex sentences are due to the processing of degraded input interfering with the successful processing of complex sentences.
Method: We performed a neuroimaging study with a sentence comprehension task, varying sentence complexity (through subject–object order and verb–arguments order) and cognitive demands (presence or absence of a secondary task) within subjects. Groups of older subjects with hearing loss (n = 20) and age-matched normal-hearing controls (n = 20) were tested.
Results: The comprehension data show effects of syntactic complexity and hearing ability, with normal-hearing controls outperforming listeners with hearing loss, seemingly more so on syntactically complex sentences. The secondary task did not influence off-line comprehension. The imaging data show effects of group, sentence complexity, and task, with listeners with hearing loss showing decreased activation in typical speech processing areas, such as the inferior frontal gyrus and superior temporal gyrus. No interactions between group, sentence complexity, and task were found in the neuroimaging data.
Conclusions: The results suggest that listeners with hearing loss process speech differently from their normal-hearing peers, possibly due to the increased demands of processing degraded auditory input. Increased cognitive demands by means of a secondary visual shape processing task influence neural sentence processing, but no evidence was found that they do so differently for listeners with hearing loss and normal-hearing listeners.


2019, Vol 62 (12), pp. 4417-4432
Author(s): Carola de Beer, Jan P. de Ruiter, Martina Hielscher-Fastabend, Katharina Hogrefe

Purpose: People with aphasia (PWA) use different kinds of gesture spontaneously when they communicate. Although there is evidence that the nature of the communicative task influences the linguistic performance of PWA, so far little is known about its influence on their production of gestures. We aimed to investigate the influence of varying communicative constraints on the production of gesture and spoken expression by PWA in comparison to persons without language impairment.
Method: Twenty-six PWA with varying aphasia severities and 26 control participants (CP) without language impairment participated in the study. Spoken expression and gesture production were investigated in 2 different tasks: (a) spontaneous conversation about topics of daily living and (b) a cartoon narration task, that is, retellings of short cartoon clips. The frequencies of words and gestures as well as of different gesture types produced by the participants were analyzed and tested for potential effects of group and task.
Results: Main results for task effects revealed that PWA and CP used more iconic gestures and pantomimes in the cartoon narration task than in spontaneous conversation. Metaphoric gestures, deictic gestures, number gestures, and emblems were more frequently used in spontaneous conversation than in cartoon narrations by both participant groups. Group effects show that, in both tasks, PWA's gesture-to-word ratios were higher than those of the CP. Furthermore, PWA produced more interactive gestures than the CP in both tasks, as well as more number gestures and pantomimes in spontaneous conversation.
Conclusions: The current results suggest that PWA use gestures to compensate for their verbal limitations under varying communicative constraints. The properties of the communicative task influence the use of different gesture types in people with and without aphasia. Thus, the influence of communicative constraints needs to be considered when assessing PWA's multimodal communicative abilities.


Author(s): Solène Ambrosi, Patrick Lemaire, Agnès Blaye

Abstract. Dynamic, trial-by-trial modulations of inhibitory control are well documented in adults but rarely investigated in children. Here, we examined whether 5- to 7-year-old children, an age range when inhibitory control is still partially immature, achieve such modulations. Fifty-three children performed flanker, Simon, and Stroop tasks. Above and beyond classic congruency effects, the present results showed two crucial findings. First, we found evidence for sequential modulations of congruency effects in these young children in all three conflict tasks. Second, our results showed both task specificities and task commonalities. These findings in young children have important implications, as they suggest that inhibitory control does not require full maturation to be modulated and that the precise pattern of trial-by-trial modulations may depend on the nature of the conflict.


2019, Vol 13 (3), pp. 314-321
Author(s): Hansika Kapoor, Azizuddin Khan