elicitation study
Recently Published Documents


TOTAL DOCUMENTS: 174 (five years: 80)

H-INDEX: 19 (five years: 3)

2021 ◽ Vol 6
Author(s): Francesco-Alessio Ursini, Qi Rao, Yue Sara Zhang

The goal of this paper is to offer an overview of polysemy patterns in Mandarin's chief spatial categories: prepositions (e.g., zai) and simple and compound localisers (respectively, qian and qian-mian). The paper presents data from an elicitation study showing that speakers can access multiple senses and hyponymy relations for the vocabulary items belonging to these categories. The paper shows that while prepositions can potentially cover different spatial relations in the appropriate context (e.g., zai "at"), localisers select increasingly specific senses (e.g., qian "front" and qian-mian "front side"). The paper also shows how speakers can access hyponym-like sense relations emerging from these patterns (e.g., qian-bian covering a more specific sense than qian). Semantic dimensions such as "distance" and "location type" determine the strength of these hyponymy relations. The paper offers an account of these data based on the "semantic maps" model, which captures polysemy and hyponymy patterns via the clusters of locations that items refer to. This novel model is shown to be consistent with previous work on the polysemy of spatial categories and sheds light on how Mandarin offers a unique organisation of this domain.
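
As an illustrative aside, a minimal sketch of the clusters-of-locations idea behind semantic maps: each vocabulary item is modelled as the set of spatial relations it can cover, and hyponymy then falls out as a proper-subset relation between those sets. The location labels and item-to-cluster assignments below are hypothetical examples, not data from the study.

    # Minimal sketch (illustrative only): senses as sets of location types,
    # hyponymy as a proper-subset relation between those sets.
    from typing import Dict, Set

    # Each vocabulary item maps to the cluster of spatial relations it can
    # cover in context (labels are hypothetical, not from the paper).
    SENSE_CLUSTERS: Dict[str, Set[str]] = {
        "zai":       {"general_location", "front", "front_side", "back", "inside", "top"},
        "qian":      {"front", "front_side"},
        "qian-mian": {"front_side"},
    }

    def is_hyponym(specific: str, general: str) -> bool:
        """An item is a hyponym of another if its cluster of locations
        is a proper subset of the other item's cluster."""
        return SENSE_CLUSTERS[specific] < SENSE_CLUSTERS[general]

    print(is_hyponym("qian-mian", "qian"))  # True: compound localiser is more specific
    print(is_hyponym("qian", "zai"))        # True: the preposition covers more relations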


Surgery ◽ 2021
Author(s): Samantha J. Rivard, C. Ann Vitous, Michaela C. Bamdad, Alisha Lussiez, Maia S. Anderson, ...

2021 ◽ Vol 5 (ISS) ◽ pp. 1-23
Author(s): Marco Moran-Ledesma, Oliver Schneider, Mark Hancock

When interacting with virtual reality (VR) applications like CAD and open-world games, people may want to use gestures as a means of leveraging their knowledge from the physical world. However, people may prefer physical props over handheld controllers to input gestures in VR. We present an elicitation study in which 21 participants chose from 95 props to perform manipulative gestures for 20 CAD-like and open-world game-like referents. When analyzing these data, we found that existing methods for elicitation studies were insufficient to describe gestures with props or to measure agreement in prop selection (i.e., agreement between sets of items). We proceeded by describing gestures as context-free grammars, capturing how different props were used in similar roles in a given gesture. We present gesture and prop agreement scores using a generalized agreement score that we developed to compare multiple selections rather than a single selection. We found that props were selected based on their resemblance to virtual objects and the actions they afforded; that gesture and prop agreement depended on the referent, with some referents leading to similar gesture choices while others led to similar prop choices; and that a small set of carefully chosen props can support multiple gestures.
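
As a hedged illustration of how an agreement score over sets of selections (rather than a single selection) might be computed: the sketch below replaces the exact-match test used in standard agreement rates with a pairwise Jaccard similarity between participants' selection sets. This is an assumed formulation for illustration only, not the paper's actual generalized agreement score, and the prop names are hypothetical.

    # Hedged sketch: average pairwise set similarity across participants
    # for one referent, using Jaccard overlap instead of an exact-match test.
    from itertools import combinations
    from typing import List, Set

    def jaccard(a: Set[str], b: Set[str]) -> float:
        """Overlap between two selection sets (1.0 = identical, 0.0 = disjoint)."""
        if not a and not b:
            return 1.0
        return len(a & b) / len(a | b)

    def generalized_agreement(selections: List[Set[str]]) -> float:
        """Mean pairwise similarity over all participant pairs for one referent."""
        pairs = list(combinations(selections, 2))
        if not pairs:
            return 1.0
        return sum(jaccard(a, b) for a, b in pairs) / len(pairs)

    # Hypothetical prop selections from four participants for one referent.
    picks = [{"toy car", "ruler"}, {"toy car"}, {"toy car", "ruler"}, {"block"}]
    print(f"{generalized_agreement(picks):.2f}")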


Symmetry ◽ 2021 ◽ Vol 13 (10) ◽ pp. 1926
Author(s): Yiqi Xiao, Ke Miao, Chenhan Jiang

A stroke is a basic limb movement that both humans and animals perform naturally and repeatedly. Since their introduction into gestural interaction, mid-air stroke gestures have found a wide range of applications and are quite intuitive to use. In this paper, we present an approach for building a command-to-gesture mapping that exploits the semantic association between interactive commands and the directions of mid-air unistroke gestures. Directional unistroke gestures make use of the symmetry in the semantics of commands, which yields a more systematic gesture set for users' cognition and reduces the number of gestures users need to learn. However, the learnability of directional unistroke gestures varies across commands. Through a user elicitation study, a set of eight directional mid-air unistroke gestures was selected based on subjective ratings of how strongly each direction was associated with the corresponding command. We evaluated this gesture set in a follow-up study to investigate learnability, comparing the directional mid-air unistroke gestures with user-preferred freehand gestures. Our findings offer preliminary evidence that "return", "save", "turn-off", and "mute" are the interaction commands most applicable to directional mid-air unistrokes, which may have implications for the design of mid-air gestures in human–computer interaction.
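
As a hedged sketch of how a symmetry-based command-to-gesture mapping could be encoded: opposite commands receive opposite stroke directions, so assigning one half of each pair determines the other. The command pairings and direction assignments below are hypothetical and are not the gesture set elicited in the study.

    # Illustrative sketch: exploit semantic symmetry by giving paired commands
    # opposite mid-air unistroke directions (all assignments are hypothetical).
    from typing import Dict, Tuple

    # Unit direction vectors (dx, dy) for a mid-air unistroke.
    DIRECTIONS: Dict[str, Tuple[int, int]] = {
        "up": (0, 1), "down": (0, -1), "left": (-1, 0), "right": (1, 0),
    }

    def opposite(direction: str) -> str:
        """Return the direction whose vector is the negation of the given one."""
        dx, dy = DIRECTIONS[direction]
        return next(name for name, vec in DIRECTIONS.items() if vec == (-dx, -dy))

    # Assign one direction per command, then derive its semantic opposite's
    # direction from the symmetry of the pair.
    command_to_stroke: Dict[str, str] = {"save": "down", "mute": "left"}
    command_to_stroke["return"] = opposite(command_to_stroke["save"])  # "up"
    command_to_stroke["unmute"] = opposite(command_to_stroke["mute"])  # "right"

    for cmd, stroke in command_to_stroke.items():
        print(f"{cmd:8s} -> stroke {stroke} {DIRECTIONS[stroke]}")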

