Heads up: Head movements during ad exposure respond to consumer goals and predict brand memory

2020 ◽  
Vol 111 ◽  
pp. 281-289
Author(s):  
Rik Pieters ◽  
Michel Wedel

1999 ◽  
Vol 58 (3) ◽  
pp. 170-179 ◽  
Author(s):  
Barbara S. Muller ◽  
Pierre Bovet

Twelve blindfolded subjects localized two different pure tones, played in random order by eight sound sources in the horizontal plane. Subjects either could or could not use the information supplied by their pinnae (external ears) and by their head movements. We found that both pinnae and head movements had a marked influence on auditory localization performance with this type of sound. The effects of pinnae and head movements appeared to be additive; the absence of either factor produced the same loss of localization accuracy and much the same error pattern. Analysis of the head movements showed that subjects turned their faces towards the emitting sound source, except for sources exactly in front or exactly behind, which were identified by turning the head to both sides. Head movement amplitude increased smoothly as the sound source moved from the anterior to the posterior quadrant.
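The abstract reports localization accuracy per source position but, as is usual for an abstract, no analysis code. As a minimal, hypothetical sketch of how angular localization error can be computed for sources arranged in the horizontal plane, the snippet below wraps the difference between source azimuth and response azimuth into (-180, 180] degrees; the speaker layout and the response values are invented for illustration.

```python
import numpy as np

def signed_angular_error(source_az_deg, response_az_deg):
    """Signed angular difference in degrees, wrapped to (-180, 180]."""
    diff = (response_az_deg - source_az_deg + 180.0) % 360.0 - 180.0
    # Map the boundary case -180 to +180 so the interval is (-180, 180]
    return np.where(diff == -180.0, 180.0, diff)

# Hypothetical example: eight loudspeakers spaced 45 degrees apart in the
# horizontal plane and one subject's pointing responses (degrees azimuth).
sources   = np.array([0, 45, 90, 135, 180, 225, 270, 315], dtype=float)
responses = np.array([5, 40, 100, 150, 170, 230, 260, 350], dtype=float)

errors = signed_angular_error(sources, responses)
print("signed errors:", errors)
print("mean absolute error: %.1f deg" % np.abs(errors).mean())
```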


2004 ◽  
Vol 35 (03) ◽  
Author(s):  
P Wagner ◽  
J Cunha ◽  
C Mauerer ◽  
C Vollmar ◽  
B Feddersen ◽  
...  

2003 ◽  
Vol 10 (5) ◽  
pp. 862-869 ◽  
Author(s):  
A. W. Floris Vos ◽  
Matteus A. M. Linsen ◽  
J. Tim Marcus ◽  
Jos C. van den Berg ◽  
Jan Albert Vos ◽  
...  

2007 ◽  
Vol 29 (3) ◽  
pp. 205-212
Author(s):  
Junko Fukushima ◽  
Tadayoshi Asaka ◽  
Natsumi Ikeda ◽  
Yumi Ito

Author(s):  
F. Boehm ◽  
P. J. Schuler ◽  
R. Riepl ◽  
L. Schild ◽  
T. K. Hoffmann ◽  
...  

Microvascular procedures require visual magnification of the surgical field, e.g. by a microscope. This can be accompanied by an unergonomic posture, with musculoskeletal pain or long-term degenerative changes, because the surgeon's eyes are bound to the oculars throughout the procedure. The presented study describes the advantages and drawbacks of a 3D exoscope camera system. The RoboticScope®-system (BHS Technologies®, Innsbruck, Austria) features a high-resolution 3D camera that is placed over the surgical field and a head-mounted display (HMD) to which the camera images are transferred. A motion sensor in the HMD allows hands-free repositioning of the exoscope via head movements. For a general evaluation of the system's functions, coronary artery anastomoses were performed on ex-vivo pig hearts. Second, the system was evaluated in vivo for the anastomosis of a radial forearm free flap in a clinical setting. Positioning the system was possible entirely hands-free using head movements. Camera control was intuitive; visualization of the operation site was adequate and independent of head or body position. Apart from technical instruction by the providing company, no special surgical training of the surgeons or involved staff was necessary before performing the procedures. An ergonomic assessment questionnaire showed a more favorable ergonomic position than surgery with a microscope. The outcome of the operated patient was good, with no intra- or postoperative complications. The exoscope allows changes of head and body position without losing focus on the operation site and supports an ergonomic working position. Repeated applications will have to clarify whether the system is beneficial in clinical routine.
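The abstract states that a motion sensor in the HMD lets the surgeon reposition the camera hands-free via head movements; the actual control law of the RoboticScope® is not described. Purely as an illustration of how such a head-to-camera mapping could work, the hypothetical sketch below applies a dead zone, a proportional gain, and a rate clamp to the head-pose offset; all parameter values are assumptions, not vendor specifications.

```python
from dataclasses import dataclass

@dataclass
class HeadPose:
    yaw: float    # degrees, left/right offset from the neutral position
    pitch: float  # degrees, up/down offset from the neutral position

def head_to_camera_command(pose: HeadPose, dead_zone=2.0, gain=0.5,
                           max_rate=10.0):
    """Map a head-pose offset to camera pan/tilt rates (degrees per second).

    Hypothetical control law: ignore small offsets (dead zone), scale the
    remainder proportionally, and clamp so the camera never moves faster
    than max_rate.
    """
    def axis(offset):
        if abs(offset) < dead_zone:
            return 0.0
        rate = gain * (offset - dead_zone * (1 if offset > 0 else -1))
        return max(-max_rate, min(max_rate, rate))
    return axis(pose.yaw), axis(pose.pitch)

# Example: the surgeon turns the head 8 degrees right and 1 degree up.
pan, tilt = head_to_camera_command(HeadPose(yaw=8.0, pitch=1.0))
print(pan, tilt)  # 3.0 0.0 -> pan right slowly, no tilt
```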


2021 ◽  
Author(s):  
Valentin Holzwarth ◽  
Johannes Schneider ◽  
Joshua Handali ◽  
Joy Gisler ◽  
Christian Hirt ◽  
...  

AbstractInferring users’ perceptions of Virtual Environments (VEs) is essential for Virtual Reality (VR) research. Traditionally, this is achieved through assessing users’ affective states before and after being exposed to a VE, based on standardized, self-assessment questionnaires. The main disadvantage of questionnaires is their sequential administration, i.e., a user’s affective state is measured asynchronously to its generation within the VE. A synchronous measurement of users’ affective states would be highly favorable, e.g., in the context of adaptive systems. Drawing from nonverbal behavior research, we argue that behavioral measures could be a powerful approach to assess users’ affective states in VR. In this paper, we contribute by providing methods and measures evaluated in a user study involving 42 participants to assess a users’ affective states by measuring head movements during VR exposure. We show that head yaw significantly correlates with presence, mental and physical demand, perceived performance, and system usability. We also exploit the identified relationships for two practical tasks that are based on head yaw: (1) predicting a user’s affective state, and (2) detecting manipulated questionnaire answers, i.e., answers that are possibly non-truthful. We found that affective states can be predicted significantly better than a naive estimate for mental demand, physical demand, perceived performance, and usability. Further, manipulated or non-truthful answers can also be estimated significantly better than by a naive approach. These findings mark an initial step in the development of novel methods to assess user perception of VEs.
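The abstract describes predicting questionnaire outcomes from head yaw, but the exact features and models are only given in the paper itself. The sketch below is therefore only a schematic, assumed illustration of the general idea: summarize each participant's yaw trace with a few statistics and fit an ordinary least-squares regression; the traces and scores are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

def yaw_features(yaw_deg):
    """Summary statistics of one head-yaw trace (degrees over time)."""
    return np.array([np.std(yaw_deg),                   # how much the head swept
                     np.mean(np.abs(np.diff(yaw_deg))), # average yaw speed
                     np.ptp(yaw_deg)])                  # total yaw range

# Synthetic stand-in for per-participant yaw traces and questionnaire scores;
# the real study involved 42 participants and standardized questionnaires.
traces = [rng.normal(0, s, size=600) for s in rng.uniform(2, 20, size=42)]
scores = np.array([np.std(t) * 0.3 + rng.normal(0, 0.5) for t in traces])

X = np.vstack([yaw_features(t) for t in traces])
X = np.column_stack([np.ones(len(X)), X])          # add intercept column
coef, *_ = np.linalg.lstsq(X, scores, rcond=None)  # ordinary least squares
pred = X @ coef
print("correlation between predicted and observed scores:",
      np.corrcoef(pred, scores)[0, 1].round(2))
```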

