‘Goats that stare at men’: dwarf goats alter their behaviour in response to human head orientation, but do not spontaneously use head direction as a cue in a food-related context

2014 · Vol 18 (1) · pp. 65–73
Author(s): Christian Nawroth, Eberhard von Borell, Jan Langbein

Author(s): Zhansheng Xiong, Zhenhua Wang, Zheng Wang, Jianhua Zhang

PeerJ · 2017 · Vol 5 · pp. e3073
Author(s): Christian Nawroth, Alan G. McElligott

Animals domesticated for working closely with humans (e.g. dogs) have been shown to be remarkably adept at adjusting their behaviour to human attentional states. However, there is little evidence of this form of information perception in species domesticated for production rather than companionship. We tested domestic ungulates (goats) for their ability to differentiate the attentional states of humans. In the first experiment, we investigated the effect of the body and head orientation of one human experimenter on approach behaviour by goats. Test subjects (N = 24) significantly changed their behaviour when the experimenter turned their back to the subjects, but did not take head orientation alone into account. In the second experiment, goats (N = 24) could choose to approach one of two experimenters, of whom only one was paying attention to them. Goats preferred to approach humans who oriented their body and head towards the subject, whereas head orientation alone had no effect on choice behaviour. In the third experiment, goats (N = 32) were transferred to a separate test arena and, during training trials, were rewarded with food for approaching either of two experimenters. In subsequent probe test trials, goats had to choose between the two experimenters, who differed in their attentional states. As in Experiments 1 and 2, goats did not show a preference for the attentive person when the inattentive person turned their head away from the subject. In this last experiment, goats preferred the attentive person over a person who closed their eyes or covered their whole face with a blind. However, goats showed no preference when one person covered only the eyes. Our results show that animals bred for production rather than companionship adjust their approach and choice behaviour to human attentional state. However, our results contrast with previous findings regarding the use of head orientation to attribute attention, underlining the importance of cross-validating results.


2019 · Vol 122 (3) · pp. 1274–1287
Author(s): Jean Laurens, Dora E. Angelaki

In a recent study, Shinder and Taube (Shinder ME, Taube JS. J Neurophysiol 121: 4–37, 2019) concluded that head direction cells in the anterior thalamus of rats are tuned to one-dimensional (1D, yaw-only) motion, in contrast to recent findings in bats, mice, and rats. Here we reinterpret the authors' experimental results using model comparison and demonstrate that, contrary to their conclusions, the experimental data actually support the dual-axis rule (Page HJI, Wilson JJ, Jeffery KJ. J Neurophysiol 119: 192–208, 2018) and the tilted azimuth model (Laurens J, Angelaki DE. Neuron 97: 275–289, 2018), in which head direction cells use gravity to integrate 3D rotation signals about all cardinal axes of the head. We further show that the Shinder and Taube study is inconclusive regarding the presence of vertical orientation tuning, i.e., whether head direction cells encode 3D orientation in the horizontal and vertical planes conjunctively. Using model simulations, we demonstrate that, even if 3D tuning existed, the experimental protocol and data analyses used by Shinder and Taube would not have revealed it. We conclude that the actual experimental data of Shinder and Taube are compatible with the 3D properties of head direction cells discovered by other groups, and that incorrect conclusions were reached because of incomplete and qualitative analyses. NEW & NOTEWORTHY We conducted a model-based analysis of previously published data in which rat head direction cells were recorded during three-dimensional motion (Shinder ME, Taube JS. J Neurophysiol 121: 4–37, 2019). We found that these data corroborate previous models (the "dual-axis rule," Page HJI, Wilson JJ, Jeffery KJ. J Neurophysiol 119: 192–208, 2018; and the "tilted azimuth model," Laurens J, Angelaki DE. Neuron 97: 275–289, 2018) in which head direction cells integrate rotations about all three head axes to encode head orientation in a gravity-anchored reference frame.
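At their core, both cited models reduce to the same computation: the azimuth (head direction) signal integrates only the component of the head's 3D angular velocity that lies along the earth-vertical (gravity) axis, rather than raw head-frame yaw. The sketch below illustrates that rule numerically; the conventions (a world-from-head rotation matrix, z as earth-vertical) and variable names are illustrative assumptions, not code or notation from either paper.

```python
# Minimal numerical sketch of the dual-axis rule / tilted azimuth idea:
# azimuth integrates only the gravity-axis component of 3D rotation.
import numpy as np

def azimuth_rate(R_world_from_head: np.ndarray,
                 omega_head: np.ndarray) -> float:
    """d(azimuth)/dt: project the head-frame angular velocity (rad/s)
    onto the world's gravity (earth-vertical) axis."""
    omega_world = R_world_from_head @ omega_head  # rotate into world frame
    g_axis = np.array([0.0, 0.0, 1.0])            # earth-vertical axis
    return float(omega_world @ g_axis)

# Head rolled 90 degrees about its naso-occipital (x) axis:
R_tilt = np.array([[1.0, 0.0,  0.0],
                   [0.0, 0.0, -1.0],
                   [0.0, 1.0,  0.0]])

# Head-frame yaw no longer changes azimuth (its axis is now horizontal)...
print(azimuth_rate(R_tilt, np.array([0.0, 0.0, 1.0])))  # -> 0.0
# ...whereas head-frame pitch now rotates the head about earth-vertical.
print(azimuth_rate(R_tilt, np.array([0.0, 1.0, 0.0])))  # -> 1.0
```

Under a 1D, yaw-only model, head-frame yaw would update azimuth regardless of tilt; it is this divergence during tilted rotations that a model comparison can exploit.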


Author(s): Stephanie Tan, David M. J. Tax, Hayley Hung

Human head orientation estimation has attracted interest because head orientation serves as a cue to directed social attention. Most existing approaches rely on visual, high-fidelity sensor inputs and deep learning strategies that do not consider the social context of unstructured and crowded mingling scenarios. We show that alternative inputs, such as speaking status, body location, orientation, and acceleration, contribute to head orientation estimation. These are especially useful in crowded, in-the-wild settings where visual features are either uninformative due to occlusions or prohibitive to acquire due to physical space limitations and concerns of ecological validity. We argue that head orientation estimation in such social settings needs to account for the physically evolving interaction space formed by all the individuals in the group. To this end, we propose an LSTM-based head orientation estimation method that combines the hidden representations of the group members. Our framework jointly predicts the head orientations of all group members and is applicable to groups of different sizes. We examine the contribution of the different modalities to model performance. The proposed model outperforms baseline methods that do not explicitly consider the group context, and it generalizes to an unseen dataset from a different social event.
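As a rough illustration of combining group members' hidden representations, here is a minimal PyTorch sketch. The class name, feature dimensions, pooling choice, and (cos, sin) output encoding are assumptions made for illustration; this is not the authors' released implementation.

```python
# Sketch: joint head orientation estimation with a shared LSTM encoder
# and a pooled group context, under the assumptions noted above.
import torch
import torch.nn as nn

class GroupHeadOrientationLSTM(nn.Module):
    """Jointly predicts head orientation for every member of a group.

    Each member's multimodal feature sequence (e.g. speaking status,
    body location, orientation, acceleration) is encoded by a shared
    LSTM; the hidden states of all members are pooled so that each
    prediction is conditioned on the whole interaction space.
    """
    def __init__(self, feat_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.encoder = nn.LSTM(feat_dim, hidden_dim, batch_first=True)
        # Predict (cos, sin) of the head angle to avoid wrap-around issues.
        self.head = nn.Linear(2 * hidden_dim, 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_members, seq_len, feat_dim)
        b, n, t, f = x.shape
        h, _ = self.encoder(x.reshape(b * n, t, f))       # (b*n, t, hidden)
        h = h[:, -1, :].reshape(b, n, -1)                 # last step per member
        group = h.mean(dim=1, keepdim=True).expand_as(h)  # pooled group context
        out = self.head(torch.cat([h, group], dim=-1))    # (b, n, 2)
        return out / out.norm(dim=-1, keepdim=True)       # unit (cos, sin)

# Toy usage: batch of 2 scenes, 4-person groups, 30 timesteps, 6 features.
model = GroupHeadOrientationLSTM(feat_dim=6)
pred = model(torch.randn(2, 4, 30, 6))
angles = torch.atan2(pred[..., 1], pred[..., 0])  # radians per member
```

Mean pooling makes the group context permutation-invariant and independent of group size, which is one simple way to satisfy the requirement that the model apply to groups of different sizes.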


2016 · Vol 19 (3) · pp. 667–672
Author(s): Christian Nawroth, Eberhard von Borell, Jan Langbein

2011 · Vol 82 (3) · pp. 165–176
Author(s): Jennifer L. Botting, Mallory L. Wiper, James R. Anderson
