Multimodal Human Localization Using Bayesian Network Sensor Fusion

2007 ◽  
pp. 194-221 ◽  
Author(s):  
David Lo

In applications where the locations of human subjects are needed, for example human-computer interfaces, video conferencing, and security surveillance, localization is often performed using a single sensing modality. These single-modality approaches, such as beamforming microphone arrays and video-graphical localization techniques, are often prone to errors. In this chapter, a modular multimodal localization framework was constructed by combining multiple single-modality localizers using a Bayesian network. As a case study, a joint audio-video talker localization system for the video conferencing application was presented. Based on the results, the proposed multimodal localization method outperforms, in terms of accuracy and robustness, mono-modal methods that rely only on audio or video.
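The chapter's actual Bayesian network is not reproduced here, but the core fusion idea can be illustrated with a minimal sketch: treat the audio (beamforming) and video estimates as conditionally independent noisy measurements of the talker's position and combine them with Bayes' rule over a discrete grid. The azimuth grid, Gaussian noise parameters, and measurement values below are illustrative assumptions, not the chapter's model.

```python
# Minimal sketch of Bayesian fusion of two independent localization
# modalities (audio beamforming and video-based detection).  The grid,
# noise parameters, and measurements are illustrative assumptions.
import numpy as np

def gaussian_likelihood(grid, measurement, sigma):
    """P(measurement | true position) for each candidate position."""
    return np.exp(-0.5 * ((grid - measurement) / sigma) ** 2)

# Candidate talker azimuths (degrees) and a uniform prior.
azimuth_grid = np.arange(-90, 91, 1.0)
prior = np.full(azimuth_grid.size, 1.0 / azimuth_grid.size)

# Hypothetical sensor readings: the microphone array is assumed noisier
# (wider sigma) than the video face detector in this example.
audio_estimate, audio_sigma = 32.0, 10.0   # beamforming estimate
video_estimate, video_sigma = 25.0, 4.0    # video-graphical estimate

# Posterior over position, assuming the modalities are conditionally
# independent given the true position (the naive fusion assumption).
posterior = (prior
             * gaussian_likelihood(azimuth_grid, audio_estimate, audio_sigma)
             * gaussian_likelihood(azimuth_grid, video_estimate, video_sigma))
posterior /= posterior.sum()

fused_azimuth = azimuth_grid[np.argmax(posterior)]
print(f"Fused talker azimuth estimate: {fused_azimuth:.1f} degrees")
```

A fuller Bayesian network would additionally model each sensor's reliability (for example, whether the camera currently sees a face), which is what lets the fused estimate degrade gracefully when one modality fails.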

Author(s):  
Niamat Ullah Ibne Hossain ◽  
Raed Jaradat ◽  
Seyedmohsen Hosseini ◽  
Mohammad Marufuzzaman ◽  
Randy K. Buchanan

Author(s):  
Yang Gao ◽  
Yincheng Jin ◽  
Seokmin Choi ◽  
Jiyang Li ◽  
Junjie Pan ◽  
...  

Accurate recognition of facial expressions and emotional gestures is a promising way to understand an audience's feedback on and engagement with entertainment content. Existing methods are primarily based on cameras or wearable sensors, which either raise privacy concerns or demand extra devices. To this end, we propose a novel ubiquitous sensing system based on a commodity microphone array, SonicFace, which provides an accessible, unobtrusive, contact-free, and privacy-preserving solution to monitor the user's emotional expressions continuously without playing any audible sound. SonicFace uses a speaker paired with a microphone array to recognize fine-grained facial expressions and emotional hand gestures from emitted ultrasound and the received echoes. In experimental evaluations, the accuracy of recognizing 6 common facial expressions and 4 emotional gestures reaches around 80%. In addition, extensive system evaluations with distinct configurations and an extended real-life case study demonstrate the robustness and generalizability of the proposed SonicFace system.
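The paper's exact signal chain is not given above; the sketch below only illustrates the general active acoustic sensing idea that SonicFace builds on: emit a near-ultrasonic chirp, matched-filter the received frame to obtain an echo profile, and classify that profile. The chirp band, frame length, synthetic echoes, and the toy nearest-centroid classifier are all assumed stand-ins for the real hardware and learning model.

```python
# Assumption-laden sketch of an active acoustic sensing pipeline: emit a
# near-ultrasonic chirp, correlate the received frame with the transmitted
# template, and classify the resulting echo profile.
import numpy as np
from scipy.signal import chirp, correlate

FS = 48_000                                  # common audio sample rate
t = np.arange(0, 0.01, 1 / FS)               # 10 ms transmit frame
template = chirp(t, f0=18_000, f1=22_000, t1=t[-1])   # near-ultrasonic sweep

def echo_profile(frame, template):
    """Matched-filter one received frame against the transmitted chirp and
    return normalized correlation magnitudes (a coarse 'echo profile')."""
    corr = correlate(frame, template, mode="valid")
    return np.abs(corr) / (np.linalg.norm(template) ** 2 + 1e-9)

def classify(profile, centroids):
    """Toy nearest-centroid classifier over labeled echo profiles."""
    labels = list(centroids)
    dists = [np.linalg.norm(profile - centroids[k]) for k in labels]
    return labels[int(np.argmin(dists))]

def synth_frame(echo_gain, delay_samples, rng):
    """Synthetic received frame: direct path plus a delayed, attenuated echo
    standing in for a facial-geometry change; real data would come from
    the microphone array."""
    frame = np.zeros(2 * template.size)
    frame[:template.size] += template                       # direct path
    frame[delay_samples:delay_samples + template.size] += echo_gain * template
    return frame + 0.01 * rng.standard_normal(frame.size)   # sensor noise

rng = np.random.default_rng(0)
centroids = {"neutral": echo_profile(synth_frame(0.2, 60, rng), template),
             "smile":   echo_profile(synth_frame(0.5, 90, rng), template)}
unknown = synth_frame(0.5, 90, rng)
print(classify(echo_profile(unknown, template), centroids))
```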


2021 ◽  
Vol 13 (2-3) ◽  
pp. 163-179 ◽  
Author(s):  
Adam Martin ◽  
Morten Büchert

Online collaboration between musicians became a rapidly developing practice in 2020, driven by a range of environmental, epidemiological and creative motivations. The technical facility to collaborate in a variety of formats exists via file-sharing services, video conferencing suites and specialist music services such as Splice and Auddly. Yet, given this proliferation of technologies, little attention has been paid to how creative musicians can most meaningfully use these new collaborative opportunities within their working practice. In this article, we share reflections from a case study of online music collaboration gained through our experience of facilitating three online songwriting camps with students from Leeds Conservatoire in the United Kingdom and the Rhythmic Music Conservatory in Denmark. The article focuses on the importance of managing roles, the impact of communication tools and the need for time management when collaborating online, before proposing a set of guidelines derived from this study to help enable productive online creative collaboration.


2018 ◽  
Vol 42 (3) ◽  
pp. 358-385 ◽  
Author(s):  
Natalie Todak ◽  
Michael D. White ◽  
Lisa M. Dario ◽  
Andrea R. Borrego

Objective: To provide guidance to criminologists for conducting experiments in light of two common discouraging factors: the belief that they are overly time-consuming and the belief that they can compromise the ethical principles of human subjects research. Method: We use a case study approach, based on a large-scale randomized controlled trial in which participants were exposed to a 5-s TASER shock, to describe how we overcame ethical, methodological, and logistical difficulties. Results: We derive four pieces of advice from our experience carrying out this experimental trial: (1) know your limitations, (2) employ pilot testing, (3) remain flexible and patient, and (4) “hold the line” to maintain the integrity of the research and the safety of human subjects. Conclusions: Criminologists have an obligation to provide the best possible evidence regarding the impact and consequences of criminal justice practices and programs. Experiments, considered by many to be the gold standard of empirical research methodologies, should be used whenever possible to fulfill this obligation.


2017 ◽  
Vol 20 (1) ◽  
pp. 107-114 ◽  
Author(s):  
Jennifer Kue ◽  
Laura A. Szalacha ◽  
Mary Beth Happ ◽  
Abigail L. Crisp ◽  
Usha Menon

2021 ◽  
Author(s):  
Vytautas Zalys

The emergence of digital technology not only encourages the development of new tools but also changes traditional approaches to solving problems. The sound, music, art, and color that dominated 20th-century forms of therapy are being replaced by integrated systems that, thanks to digital technology, bring many of these forms together. As the number of people with autism spectrum disorder (ASD) grows worldwide, such systems offer new opportunities for treating these disorders. The creation of such a system was chosen as the object of this research. The article presents an interactive tool for the education of children with ASD, built from audio, video, and computer technologies, and assesses its potential impact; the experimental research and its results are reported. The study aims to evaluate an interactive instrument developed for the education of such children. To ensure the interactivity of the process, stimulate all of the subject's senses, and develop the subject's ability to respond to the environment, a personalized audiovisual environment was created. Interactivity was provided by the EyeCon video-motion software together with a webcam, camcorders, a video projector, and a speaker system. The study was conducted with a single subject using a case study method. The impact of the instrument was established from a survey of the child's parents and the findings of childcare experts. The results demonstrated benefits for this child, including improved eye-hand coordination, longer concentration, and improved communication and emotional expression. They show that such interactive multi-sensory environments can supplement traditional methods in special and general education schools.
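EyeCon-style installations map camera-detected motion to audiovisual feedback. The sketch below illustrates only that general principle, under assumptions: simple frame differencing over a webcam image split into zones, with a print statement standing in for the sounds and projected visuals a real installation would trigger. The zone layout and thresholds are illustrative, not taken from the study.

```python
# Minimal sketch of the motion-to-feedback principle behind camera-based
# interactive environments: divide the webcam image into zones and trigger
# feedback when motion is detected in a zone.
import cv2
import numpy as np

MOTION_THRESHOLD = 25          # per-pixel difference treated as motion
ZONES = 3                      # vertical strips of the image

def active_zones(prev_gray, gray):
    """Return indices of image strips with enough frame-to-frame motion."""
    diff = cv2.absdiff(prev_gray, gray)
    motion = diff > MOTION_THRESHOLD
    strips = np.array_split(motion, ZONES, axis=1)
    return [i for i, strip in enumerate(strips) if strip.mean() > 0.02]

def run():
    cap = cv2.VideoCapture(0)              # default webcam
    ok, frame = cap.read()
    if not ok:
        return
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for zone in active_zones(prev_gray, gray):
            # A real installation would map each zone to a sound or a
            # projected visual; printing keeps the sketch self-contained.
            print(f"motion in zone {zone} -> trigger feedback cue {zone}")
        prev_gray = gray

if __name__ == "__main__":
    run()
```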

