Towards real-world neuroscience using mobile EEG and augmented reality

2021 ◽  
Author(s):  
Alexandra Krugliak ◽  
Alex Clarke

Abstract
Our visual environment impacts multiple aspects of cognition, including perception, attention and memory, yet most studies traditionally remove or control the external environment. As a result, we have a limited understanding of neurocognitive processes beyond the controlled lab environment. Here, we aim to study neural processes in real-world environments, while also maintaining a degree of control over perception. To achieve this, we combined mobile EEG (mEEG) and augmented reality (AR), which allows us to place virtual objects into the real world. We validated this AR and mEEG approach using a well-characterised cognitive response: the face inversion effect. Participants viewed upright and inverted faces in three EEG tasks: (1) a lab-based computer task, (2) walking through an indoor environment while seeing face photographs, and (3) walking through an indoor environment while seeing virtual faces. We find greater low-frequency EEG activity for inverted compared to upright faces in all experimental tasks, demonstrating that cognitively relevant signals can be extracted from mEEG and AR paradigms. This was established both in an epoch-based analysis aligned to face events and in a GLM-based approach that incorporates continuous EEG signals and face perception states. Together, this research helps pave the way to exploring neurocognitive processes in real-world environments while maintaining experimental control using AR.
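As a rough illustration of the two analysis strategies named in the abstract (epoching time-locked to face events, and a GLM fit to the continuous signal), the sketch below uses synthetic single-channel data; the sampling rate, event times, epoch window, and stimulus duration are placeholders, not the study's parameters.

```python
import numpy as np

fs = 250                                # sampling rate in Hz (assumed)
rng = np.random.default_rng(0)
eeg = rng.standard_normal(60 * fs)      # 60 s of continuous one-channel EEG (synthetic)

# (1) Epoch-based analysis: cut windows of -0.2 to +0.8 s around face onsets.
events = {"upright": [1000, 5000, 9000],
          "inverted": [3000, 7000, 11000]}   # event sample indices (made up)
pre, post = int(0.2 * fs), int(0.8 * fs)

def evoked(samples):
    """Average the EEG across epochs time-locked to the given event samples."""
    return np.stack([eeg[s - pre:s + post] for s in samples]).mean(axis=0)

erp_difference = evoked(events["inverted"]) - evoked(events["upright"])

# (2) GLM-based analysis: boxcar regressors mark when each face type is
# visible; ordinary least squares fits them to the continuous signal.
X = np.zeros((eeg.size, 2))
for j, cond in enumerate(("upright", "inverted")):
    for s in events[cond]:
        X[s:s + post, j] = 1.0          # assume each face stays visible ~0.8 s
betas, *_ = np.linalg.lstsq(X, eeg, rcond=None)
print(f"beta upright = {betas[0]:.3f}, beta inverted = {betas[1]:.3f}")
```

A real pipeline would work on multichannel, band-pass-filtered data (e.g. with MNE-Python), but the contrast logic is the same: a larger response to inverted faces shows up either as an epoch-average difference or as a larger inverted-condition beta.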

2013 ◽  
Vol 25 (3) ◽  
pp. 455-464 ◽  
Author(s):  
Thomas W. James ◽  
Lindsay R. Arcurio ◽  
Jason M. Gold

The face inversion effect has been used as a basis for claims about the specialization of face-related perceptual and neural processes. One of these claims is that the fusiform face area (FFA) is the site of face-specific feature-based and/or configural/holistic processes that are responsible for producing the face inversion effect. However, the studies on which these claims were based almost exclusively used stimulus manipulations of whole faces. Here, we tested inversion effects using single, discrete features and combinations of multiple discrete features, in addition to whole faces, using both behavioral and fMRI measurements. In agreement with previous studies, we found behavioral inversion effects with whole faces and no inversion effects with a single eye stimulus or the two eyes in combination. However, we also found behavioral inversion effects with feature combination stimuli that included features in the top and bottom halves (eyes-mouth and eyes-nose-mouth). Activation in the FFA showed an inversion effect for the whole-face stimulus only, which did not match the behavioral pattern. Instead, a pattern of activation consistent with the behavior was found in the bilateral inferior frontal gyrus, which is a component of the extended face-preferring network. The results appear inconsistent with claims that the FFA is the site of face-specific feature-based and/or configural/holistic processes that are responsible for producing the face inversion effect. They are more consistent with claims that the FFA shows a stimulus preference for whole upright faces.
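The dissociation the abstract reports, FFA tracking only the whole-face condition while bilateral IFG tracks the behavioral pattern, can be made concrete by correlating per-condition inversion-effect vectors. A minimal sketch follows; every number in it is an invented placeholder, not the paper's data.

```python
import numpy as np

conditions = ["eye", "two eyes", "eyes-mouth", "eyes-nose-mouth", "whole face"]
# Inversion effect = upright score minus inverted score, per condition.
behavior = np.array([0.01, 0.02, 0.10, 0.12, 0.15])   # e.g. accuracy difference
ffa      = np.array([0.00, 0.01, 0.01, 0.02, 0.30])   # e.g. BOLD difference
ifg      = np.array([0.02, 0.03, 0.11, 0.13, 0.16])

for name, roi in (("FFA", ffa), ("IFG", ifg)):
    r = np.corrcoef(behavior, roi)[0, 1]
    print(f"{name} vs behavior: r = {r:.2f}")
# A higher behavior correlation for IFG than for FFA would mirror the
# reported pattern: FFA prefers whole upright faces, IFG tracks the FIE.
```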


2010 ◽  
Vol 69 (3) ◽  
pp. 161-167 ◽  
Author(s):  
Jisien Yang ◽  
Adrian Schwaninger

Configural processing has been considered the major contributor to the face inversion effect (FIE) in face recognition. However, most researchers have obtained the FIE with only one specific ratio of configural alteration, so it remains unclear whether the ratio of configural alteration itself can mediate the occurrence of the FIE. We aimed to clarify this issue by manipulating the configural information parametrically, using six different ratios ranging from 4% to 24%. Participants judged whether a pair of faces was entirely identical or different. The faces to be compared were presented either simultaneously (Experiment 1) or sequentially (Experiment 2). Both experiments revealed that the FIE was observed only when the ratio of configural alteration was in the intermediate range. These results indicate that even though the FIE has frequently been adopted as an index of the mechanisms underlying face processing, its emergence is not robust to arbitrary configural alterations but depends on the ratio of the alteration.
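The analysis logic, estimating the FIE separately at each alteration ratio and asking where it emerges, can be sketched as below; the accuracy values and the detection threshold are hypothetical, chosen only to reproduce the intermediate-range pattern the abstract describes.

```python
import numpy as np

ratios   = np.array([4, 8, 12, 16, 20, 24])                 # % configural alteration
upright  = np.array([0.55, 0.68, 0.80, 0.88, 0.95, 0.97])   # placeholder accuracies
inverted = np.array([0.54, 0.60, 0.66, 0.74, 0.92, 0.96])

fie = upright - inverted                                    # inversion effect per ratio
for r, effect in zip(ratios, fie):
    flag = "FIE" if effect > 0.05 else "-"                  # arbitrary illustrative cutoff
    print(f"{r:2d}% alteration: upright - inverted = {effect:+.2f}  {flag}")
# With these numbers the effect appears only at intermediate ratios: small
# alterations leave both orientations near chance, large ones near ceiling.
```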


2019 ◽  
Vol 2019 (1) ◽  
pp. 237-242
Author(s):  
Siyuan Chen ◽  
Minchen Wei

Color appearance models have been extensively studied for characterizing and predicting the perceived color appearance of physical color stimuli under different viewing conditions. These stimuli are either surface colors reflecting illumination or self-luminous stimuli emitting radiation. With the rapid development of augmented reality (AR) and mixed reality (MR), it is critically important to understand how the color appearance of objects produced by AR and MR is perceived, especially when these objects are overlaid on the real world. In this study, nine lighting conditions, with different correlated color temperature (CCT) and light levels, were created in a real-world environment. Under each lighting condition, human observers adjusted the color appearance of a virtual stimulus, which was overlaid on the real-world luminous environment, until it appeared the whitest. The CCT and light level of the real-world environment significantly affected the color appearance of the white stimulus, especially when the light level was high. Moreover, a lower degree of chromatic adaptation was found when viewing the virtual stimulus overlaid on the real world.
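The "degree of chromatic adaptation" the study estimates can be illustrated with a simplified von Kries-style transform in CAT02 cone space, where a parameter D interpolates between no adaptation (D = 0) and complete adaptation (D = 1). The matrix is the standard CAT02 matrix; the stimulus and the warm source white below are assumed values for illustration, not the study's data.

```python
import numpy as np

# Standard CAT02 matrix mapping XYZ to sharpened cone responses.
M_CAT02 = np.array([[ 0.7328, 0.4296, -0.1624],
                    [-0.7036, 1.6975,  0.0061],
                    [ 0.0030, 0.0136,  0.9834]])

def adapt(xyz, white_src, white_dst, D):
    """Adapt an XYZ stimulus from a source to a destination white, degree D."""
    lms = M_CAT02 @ xyz
    ws, wd = M_CAT02 @ white_src, M_CAT02 @ white_dst
    gain = D * (wd / ws) + (1.0 - D)    # partial von Kries scaling
    return np.linalg.inv(M_CAT02) @ (gain * lms)

stimulus = np.array([50.0, 52.0, 49.0])     # placeholder XYZ of a near-white patch
warm     = np.array([109.85, 100.0, 35.58]) # illuminant A white (low-CCT condition)
d65      = np.array([95.047, 100.0, 108.883])
print(adapt(stimulus, warm, d65, D=0.6))    # lower D -> less complete adaptation
```

A lower fitted D for AR stimuli, as the study reports, means observers discount the ambient illumination less fully for virtual objects than standard models assume for physical surfaces.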


2018 ◽  
Author(s):  
Kyle Plunkett

This manuscript provides two demonstrations of how Augmented Reality (AR), the projection of virtual information onto a real-world object, can be applied in the classroom and in the laboratory. Using only a smartphone and the free HP Reveal app, content-rich AR notecards were prepared. The physical notecards are based on Organic Chemistry I reactions and show only a reagent and substrate. Upon interacting with the HP Reveal app, an AR video projection shows the product of the reaction as well as a real-time, hand-drawn curved-arrow mechanism of how the product is formed. Thirty AR notecards based on common Organic Chemistry I reactions and mechanisms are provided in the Supporting Information and are available for widespread use. In addition, the HP Reveal app was used to create AR video projections onto laboratory instrumentation so that a virtual expert can guide the user during equipment setup and operation.


10.28945/2207 ◽  
2015 ◽  
Vol 10 ◽  
pp. 021-035 ◽  
Author(s):  
Yan Lu ◽  
Joseph T. Chao ◽  
Kevin R. Parker

This project takes a creative approach to the familiar scavenger hunt game. It involved the implementation of an iPhone application, HUNT, with Augmented Reality (AR) capability for users to play the game, as well as an administrative website that game organizers can use to create games and make them available to players. Using the HUNT mobile app, users first select from a list of games and are then shown a list of objects that they must seek. Once the user finds a candidate object and scans it with the smartphone's built-in camera, the application attempts to verify that it is the correct object and, if so, displays the associated multimedia AR content, such as images and videos overlaid on top of real-world views. HUNT not only provides entertaining activities within an environment that players can explore, but its AR content can also serve as an educational tool. The project is designed to increase user involvement by using a familiar and enjoyable game as a basis and adding an educational dimension by incorporating AR technology and engaging, interactive multimedia that provides users with facts about the objects they have located.
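The scan-and-verify flow described above can be sketched in Python; the type names, marker identifiers, and lookup-based matching step are hypothetical stand-ins, not HUNT's actual implementation (which would rely on an image-recognition service rather than literal marker IDs).

```python
from dataclasses import dataclass, field

@dataclass
class Target:
    name: str
    marker_id: str                 # identifier the image recognizer would return
    overlay_media: list = field(default_factory=list)  # AR images/videos to show

@dataclass
class Game:
    title: str
    targets: dict                  # marker_id -> Target, authored via the admin site

def verify_scan(game: Game, recognized_marker: str):
    """Return the AR overlay content for a correctly scanned object, else None."""
    target = game.targets.get(recognized_marker)
    return target.overlay_media if target else None

campus = Game("Campus Hunt", {
    "m-001": Target("Sundial", "m-001", ["sundial_history.mp4"]),
    "m-002": Target("Library mural", "m-002", ["mural_facts.png"]),
})
print(verify_scan(campus, "m-001"))   # -> ['sundial_history.mp4']
print(verify_scan(campus, "m-999"))   # -> None (wrong object, keep hunting)
```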

