blink detection
Recently Published Documents

TOTAL DOCUMENTS: 146 (five years: 50)
H-INDEX: 12 (five years: 2)

2021, Vol 10 (6), pp. 3032-3041
Author(s): Norasyimah Sahat, Afishah Alias, Fouziah Md Yassin

An integrated wheelchair controlled by human brainwaves through a brain-computer interface (BCI) system was designed to help disabled people. The invention aims to improve the development of an integrated wheelchair using a BCI system that depends on an individual's brain attention level. An electroencephalography (EEG) device called MindWave Mobile Plus (MW+) was employed to obtain the attention value for wheelchair movement, with eye blinks used to switch the wheelchair's mode among forward (F), right (R), backward (B) and left (L). Stop mode (S) is selected by an eyebrow movement, which produces a signal quality value of 26 or 51. The development of this brainwave-controlled wheelchair for helping paralyzed patients demonstrates its efficiency, improved through the combination of the human attention value, eye blink detection and eyebrow movement. In addition, the human attention value was analyzed across gender and age categories to improve the accuracy of the brainwave-integrated wheelchair. The resulting attention thresholds are 60 for male children, 70 for male teenagers and 40 for male adults, and 50 for female children, 50 for female teenagers and 30 for female adults.
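As a rough illustration of the control scheme described in this abstract, the mode switching and attention thresholding could be sketched as a small state machine. The class and method names below are hypothetical, the threshold table is reconstructed from the abstract, and the choice that a blink also clears stop mode is our assumption, not the paper's:

```python
MODES = ["F", "R", "B", "L"]  # forward, right, backward, left

# Attention thresholds per gender/age group, as reported in the abstract.
THRESHOLDS = {
    ("male", "child"): 60, ("male", "teen"): 70, ("male", "adult"): 40,
    ("female", "child"): 50, ("female", "teen"): 50, ("female", "adult"): 30,
}

class WheelchairController:
    def __init__(self, gender, age_group):
        self.threshold = THRESHOLDS[(gender, age_group)]
        self.mode_index = 0      # start in forward mode
        self.stopped = False

    def on_blink(self):
        # An eye blink cycles to the next movement mode (F -> R -> B -> L).
        # Clearing the stop flag here is our assumption for illustration.
        self.mode_index = (self.mode_index + 1) % len(MODES)
        self.stopped = False

    def on_eyebrow(self, signal_quality):
        # Eyebrow movement degrades the reported signal quality to 26 or 51,
        # which the system interprets as a stop command.
        if signal_quality in (26, 51):
            self.stopped = True

    def update(self, attention):
        # Drive in the current mode only while attention exceeds the threshold.
        if self.stopped or attention < self.threshold:
            return "S"
        return MODES[self.mode_index]
```

For example, a male adult (threshold 40) with attention 50 drives forward, a blink switches to right, and an eyebrow movement producing signal quality 26 stops the chair regardless of attention.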


Author(s): Sree Haran A, Siyam Adit G, Vignesh N, Vimal Athitha S G, Subash Sakthivel S, ...

2021
Author(s): Rahul Dasharath Gavas, Somnath Karmakar, Debatri Chatterjee, Ramesh Kumar Ramakrishnan, Arpan Pal

2021, pp. 116073
Author(s): Paulo Augusto de Lima Medeiros, Gabriel Vinícius Souza da Silva, Felipe Ricardo dos Santos Fernandes, Ignacio Sánchez-Gendriz, Hertz Wilton Castro Lins, ...

Author(s): Patrick Chwalek, David Ramsay, Joseph A. Paradiso

We present Captivates, an open-source smartglasses system designed for long-term, in-the-wild psychophysiological monitoring at scale. Captivates integrate many underutilized physiological sensors in a streamlined package, including temple and nose temperature measurement, blink detection, head motion tracking, activity classification, 3D localization, and head pose estimation. Captivates were designed with an emphasis on: (1) manufacturing and scalability, so we can easily support large-scale user studies ourselves and offer the platform as a generalized tool for ambulatory psychophysiology research; (2) robustness and battery life, so long-term studies yield trustworthy data across an individual's entire day in natural environments without supervision or recharging; and (3) aesthetics and comfort, so people can wear them in their normal daily contexts without self-consciousness or changes in behavior. Captivates are intended to enable large-scale data collection without altering user behavior. We validate that our sensors capture useful data robustly for a small set of beta testers. We also show that our additional effort on aesthetics was imperative to meeting our goals; namely, earlier versions of our prototype made people too uncomfortable to interact naturally in public, and our additional design and miniaturization effort made a significant impact in preserving natural behavior. There is tremendous promise in translating psychophysiological laboratory techniques into real-world insight. Captivates serve as an open-source bridge to this end. Paired with an accurate underlying model, Captivates will be able to quantify the long-term psychological impact of our design decisions and provide real-time feedback for technologists interested in actuating a cognitively adaptive, user-aligned future.


2021, Vol 5 (9), pp. 50
Author(s): Wenping Luo, Jianting Cao, Kousuke Ishikawa, Dongying Ju

This paper presents a practical human-computer interaction system for wheelchair motion control through eye tracking and eye blink detection. In this system, the pupil is extracted from the eye image after binarization, and the center of the pupil is localized to capture the trajectory of eye movement and determine the direction of eye gaze. Meanwhile, convolutional neural networks were built for feature extraction and classification of open-eye and closed-eye images, and the networks were trained on features extracted from multiple images of open-eye and closed-eye states. As an application of this human-computer interaction control system, experimental validation was carried out on a modified wheelchair, and the experimental results show the proposed method to be effective and reliable.
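The pupil-localization step described here (binarize the eye image, then locate the dark region's center) can be sketched in a few lines. This is a toy illustration on a synthetic image, not the authors' code; the threshold value and image are made up for the example:

```python
# Minimal sketch of pupil localization by binarization + centroid.
# Pixels darker than `threshold` are treated as pupil; their centroid
# approximates the pupil center used to track gaze direction.

def pupil_center(image, threshold=50):
    """image: 2D list of grayscale values (0 = dark, 255 = bright).
    Returns the (row, col) centroid of pixels darker than `threshold`,
    or None if no pixel is classified as pupil."""
    rows = cols = count = 0
    for r, line in enumerate(image):
        for c, v in enumerate(line):
            if v < threshold:   # binarization step
                rows += r
                cols += c
                count += 1
    if count == 0:
        return None
    return rows / count, cols / count

# Toy 5x5 "eye image": bright background with a dark 2x2 pupil.
eye = [[200] * 5 for _ in range(5)]
for r in (2, 3):
    for c in (2, 3):
        eye[r][c] = 10

print(pupil_center(eye))  # -> (2.5, 2.5)
```

In a real pipeline the same idea would typically run on a thresholded camera frame (e.g. via OpenCV), with the centroid tracked over frames to recover the gaze trajectory.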


Sensors, 2021, Vol 21 (14), pp. 4895
Author(s): Thanh-Vinh Nguyen, Masaaki Ichiki

This paper reports on a mask-type sensor that uses a single sensing element for simultaneous pulse wave and respiration measurement and eye blink detection. In the proposed sensor, the sensing element is a flexible airbag-shaped chamber whose inner pressure change is measured by a microelectromechanical-systems (MEMS)-based piezoresistive cantilever. The airbag-shaped chamber is fabricated by wrapping a sponge pad with plastic film and polyimide tape. The polyimide tape has a hole to which the substrate carrying the piezoresistive cantilever adheres. By attaching the sensor device to a mask where it contacts the subject's nose, the sensor can detect the subject's pulses and eye blinks from the vibration and displacement of the nose skin caused by these physiological activities. Moreover, the subject's respiration causes pressure changes in the space between the mask and the face as well as slight vibrations of the mask. Therefore, information about the subject's respiration can be extracted from the sensor signal using either the low-frequency component (<1 Hz) or the high-frequency component (>100 Hz). This paper describes the sensor fabrication and demonstrates pulse wave and respiration measurement as well as eye blink detection using the fabricated sensor.
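The idea of recovering respiration from the low-frequency (<1 Hz) component of a single sensor signal can be illustrated with a simple frequency split. This is not the authors' processing chain: the sampling rate and signal frequencies are assumed, and a crude moving average stands in for a proper low-pass filter:

```python
import math

FS = 100                                   # assumed sampling rate, Hz
t = [n / FS for n in range(10 * FS)]       # 10 s of samples

# Synthetic raw signal: 0.25 Hz "respiration" plus 1.2 Hz "pulse" ripple.
raw = [math.sin(2 * math.pi * 0.25 * ti)
       + 0.3 * math.sin(2 * math.pi * 1.2 * ti) for ti in t]

def moving_average(x, window):
    """Causal moving average: a crude low-pass filter."""
    out = []
    for i in range(len(x)):
        lo = max(0, i - window + 1)
        out.append(sum(x[lo:i + 1]) / (i + 1 - lo))
    return out

# Averaging over 1 s strongly attenuates the 1.2 Hz pulse ripple while
# passing the 0.25 Hz respiration component almost unchanged, leaving a
# signal dominated by respiration.
resp = moving_average(raw, FS)
```

In practice one would use a proper filter design (e.g. a Butterworth low-pass for the <1 Hz band and a high-pass for the >100 Hz band the paper mentions), but the band-splitting principle is the same.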


Author(s): Jialin Liu, Dong Li, Lei Wang, Jie Xiong

Eye blink detection plays a key role in many real-life applications such as human-computer interaction (HCI), drowsy-driving prevention and eye disease detection. Although traditional camera-based techniques are promising, multiple issues hinder their wide adoption, including privacy concerns, strict lighting conditions and line-of-sight (LoS) requirements. On the other hand, wireless sensing without dedicated sensors has gained tremendous attention in recent years. Among the wireless signals utilized for sensing, acoustic signals show unique potential for fine-grained sensing owing to their low propagation speed in air. Another trend favoring acoustic sensing is the wide availability of speakers and microphones in commodity devices. Promising progress has been achieved in fine-grained human motion sensing, such as breathing monitoring, using acoustic signals. However, it is still very challenging to employ acoustic signals for eye blink detection due to the unique characteristics of eye blinks (i.e., subtle, sparse and aperiodic) and severe interference (i.e., from the human target and surrounding objects). We find that even the very subtle involuntary head movement induced by breathing can severely interfere with eye blink detection. In this work, for the first time, we propose a system called BlinkListener to sense the subtle eye blink motion using acoustic signals in a contact-free manner. We first quantitatively model the relationship between signal variation and the subtle movements caused by eye blinks and interference. Then, we propose a novel method that exploits the "harmful" interference to maximize the subtle signal variation induced by eye blinks. We implement BlinkListener on both a research platform (Bela) and a commodity smartphone (iPhone 5c). Experimental results show that BlinkListener achieves robust performance with a median detection accuracy of 95%. It maintains high accuracy when the smartphone is held in hand, when the target wears glasses or sunglasses, and in the presence of strong interference from people moving around.
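To make the detection problem concrete, here is a heavily simplified illustration, not BlinkListener's actual algorithm: treat a blink as a brief burst of extra variance riding on a slow breathing-induced drift. Subtracting each analysis window's own mean discards the slow drift, so windows whose variance far exceeds the median stand out as candidate blinks. The window length, threshold factor and synthetic signal below are all assumptions made for the example:

```python
import math
import statistics

def detect_bursts(signal, fs, win_s=0.2, k=10.0):
    """Return start indices of non-overlapping windows whose variance
    exceeds k times the median window variance. Per-window mean removal
    suppresses slow drift (e.g. breathing-induced head movement)."""
    w = max(1, int(fs * win_s))
    variances = []
    for start in range(0, len(signal) - w + 1, w):
        chunk = signal[start:start + w]
        mean = sum(chunk) / w
        variances.append(sum((x - mean) ** 2 for x in chunk) / w)
    med = statistics.median(variances)
    return [i * w for i, v in enumerate(variances) if v > k * max(med, 1e-12)]

# Synthetic trace: slow 0.2 Hz "breathing" drift plus one 0.2 s
# high-frequency burst (a stand-in for a blink) starting at t = 3 s.
fs = 100
sig = [math.sin(2 * math.pi * 0.2 * i / fs) for i in range(500)]
for i in range(300, 320):
    sig[i] += 0.4 * math.sin(2 * math.pi * 25 * i / fs)

detections = detect_bursts(sig, fs)  # -> [300]
```

A real acoustic pipeline works on phase/amplitude variations of a transmitted tone and, per the abstract, deliberately exploits the interference rather than merely suppressing it; this sketch only shows why a blink's subtle, sparse burst is separable from slow drift at all.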

