Real-time parallel and cooperative recognition of facial images for an interactive visual human interface

Author(s):  
O. Hasegawa ◽  
E. Yokosawa ◽  
M. Ishizuka
Author(s):  
Sultan Irsyad Rama Putra

One way to reduce transmission of Coronavirus Disease 2019 (COVID-19) is to screen body temperature and mask use. Temperature and mask detection are performed to help identify potential COVID-19 cases, but checking temperature and masks manually can expose staff to infection from the people being screened. The aim of this study is to design a human interface that detects body temperature and mask use simultaneously, automatically, and in real time. Temperature is measured with an AMG8833 sensor; the readings acquired through the Arduino IDE are sent over a serial link to a MATLAB program, which displays the detected temperature in the designed interface, while a webcam is used to determine whether the subject is wearing a mask. Tests of the human-interface design at a detection distance of 50 cm show that it displays the detected temperature with a maximum temperature error of 1.1 °C and detects a face looking at the webcam for mask detection, presenting both results in the human interface simultaneously and in real time; a speaker is activated as a warning when the temperature exceeds 37 °C or no mask is worn.
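
A minimal sketch of the temperature-alert side of such a setup, assuming the Arduino streams each 8x8 AMG8833 frame as a comma-separated line over serial; the port name, baud rate, and frame format are illustrative assumptions, and the MATLAB display and mask-detection parts are not shown.

```python
# Hedged sketch: read an 8x8 AMG8833 thermal frame streamed over serial by an
# Arduino and flag readings above the 37 degC alert threshold from the abstract.
# Port name, baud rate, and the comma-separated frame format are assumptions.
import serial  # pyserial

FEVER_THRESHOLD_C = 37.0

def read_frame(port="COM3", baud=115200):
    """Read one 64-value thermal frame sent as a comma-separated line."""
    with serial.Serial(port, baud, timeout=2) as ser:
        line = ser.readline().decode("ascii", errors="ignore").strip()
    return [float(v) for v in line.split(",") if v]  # pixel temperatures in degC

def check_fever(frame):
    """Return the peak temperature and whether it exceeds the alert threshold."""
    peak = max(frame)
    return peak, peak > FEVER_THRESHOLD_C

if __name__ == "__main__":
    frame = read_frame()
    peak, alert = check_fever(frame)
    print(f"Peak temperature: {peak:.1f} degC, alert: {alert}")
```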


2021 ◽  
pp. 275-284
Author(s):  
Giovanna Castellano ◽  
Berardina De Carolis ◽  
Nicola Marvulli ◽  
Mauro Sciancalepore ◽  
Gennaro Vessio

Sensors ◽  
2021 ◽  
Vol 21 (6) ◽  
pp. 2026
Author(s):  
Jung Hwan Kim ◽  
Alwin Poulose ◽  
Dong Seog Han

Facial emotion recognition (FER) systems play a significant role in identifying driver emotions. Accurate facial emotion recognition of drivers in autonomous vehicles can reduce road rage. However, training even an advanced FER model without a proper dataset leads to poor performance in real-time testing. FER system performance is affected more heavily by the quality of the datasets than by the quality of the algorithms. To improve FER system performance for autonomous vehicles, we propose a facial image threshing (FIT) machine that uses the advanced features of pre-trained facial recognition together with training on the Xception algorithm. The FIT machine involves removing irrelevant facial images, collecting facial images, correcting misplaced face data, and merging the original datasets on a massive scale, in addition to data augmentation. The proposed method improved validation accuracy by 16.95% over the conventional approach on the FER 2013 dataset. A confusion-matrix evaluation on an unseen private dataset shows a 5% improvement over the original approach with the FER 2013 dataset, confirming the real-time performance.
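
A hedged sketch of the dataset-cleaning idea behind a FIT-style pipeline: keep only images in which a pre-trained face detector actually finds a face, then fine-tune an Xception backbone on the cleaned set. The OpenCV Haar cascade, directory layout, and 96x96 input size are assumptions, not the authors' exact configuration.

```python
# Sketch of FIT-style dataset cleaning plus an Xception-based FER model.
# The detector, folder layout, and input size are assumptions for illustration.
import os
import cv2
import tensorflow as tf

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def filter_face_images(src_dir, dst_dir):
    """Keep only images that contain a detectable face (drops irrelevant samples)."""
    os.makedirs(dst_dir, exist_ok=True)
    for name in os.listdir(src_dir):
        img = cv2.imread(os.path.join(src_dir, name), cv2.IMREAD_GRAYSCALE)
        if img is None:
            continue
        faces = detector.detectMultiScale(img, scaleFactor=1.1, minNeighbors=5)
        if len(faces) > 0:
            cv2.imwrite(os.path.join(dst_dir, name), img)

def build_fer_model(num_classes=7, input_size=96):
    """Xception backbone with a small emotion-classification head."""
    base = tf.keras.applications.Xception(
        include_top=False, weights="imagenet",
        input_shape=(input_size, input_size, 3), pooling="avg")
    out = tf.keras.layers.Dense(num_classes, activation="softmax")(base.output)
    model = tf.keras.Model(base.input, out)
    model.compile(optimizer="adam", loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```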


mSphere ◽  
2020 ◽  
Vol 5 (1) ◽  
Author(s):  
Benjamin L. Rambo-Martin ◽  
Matthew W. Keller ◽  
Malania M. Wilson ◽  
Jacqueline M. Nolting ◽  
Tavis K. Anderson ◽  
...  

ABSTRACT While working overnight at a swine exhibition, we identified an influenza A virus (IAV) outbreak in swine, Nanopore sequenced 13 IAV genomes from samples we collected, and predicted in real time that these viruses posed a novel risk to humans due to genetic mismatches between the viruses and current prepandemic candidate vaccine viruses (CVVs). We developed and used a portable IAV sequencing and analysis platform called Mia (Mobile Influenza Analysis) to complete and characterize full-length consensus genomes approximately 18 h after unpacking the mobile lab. Exhibition swine are a known source for zoonotic transmission of IAV to humans and pose a potential pandemic risk. Genomic analyses of IAV in swine are critical to understanding this risk, the types of viruses circulating in swine, and whether current vaccines developed for use in humans would be predicted to provide immune protection. Nanopore sequencing technology has enabled genome sequencing in the field at the source of viral outbreaks or at the bedside or pen-side of infected humans and animals. The acquired data, however, have not yet demonstrated real-time, actionable public health responses. The Mia system rapidly identified three genetically distinct swine IAV lineages from three subtypes, A(H1N1), A(H3N2), and A(H1N2). Analysis of the hemagglutinin (HA) sequences of the A(H1N2) viruses identified >30 amino acid differences between the HA1 of these viruses and the most closely related CVV. As an exercise in pandemic preparedness, all sequences were emailed to CDC collaborators who initiated the development of a synthetically derived CVV. IMPORTANCE Swine are influenza virus reservoirs that have caused outbreaks and pandemics. Genomic characterization of these viruses enables pandemic risk assessment and vaccine comparisons, though this typically occurs after a novel swine virus jumps into humans. The greatest risk occurs where large groups of swine and humans comingle. At a large swine exhibition, we used Nanopore sequencing and on-site analytics to interpret 13 swine influenza virus genomes and identified an influenza virus cluster that was genetically mismatched to currently available vaccines. As part of the National Strategy for Pandemic Preparedness exercises, the sequences were emailed to colleagues at the CDC who initiated the development of a synthetically derived vaccine designed to match the viruses at the exhibition. Subsequently, this virus caused 14 infections in humans and was the dominant U.S. variant virus in 2018.
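
The HA1 comparison step can be illustrated with a small sketch that counts amino acid differences between an aligned outbreak HA1 sequence and a candidate vaccine virus HA1; the placeholder peptides below are illustrative, not the actual viral sequences.

```python
# Hedged illustration of the HA1 comparison: count amino acid differences
# between an outbreak HA1 and a candidate vaccine virus (CVV) HA1.
# Sequences are assumed to be pre-aligned (equal length); gaps '-' are ignored.
def count_aa_differences(query_ha1: str, cvv_ha1: str) -> int:
    """Count mismatched positions in two aligned amino acid sequences."""
    if len(query_ha1) != len(cvv_ha1):
        raise ValueError("sequences must be aligned to the same length")
    return sum(1 for a, b in zip(query_ha1, cvv_ha1)
               if a != b and a != "-" and b != "-")

# Example with short placeholder peptides, not real HA1 sequences.
print(count_aa_differences("MKTIIALSYI", "MKTVIALSHI"))  # -> 2
```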


2005 ◽  
Vol 2 (2) ◽  
pp. 97-102 ◽  
Author(s):  
C. DaSalla ◽  
J. Kim ◽  
Y. Koike

The aim of this paper is to design a human-interface system that controls a robot using EMG signals elicited by various wrist movements. The EMG signals are normalized based on joint torque, and a three-layer neural network is used to estimate the posture of the wrist and forearm from them. After training the neural network and obtaining appropriate weights, the subject was able to control the robot in real time using wrist and forearm movements.
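
A rough sketch of a three-layer feed-forward mapping from normalized EMG channels to wrist/forearm posture; the channel count, hidden size, and angle parameterization are assumptions rather than the paper's exact architecture or training data.

```python
# Hedged sketch: small three-layer network (input, hidden, output) mapping
# normalized EMG channels to wrist/forearm joint angles. Sizes are assumptions.
import numpy as np
import tensorflow as tf

N_EMG_CHANNELS = 8   # assumed number of surface EMG electrodes
N_JOINT_ANGLES = 3   # e.g. wrist flexion/extension, deviation, forearm rotation

model = tf.keras.Sequential([
    tf.keras.Input(shape=(N_EMG_CHANNELS,)),
    tf.keras.layers.Dense(32, activation="tanh"),  # hidden layer
    tf.keras.layers.Dense(N_JOINT_ANGLES)          # linear output: joint angles
])
model.compile(optimizer="adam", loss="mse")

# Toy data standing in for normalized EMG -> posture pairs.
emg = np.random.rand(256, N_EMG_CHANNELS).astype("float32")
angles = np.random.rand(256, N_JOINT_ANGLES).astype("float32")
model.fit(emg, angles, epochs=5, verbose=0)
print(model.predict(emg[:1]))  # estimated posture for one EMG sample
```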


Author(s):  
Jeffrey Berkley ◽  
Mark Ganter ◽  
Suzanne Weghorst ◽  
Hayes Gladstone ◽  
Gregory Raugi ◽  
...  

Abstract This paper presents preliminary results for a new real-time finite element system that supports haptic (i.e., force) feedback to the user. The methodology of the system is based on linear finite element analysis. This system was originally developed as part of a real-time skin-surgery simulator in collaboration with the Human Interface Technology Lab and the Division of Dermatology at the University of Washington Medical School. We are currently exploring its use and development as a new engineering design tool.
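
A common way to make linear FE fast enough for haptics is to precompute a factorization of the stiffness matrix offline and reuse it at haptic rates; the toy 1D spring-chain below illustrates that idea under assumed mesh and stiffness values, not the simulator's actual tissue model.

```python
# Hedged sketch: precompute the stiffness factorization offline, then at haptic
# rates return the reaction force when the tool prescribes a contact displacement.
import numpy as np

N_NODES = 6
K_SPRING = 500.0  # N/m per spring element (assumed)

# Offline: assemble the global stiffness matrix for a chain of linear springs.
K = np.zeros((N_NODES, N_NODES))
for e in range(N_NODES - 1):
    K[e:e + 2, e:e + 2] += K_SPRING * np.array([[1.0, -1.0], [-1.0, 1.0]])

fixed = 0               # clamped boundary node
contact = N_NODES - 1   # node touched by the haptic tool
free = [i for i in range(N_NODES) if i not in (fixed, contact)]

# Offline: factor the free-free block once (a precomputed inverse here).
K_ff_inv = np.linalg.inv(K[np.ix_(free, free)])
K_fc = K[free, contact]  # coupling between free and contact DOFs

def contact_force(prescribed_disp: float) -> float:
    """Reaction force at the contact node for a prescribed tool displacement."""
    u_free = K_ff_inv @ (-K_fc * prescribed_disp)  # solve K_ff u_f = -K_fc d
    return float(K[contact, free] @ u_free + K[contact, contact] * prescribed_disp)

print(contact_force(0.01))  # ~1.0 N for a 1 cm indentation of the 5-spring chain
```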

