breathing motion
Recently Published Documents


TOTAL DOCUMENTS: 239 (five years: 37)

H-INDEX: 23 (five years: 2)

Author(s): Alexander Curtiss, Blaine Rothrock, Abu Bakar, Nivedita Arora, Jason Huang, et al.

The COVID-19 pandemic has dramatically increased the use of face masks across the world. Aside from physical distancing, they are among the most effective forms of protection for healthcare workers and the general population. Face masks are passive devices, however, and cannot alert the user in case of improper fit or mask degradation. Additionally, face masks are optimally positioned to give unique insight into some personal health metrics. Recognizing this limitation and opportunity, we present FaceBit: an open-source research platform for smart face mask applications. FaceBit's design was informed by needfinding studies with a cohort of health professionals. Small and easily secured into any face mask, FaceBit is accompanied by a mobile application that provides a user interface and facilitates research. It monitors heart rate without skin contact via ballistocardiography, respiration rate via temperature changes, and mask fit and wear time from pressure signals, all on-device with an energy-efficient runtime system. FaceBit can harvest energy from breathing, motion, or sunlight to supplement its tiny primary cell battery, which alone delivers a lifetime of 11 days or more. FaceBit empowers the mobile computing community to jumpstart research in smart face mask sensing and inference, and provides a sustainable, convenient form factor for health management, applicable to COVID-19 frontline workers and beyond.
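As a rough illustration of one sensing modality described above (respiration rate from temperature changes), the Python sketch below counts breathing oscillations in a sampled in-mask temperature trace with simple peak detection. It is not FaceBit's published pipeline; the function name, sampling rate, smoothing window, and thresholds are hypothetical choices.

```python
import numpy as np
from scipy.signal import find_peaks

def respiration_rate_from_temperature(temp_c, fs_hz):
    """Estimate breaths per minute from an in-mask temperature trace.

    Exhaled air warms the sensor and inhaled air cools it, so each breath
    appears as one temperature oscillation. The trace is detrended with a
    ~5 s moving average, then peaks at least one second apart are counted.
    """
    temp_c = np.asarray(temp_c, dtype=float)
    win = int(5 * fs_hz)
    trend = np.convolve(temp_c, np.ones(win) / win, mode="same")
    detrended = temp_c - trend
    peaks, _ = find_peaks(detrended, distance=int(1.0 * fs_hz), prominence=0.05)
    duration_min = len(temp_c) / fs_hz / 60.0
    return len(peaks) / duration_min

# Example: a synthetic 0.25 Hz (15 breaths/min) oscillation with slow drift
fs = 10.0
t = np.arange(0, 60, 1 / fs)
synthetic = 32 + 0.5 * np.sin(2 * np.pi * 0.25 * t) + 0.02 * t
print(respiration_rate_from_temperature(synthetic, fs))  # roughly 15
```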


Author(s): Bruno Madore, Gabriela Belsley, Cheng-Chieh Cheng, Frank Preiswerk, Marie Foley Kijewski, et al.

Abstract Breathing motion can displace internal organs by up to several cm; as such, it is a primary factor limiting image quality in medical imaging. Motion can also complicate matters when trying to fuse images from different modalities, acquired at different locations and/or on different days. Currently available devices for monitoring breathing motion often do so indirectly, by detecting changes in the outline of the torso rather than the internal motion itself, and these devices are often fixed to floors, ceilings or walls, and thus cannot accompany patients from one location to another. We have developed small ultrasound-based sensors, referred to as ‘organ configuration motion’ (OCM) sensors, that attach to the skin and provide rich motion-sensitive information. In the present work we tested the ability of OCM sensors to enable respiratory gating during in vivo PET imaging. A motion phantom involving an FDG solution was assembled, and two cancer patients scheduled for a clinical PET/CT exam were recruited for this study. OCM signals were used to help reconstruct phantom and in vivo data into time series of motion-resolved images. As expected, the motion-resolved images captured the underlying motion. In Patient #1, a single large lesion proved to be mostly stationary through the breathing cycle. However, in Patient #2, several small lesions were mobile during breathing, and our proposed new approach captured their breathing-related displacements. In summary, a relatively inexpensive hardware solution was developed here for respiration monitoring. Because the proposed sensors attach to the skin, as opposed to walls or ceilings, they can accompany patients from one procedure to the next, potentially allowing data gathered in different places and at different times to be combined and compared in ways that account for breathing motion.
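For readers unfamiliar with respiratory gating, the sketch below shows the general idea in Python: a motion-sensitive surrogate signal (such as an OCM-like trace) is converted to instantaneous phase, and list-mode event timestamps are assigned to phase bins that can each be reconstructed into their own image. This is a generic illustration, not the authors' reconstruction pipeline; the function name, bin count, and signal handling are assumptions.

```python
import numpy as np
from scipy.signal import hilbert

def gate_events_by_phase(surrogate, fs_hz, event_times_s, n_bins=6):
    """Assign list-mode event timestamps to respiratory phase bins.

    `surrogate` is a motion-sensitive trace sampled at `fs_hz`. Its
    analytic-signal phase labels each event with a bin in [0, n_bins);
    each bin's events can then be reconstructed into a separate,
    motion-resolved image.
    """
    phase = np.angle(hilbert(surrogate - np.mean(surrogate)))  # -pi..pi
    phase_frac = (phase + np.pi) / (2.0 * np.pi)               # 0..1
    idx = np.clip((np.asarray(event_times_s) * fs_hz).astype(int),
                  0, len(surrogate) - 1)
    bins = (phase_frac[idx] * n_bins).astype(int)
    return np.minimum(bins, n_bins - 1)
```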


2021, Vol. 2021, pp. 1-15
Author(s): Pham The Bao, Hoang Thi Kieu Trang, Tran Anh Tuan, Tran Thien Thanh, Vo Hong Hai

Medical images of the lungs reveal inhalation and exhalation information useful for treatment and monitoring. A large number of slices covering the lung region forms a three-dimensional volume, and combining such volumes with breath-hold measurements yields a set of multigroup CT images (a 4DCT image set) that shows lung motion and deformation over time. Modeling a respiratory signal that represents a patient's breathing motion, and simulating the inhalation and exhalation process from 4DCT lung images, remains a challenging problem because of the complexity of the data. In this paper, we propose a promising hybrid approach that combines local binary pattern (LBP) histograms with entropy comparison to register the lung images. Segmentation of the left and right lungs is handled by minimum variance quantization and within-class variance techniques, which support the registration stage. Experiments are conducted on the public 4DCT deformable image registration (DIR) database, providing an overall evaluation of each stage (segmentation, registration, and modeling) to validate the effectiveness of the approach.
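The abstract names LBP histograms with an entropy-based comparison but does not spell out the exact measure. The sketch below, using scikit-image's local_binary_pattern, computes normalized uniform-LBP histograms and compares them with a symmetric Kullback-Leibler divergence as one plausible entropy-style similarity; the function names and parameters are illustrative, not the authors' implementation.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(image, p=8, r=1.0):
    """Uniform LBP codes of a 2D slice, summarized as a normalized histogram."""
    codes = local_binary_pattern(image, p, r, method="uniform")
    n_bins = p + 2  # uniform patterns plus one "non-uniform" bin
    hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
    return hist + 1e-12  # avoid log(0) in the comparison

def entropy_distance(h1, h2):
    """Symmetric Kullback-Leibler divergence between two LBP histograms."""
    kl = lambda a, b: np.sum(a * np.log(a / b))
    return 0.5 * (kl(h1, h2) + kl(h2, h1))

# A candidate slice pairing would be scored by entropy_distance of the two
# slices' LBP histograms; lower values indicate more similar local texture.
```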


2021
Author(s): William B. Ashe, Sarah E. Innis, Julia N. Shanno, Camille J. Hochheimer, Ronald D. Williams, et al.

Abstract
Rationale: Breathing motion (respiratory kinematics) can be characterized by the interval and depth of each breath and by magnitude-synchrony relationships between locations. Such characteristics and their breath-by-breath variability might be useful indicators of respiratory health.
Objectives: To enable breath-by-breath characterization of respiratory kinematics, we developed a method to detect breaths using motion sensor signals.
Methods: In 34 volunteers who underwent maximal exercise testing, we used 8 motion sensors to record upper-rib, lower-rib, and abdominal kinematics at 3 exercise stages (rest, lactate threshold, and exhaustion). We recorded volumetric air flow signals using clinical exercise laboratory equipment and synchronized them with the kinematic signals. Using instantaneous phase landmarks from the analytic representation of the kinematic and flow signals, we identified individual breaths and derived respiratory rate signals at 1 Hz. To evaluate the fidelity of the kinematics-derived respiratory rate signals, we calculated their cross-correlation with the flow-derived respiratory rate signals. To identify coupling between kinematics and flow, we calculated the Shannon entropy of the relative frequency with which kinematic phase landmarks were distributed over the phase of the flow cycle.
Measurements and Main Results: We found good agreement between the kinematics-derived and flow-derived respiratory rate signals, with cross-correlation coefficients as high as 0.94. In some individuals, the kinematics and flow were significantly coupled (Shannon entropy < 2), but the relationship varied within individuals (by exercise stage) and between individuals; in others, the kinematic phase landmarks were uniformly distributed over the phase of the air flow signals (Shannon entropy close to the theoretical maximum of 3.32).
Conclusions: The Analysis of Respiratory Kinematics method can yield highly resolved respiratory rate signals by separating individual breaths. This will facilitate characterization of clinically significant breathing motion patterns on a breath-by-breath basis. The relationship between respiratory kinematics and flow is more complex than expected, varying between and within individuals.
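A minimal sketch of the two quantities described in the methods, assuming the analytic (Hilbert) representation mentioned in the abstract: phase-wrap landmarks mark individual breaths, and the Shannon entropy of where those landmarks fall over the flow-phase cycle measures coupling. Ten phase bins are assumed because the stated theoretical maximum of 3.32 bits equals log2(10); the function names and bin count are otherwise hypothetical.

```python
import numpy as np
from scipy.signal import hilbert

def breath_onsets(signal, fs_hz):
    """Times (s) of instantaneous-phase wraps in a kinematic or flow
    signal, taken here as individual breath landmarks."""
    phase = np.angle(hilbert(signal - np.mean(signal)))
    wraps = np.where(np.diff(phase) < -np.pi)[0] + 1  # +pi -> -pi crossings
    return wraps / fs_hz

def phase_coupling_entropy(landmark_times_s, flow_signal, fs_hz, n_bins=10):
    """Shannon entropy (bits) of how kinematic landmarks distribute over
    the flow-phase cycle: low values mean tight coupling, values near
    log2(n_bins) mean the landmarks are spread uniformly (no coupling)."""
    flow_phase = np.angle(hilbert(flow_signal - np.mean(flow_signal)))
    idx = np.clip((np.asarray(landmark_times_s) * fs_hz).astype(int),
                  0, len(flow_signal) - 1)
    hist, _ = np.histogram(flow_phase[idx], bins=n_bins, range=(-np.pi, np.pi))
    p = hist / max(hist.sum(), 1)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())
```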


2021, Vol. 161, pp. S93-S95
Author(s): S. Parveen, M. Parkes, E. Wingate, B. Shingler, S. Green, et al.

2021, Vol. 161, pp. S664-S666
Author(s): C. Stengl, A. Weidner, F. Dinkel, S. Dorsch, A. Runz, et al.

2021, Vol. 161, pp. S730-S731
Author(s): M. Varasteh, A. Mohammad Ali, S. Esteve, F. Göpfert, P. Jeevanandam, et al.

2021
Author(s): Shekh Md Mahmudul Islam

Radar sensing of respiratory motion from unmanned aerial vehicles (UAVs) offers great promise for remote life sensing, especially in post-disaster search and rescue applications. One major challenge for this technology is the management of motion artifacts from the moving UAV platform. Prior research has focused on an adaptive filtering approach, which requires installing a secondary radar module to capture platform motion as a noise reference. This paper investigates the potential of the empirical mode decomposition (EMD) technique for compensating platform motion artifacts using only primary radar measurements. Experimental results demonstrated that the proposed EMD approach can extract the fundamental frequency of the breathing motion from the combined breathing and platform motion using only one radar, with an accuracy above 87%.
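A sketch of the general EMD-based idea, assuming the third-party PyEMD (EMD-signal) package: decompose the radar displacement trace into intrinsic mode functions and keep the one whose spectral energy concentrates in a nominal breathing band. The band limits and IMF-selection rule are illustrative assumptions; the paper's exact procedure is not given in the abstract.

```python
import numpy as np
from PyEMD import EMD  # assumption: PyEMD (pip package "EMD-signal") is available

def breathing_frequency(displacement, fs_hz, band=(0.1, 0.5)):
    """Estimate the fundamental breathing frequency (Hz) from a radar
    displacement trace contaminated by platform motion.

    The trace is decomposed into intrinsic mode functions (IMFs); the IMF
    with the most spectral energy inside a nominal breathing band is taken
    to carry the respiration component.
    """
    x = np.asarray(displacement, dtype=float)
    imfs = EMD().emd(x)                              # rows are IMFs
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs_hz)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    best_spectrum, best_energy = None, -1.0
    for imf in imfs:
        spectrum = np.abs(np.fft.rfft(imf)) ** 2
        energy = spectrum[in_band].sum()
        if energy > best_energy:
            best_spectrum, best_energy = spectrum, energy
    return freqs[in_band][np.argmax(best_spectrum[in_band])]  # x60 = breaths/min
```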


