Unsupervised sleep staging system based on domain adaptation

2021 ◽  
Vol 69 ◽  
pp. 102937
Author(s):  
Ranqi Zhao ◽  
Yi Xia ◽  
Yongliang Zhang
2021 ◽  
Author(s):  
Jiahao Fan ◽  
Hangyu Zhu ◽  
Xinyu Jiang ◽  
Long Meng ◽  
Cong Fu ◽  
...  

Deep sleep staging networks have reached top performance on large-scale datasets. However, these models perform worse when trained and tested on small sleep cohorts because they are data-inefficient. Transferring well-trained models from large-scale datasets (the source domain) to small sleep cohorts (the target domain) is a promising solution but remains challenging due to domain shift. In this work, an unsupervised domain adaptation approach, domain statistics alignment (DSA), is developed to bridge the gap between the data distributions of the source and target domains. DSA adapts the source model to the target domain by modulating the domain-specific statistics of deep features stored in the Batch Normalization (BN) layers. Furthermore, we extend DSA by introducing cross-domain statistics in each BN layer to perform DSA adaptively (AdaDSA). The proposed methods need only the well-trained source model, without access to the source data, which may be proprietary and inaccessible. DSA and AdaDSA are universally applicable to deep sleep staging networks that contain BN layers. We validated the proposed methods through extensive experiments on two state-of-the-art deep sleep staging networks, DeepSleepNet+ and U-time. Performance was evaluated by conducting various transfer tasks on six sleep databases: two large-scale databases, MASS and SHHS, as the source domain, and four small sleep databases as the target domain, among them clinical sleep records acquired at Huashan Hospital, Shanghai. The results show that both DSA and AdaDSA significantly improve the performance of source models on target domains, providing novel insights into the domain generalization problem in sleep staging tasks.
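The core idea described above, replacing or blending per-layer Batch Normalization statistics with statistics estimated from unlabeled target-domain data, can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation: the function names and the mixing coefficient `alpha` are hypothetical, and the paper derives its adaptive cross-domain weighting differently.

```python
import numpy as np

def batch_norm(x, mean, var, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize features x with the supplied (running) statistics."""
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

def dsa_stats(target_feats):
    """DSA-style sketch: estimate per-channel statistics from unlabeled
    target-domain features, to replace the source model's stored BN
    running mean and variance."""
    return target_feats.mean(axis=0), target_feats.var(axis=0)

def adadsa_stats(src_mean, src_var, tgt_mean, tgt_var, alpha=0.5):
    """AdaDSA-style sketch: blend source and target statistics per layer.
    `alpha` in [0, 1] is a hypothetical mixing coefficient used here
    purely for exposition."""
    mean = alpha * tgt_mean + (1.0 - alpha) * src_mean
    var = alpha * tgt_var + (1.0 - alpha) * src_var
    return mean, var
```

Normalizing target features with `dsa_stats(target_feats)` whitens them under the target-domain statistics, which is the alignment effect the approach relies on; no source data and no labels are required.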


2020 ◽  
Vol 21 ◽  
pp. 100453
Author(s):  
Jade Vanbuis ◽  
Mathieu Feuilloy ◽  
Lucile Riaboff ◽  
Guillaume Baffet ◽  
Alain Le Duff ◽  
...  

2020 ◽  
Vol 21 ◽  
pp. 100454
Author(s):  
Jade Vanbuis ◽  
Mathieu Feuilloy ◽  
Guillaume Baffet ◽  
Nicole Meslier ◽  
Frédéric Gagnadoux ◽  
...  

2021 ◽  
pp. 186-202
Author(s):  
Santosh Kumar Satapathy ◽  
Hari Kishan Kondaveeti ◽  
Ravisankar Malladi

SLEEP ◽  
2021 ◽  
Vol 44 (Supplement_2) ◽  
pp. A109-A109
Author(s):  
Pei-Lin Lee ◽  
Nettie Ting ◽  
Yu-Cheng Lin ◽  
Hung-Chih Chiu ◽  
Yu-Ting Liu ◽  
...  

Abstract
Introduction: This study aims to validate an automatic sleep staging system (ASSS) using photoplethysmography (PPG) and accelerometers embedded in smart watches in a community-based population.
Methods: 75 healthy subjects were randomly recruited from 304 staff of an industrial firm who volunteered for this study. A four-stage classifier was designed based on Linear Discriminant Analysis using PPG and accelerometer signals. To better validate system performance, a leave-one-out approach was applied. The performance of the ASSS was assessed with epoch-by-epoch and whole-night agreement for sleep staging against manual scoring of overnight polysomnography.
Results: The mean four-stage agreement across all subjects was 61.1% (95% CI, 58.9-63.2), with a kappa of 0.55 (0.52-0.58). The mean agreement for wake, light sleep (LS), deep sleep (DS), and REM was 53.4%, 84.1%, 40.3%, and 75.6%, respectively. Whole-night agreement was good to excellent (intraclass correlation coefficient, 0.74 to 0.84) for total sleep time, sleep efficiency, wake after sleep onset, and the durations of wake and REM. Agreement was fair for sleep onset and LS duration, but poor for DS duration.
Conclusion: Our results show that PPG- and accelerometer-based smart watches have adequate validity for automatic sleep staging in a community-based population.
Support (if any): "Center for electronics technology integration (NTU-107L900502, 108L900502, 109-2314-B-002-252)" from the Featured Areas Research Center Program within the framework of the Higher Education Sprout Project by the Ministry of Education (MOE) in Taiwan; MediaTek Inc (201802034 RIPD).
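The epoch-by-epoch agreement and kappa reported in these results can be computed from two aligned hypnograms (one stage label per scored epoch). A minimal sketch follows, assuming stage labels such as "W", "LS", "DS", and "REM"; the function names are illustrative and not from the study.

```python
from collections import Counter

def epoch_agreement(manual, auto):
    """Fraction of epochs where the automatic stage matches manual scoring."""
    if len(manual) != len(auto):
        raise ValueError("hypnograms must be epoch-aligned")
    return sum(m == a for m, a in zip(manual, auto)) / len(manual)

def cohens_kappa(manual, auto):
    """Chance-corrected epoch-by-epoch agreement (Cohen's kappa)."""
    n = len(manual)
    po = epoch_agreement(manual, auto)             # observed agreement
    cm, ca = Counter(manual), Counter(auto)
    pe = sum(cm[s] * ca[s] for s in cm) / (n * n)  # expected chance agreement
    return (po - pe) / (1.0 - pe)
```

Kappa discounts the agreement two scorers would reach by chance given their stage distributions, which is why a raw agreement of 61.1% can correspond to a kappa of only 0.55.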


2018 ◽  
Vol 104 ◽  
pp. 277-293 ◽  
Author(s):  
Saman Seifpour ◽  
Hamid Niknazar ◽  
Mohammad Mikaeili ◽  
Ali Motie Nasrabadi
