TY - JOUR
T1 - L-SeqSleepNet
T2 - Whole-cycle Long Sequence Modeling for Automatic Sleep Staging
AU - Phan, Huy
AU - Lorenzen, Kristian P.
AU - Heremans, Elisabeth
AU - Chen, Oliver Y.
AU - Tran, Minh C.
AU - Koch, Philipp
AU - Mertins, Alfred
AU - Baumert, Mathias
AU - Mikkelsen, Kaare B.
AU - De Vos, Maarten
N1 - Publisher Copyright:
© 2013 IEEE.
PY - 2023/10
Y1 - 2023/10
N2 - Human sleep is cyclical with a period of approximately 90 minutes, implying long temporal dependency in the sleep data. Yet, exploiting this long-term dependency when developing sleep staging models has remained largely unexplored. In this work, we show that while encoding the logic of a whole sleep cycle is crucial to improving sleep staging performance, the sequential modelling approaches in existing state-of-the-art deep learning models are inefficient for that purpose. We thus introduce a method for efficient long sequence modelling and propose a new deep learning model, L-SeqSleepNet, which takes into account whole-cycle sleep information for sleep staging. Evaluating L-SeqSleepNet on four distinct databases of various sizes, we demonstrate state-of-the-art performance obtained by the model over three different EEG setups, including scalp EEG in conventional polysomnography (PSG), in-ear EEG, and around-the-ear EEG (cEEGrid), even with a single EEG channel input. Our analyses also show that L-SeqSleepNet is able to alleviate the predominance of N2 sleep (the major class in terms of classification) to bring down errors in other sleep stages. Moreover, the network becomes much more robust: for all subjects on whom the baseline method performed exceptionally poorly, performance is improved significantly. Finally, the computation time only grows at a sub-linear rate as the sequence length increases.
AB - Human sleep is cyclical with a period of approximately 90 minutes, implying long temporal dependency in the sleep data. Yet, exploiting this long-term dependency when developing sleep staging models has remained largely unexplored. In this work, we show that while encoding the logic of a whole sleep cycle is crucial to improving sleep staging performance, the sequential modelling approaches in existing state-of-the-art deep learning models are inefficient for that purpose. We thus introduce a method for efficient long sequence modelling and propose a new deep learning model, L-SeqSleepNet, which takes into account whole-cycle sleep information for sleep staging. Evaluating L-SeqSleepNet on four distinct databases of various sizes, we demonstrate state-of-the-art performance obtained by the model over three different EEG setups, including scalp EEG in conventional polysomnography (PSG), in-ear EEG, and around-the-ear EEG (cEEGrid), even with a single EEG channel input. Our analyses also show that L-SeqSleepNet is able to alleviate the predominance of N2 sleep (the major class in terms of classification) to bring down errors in other sleep stages. Moreover, the network becomes much more robust: for all subjects on whom the baseline method performed exceptionally poorly, performance is improved significantly. Finally, the computation time only grows at a sub-linear rate as the sequence length increases.
KW - Automatic sleep staging
KW - deep neural network
KW - long sequence modelling
KW - sequence-to-sequence
UR - http://www.scopus.com/inward/record.url?scp=85167803048&partnerID=8YFLogxK
U2 - 10.1109/JBHI.2023.3303197
DO - 10.1109/JBHI.2023.3303197
M3 - Journal article
C2 - 37552591
AN - SCOPUS:85167803048
SN - 2168-2194
VL - 27
SP - 4748
EP - 4757
JO - IEEE Journal of Biomedical and Health Informatics
JF - IEEE Journal of Biomedical and Health Informatics
IS - 10
ER -