Personalized automatic sleep staging with single-night data: A pilot study with Kullback-Leibler divergence regularization

Publication: Contribution to journal – Journal article – Research – peer review

Standard

Personalized automatic sleep staging with single-night data: A pilot study with Kullback-Leibler divergence regularization. / Phan, Huy; Mikkelsen, Kaare; Chén, Oliver Y.; Koch, Philipp; Mertins, Alfred; Kidmose, Preben; De Vos, Maarten.

In: Physiological Measurement, Vol. 41, No. 6, 064004, 2020.



Author

Phan, Huy; Mikkelsen, Kaare; Chén, Oliver Y.; Koch, Philipp; Mertins, Alfred; Kidmose, Preben; De Vos, Maarten. / Personalized automatic sleep staging with single-night data: A pilot study with Kullback-Leibler divergence regularization. In: Physiological Measurement. 2020; Vol. 41, No. 6.

Bibtex

@article{452668944c8d42978db067e03c499ca2,
title = "Personalized automatic sleep staging with single-night data: A pilot study with Kullback-Leibler divergence regularization",
abstract = "Objective: Brain waves vary between people. This work aims to improve automatic sleep staging for longitudinal sleep monitoring via personalization of algorithms based on individual characteristics extracted from sleep data recorded during the first night. Approach: As the amount of data from a single night is very small, making model training difficult, we propose a Kullback-Leibler (KL) divergence regularized transfer learning approach to address this problem. We employ the pretrained SeqSleepNet (i.e. the subject-independent model) as a starting point and finetune it with the single-night personalization data to derive the personalized model. This is done by adding the KL divergence between the output of the subject-independent model and that of the personalized model to the loss function during finetuning. In effect, KL-divergence regularization prevents the personalized model from overfitting to the single-night data and straying too far away from the subject-independent model. Main results: Experimental results on the Sleep-EDF Expanded database, consisting of 75 subjects, show that sleep staging personalization with single-night data is possible with the help of the proposed KL-divergence regularization. On average, we achieve a personalized sleep staging accuracy of 79.6%, a Cohen's kappa of 0.706, a macro F1-score of 73.0%, a sensitivity of 71.8%, and a specificity of 94.2%. Significance: We find both that the approach is robust against overfitting and that it improves the accuracy by 4.5 percentage points compared to the baseline method without personalization and by 2.2 percentage points compared to the baseline with personalization but without regularization.",
keywords = "automatic sleep staging, KL-divergence regularization, personalization, single-night data, transfer learning",
author = "Huy Phan and Kaare Mikkelsen and Ch{\'e}n, {Oliver Y.} and Philipp Koch and Alfred Mertins and Preben Kidmose and {De Vos}, Maarten",
year = "2020",
doi = "10.1088/1361-6579/ab921e",
language = "English",
volume = "41",
journal = "Physiological Measurement",
issn = "0967-3334",
publisher = "Institute of Physics Publishing Ltd.",
number = "6",
}
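The KL-regularized finetuning objective described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the regularization weight `lam`, the direction of the KL term, and all function names here are assumptions made for the sketch.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) for rows of discrete distributions, averaged over samples."""
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return np.mean(np.sum(p * np.log(p / q), axis=-1))

def regularized_loss(y_true, q_personalized, p_pretrained, lam=0.1):
    """Cross-entropy on the single personalization night, plus a KL penalty
    that discourages the finetuned model from straying far from the
    subject-independent model's predictions."""
    eps = 1e-12
    ce = -np.mean(np.sum(y_true * np.log(np.clip(q_personalized, eps, 1.0)),
                         axis=-1))
    return ce + lam * kl_divergence(p_pretrained, q_personalized)
```

With `lam = 0` this reduces to plain finetuning on the single night; larger values of `lam` pull the personalized model's output distribution back toward the pretrained one, which is the overfitting safeguard the abstract describes.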

RIS

TY - JOUR

T1 - Personalized automatic sleep staging with single-night data

T2 - A pilot study with Kullback-Leibler divergence regularization

AU - Phan, Huy

AU - Mikkelsen, Kaare

AU - Chén, Oliver Y.

AU - Koch, Philipp

AU - Mertins, Alfred

AU - Kidmose, Preben

AU - De Vos, Maarten

PY - 2020

Y1 - 2020

N2 - Objective: Brain waves vary between people. This work aims to improve automatic sleep staging for longitudinal sleep monitoring via personalization of algorithms based on individual characteristics extracted from sleep data recorded during the first night. Approach: As the amount of data from a single night is very small, making model training difficult, we propose a Kullback-Leibler (KL) divergence regularized transfer learning approach to address this problem. We employ the pretrained SeqSleepNet (i.e. the subject-independent model) as a starting point and finetune it with the single-night personalization data to derive the personalized model. This is done by adding the KL divergence between the output of the subject-independent model and that of the personalized model to the loss function during finetuning. In effect, KL-divergence regularization prevents the personalized model from overfitting to the single-night data and straying too far away from the subject-independent model. Main results: Experimental results on the Sleep-EDF Expanded database, consisting of 75 subjects, show that sleep staging personalization with single-night data is possible with the help of the proposed KL-divergence regularization. On average, we achieve a personalized sleep staging accuracy of 79.6%, a Cohen's kappa of 0.706, a macro F1-score of 73.0%, a sensitivity of 71.8%, and a specificity of 94.2%. Significance: We find both that the approach is robust against overfitting and that it improves the accuracy by 4.5 percentage points compared to the baseline method without personalization and by 2.2 percentage points compared to the baseline with personalization but without regularization.

AB - Objective: Brain waves vary between people. This work aims to improve automatic sleep staging for longitudinal sleep monitoring via personalization of algorithms based on individual characteristics extracted from sleep data recorded during the first night. Approach: As the amount of data from a single night is very small, making model training difficult, we propose a Kullback-Leibler (KL) divergence regularized transfer learning approach to address this problem. We employ the pretrained SeqSleepNet (i.e. the subject-independent model) as a starting point and finetune it with the single-night personalization data to derive the personalized model. This is done by adding the KL divergence between the output of the subject-independent model and that of the personalized model to the loss function during finetuning. In effect, KL-divergence regularization prevents the personalized model from overfitting to the single-night data and straying too far away from the subject-independent model. Main results: Experimental results on the Sleep-EDF Expanded database, consisting of 75 subjects, show that sleep staging personalization with single-night data is possible with the help of the proposed KL-divergence regularization. On average, we achieve a personalized sleep staging accuracy of 79.6%, a Cohen's kappa of 0.706, a macro F1-score of 73.0%, a sensitivity of 71.8%, and a specificity of 94.2%. Significance: We find both that the approach is robust against overfitting and that it improves the accuracy by 4.5 percentage points compared to the baseline method without personalization and by 2.2 percentage points compared to the baseline with personalization but without regularization.

KW - automatic sleep staging

KW - KL-divergence regularization

KW - personalization

KW - single-night data

KW - transfer learning

UR - http://www.scopus.com/inward/record.url?scp=85087530738&partnerID=8YFLogxK

U2 - 10.1088/1361-6579/ab921e

DO - 10.1088/1361-6579/ab921e

M3 - Journal article

C2 - 32392550

AN - SCOPUS:85087530738

VL - 41

JO - Physiological Measurement

JF - Physiological Measurement

SN - 0967-3334

IS - 6

M1 - 064004

ER -