Search Results for author: Stanislas Chambon

Found 4 papers, 3 papers with code

Towards a Flexible Deep Learning Method for Automatic Detection of Clinically Relevant Multi-Modal Events in the Polysomnogram

no code implementations • 16 May 2019 • Alexander Neergaard Olesen, Stanislas Chambon, Valentin Thorey, Poul Jennum, Emmanuel Mignot, Helge B. D. Sorensen

Much attention has been given to automatic sleep staging algorithms in recent years, but the detection of discrete events in sleep studies is also crucial for the precise characterization of sleep patterns and the possible diagnosis of sleep disorders.

Multimodal Sleep Stage Detection, Sleep Staging

DOSED: a deep learning approach to detect multiple sleep micro-events in EEG signal

1 code implementation • 7 Dec 2018 • Stanislas Chambon, Valentin Thorey, Pierrick J. Arnal, Emmanuel Mignot, Alexandre Gramfort

The proposed approach, applied here to sleep-related micro-architecture events, is inspired by object detectors developed for computer vision, such as YOLO and SSD.
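To illustrate what an SSD/YOLO-style detector looks like when transposed to 1D physiological signals, here is a minimal sketch in PyTorch. It is not the authors' released DOSED implementation; the channel counts, layer sizes, number of default event windows, and class count are illustrative assumptions. The key idea it reproduces is predicting, for each default event window, both class scores and localization offsets.

```python
# Minimal sketch (PyTorch, hypothetical sizes) of an SSD-style 1D event detector
# for EEG windows. NOT the authors' released DOSED code.
import torch
import torch.nn as nn

class EventDetector1D(nn.Module):
    def __init__(self, n_channels=1, n_classes=2, n_defaults=32):
        super().__init__()
        # Small 1D convolutional backbone that downsamples the raw signal.
        self.backbone = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(n_defaults),  # one feature column per default event window
        )
        # Per-default-event heads, as in SSD/YOLO:
        #  - class scores (event classes + background)
        #  - localization offsets (center, duration) relative to the default window
        self.cls_head = nn.Conv1d(32, n_classes + 1, kernel_size=1)
        self.loc_head = nn.Conv1d(32, 2, kernel_size=1)

    def forward(self, x):
        # x: (batch, n_channels, n_samples), e.g. a 20 s EEG segment
        feats = self.backbone(x)              # (batch, 32, n_defaults)
        class_logits = self.cls_head(feats)   # (batch, n_classes + 1, n_defaults)
        loc_offsets = self.loc_head(feats)    # (batch, 2, n_defaults)
        return class_logits, loc_offsets

# Example: score 32 default event windows on a batch of 4 single-channel EEG segments.
model = EventDetector1D(n_channels=1, n_classes=2, n_defaults=32)
x = torch.randn(4, 1, 2560)                  # e.g. 20 s at an assumed 128 Hz
class_logits, loc_offsets = model(x)
print(class_logits.shape, loc_offsets.shape) # torch.Size([4, 3, 32]) torch.Size([4, 2, 32])
```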

EEG, K-complex detection +5

A deep learning architecture to detect events in EEG signals during sleep

1 code implementation • 11 Jul 2018 • Stanislas Chambon, Valentin Thorey, Pierrick J. Arnal, Emmanuel Mignot, Alexandre Gramfort

Annotating such events requires a trained sleep expert; it is a time-consuming and tedious process with large inter-scorer variability.

EEG, Event Detection +2

A deep learning architecture for temporal sleep stage classification using multivariate and multimodal time series

1 code implementation • 5 Jul 2017 • Stanislas Chambon, Mathieu Galtier, Pierrick Arnal, Gilles Wainrib, Alexandre Gramfort

We introduce here the first deep learning approach for sleep stage classification that learns end-to-end without computing spectrograms or extracting hand-crafted features, that exploits all multivariate and multimodal polysomnography (PSG) signals (EEG, EMG and EOG), and that can exploit the temporal context of each 30-second window of data.
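The sketch below illustrates that kind of architecture in PyTorch: per-modality convolutional branches over raw 30-second windows (no spectrograms or hand-crafted features), with temporal context handled by concatenating features from neighbouring windows before classification. It is not the authors' released code; the branch sizes, channel counts, sampling rate, and context width are illustrative assumptions.

```python
# Minimal sketch (PyTorch, hypothetical sizes) of an end-to-end multimodal sleep-stage
# classifier over raw EEG/EOG/EMG 30 s windows with temporal context. NOT the authors' code.
import torch
import torch.nn as nn

def conv_branch(n_channels):
    # Shared structure for each modality: raw signal -> fixed-size feature vector.
    return nn.Sequential(
        nn.Conv1d(n_channels, 8, kernel_size=64, stride=8), nn.ReLU(),
        nn.Conv1d(8, 16, kernel_size=16, stride=4), nn.ReLU(),
        nn.AdaptiveAvgPool1d(1), nn.Flatten(),   # -> (batch, 16)
    )

class MultimodalSleepStager(nn.Module):
    def __init__(self, n_eeg=2, n_eog=2, n_emg=1, context=1, n_stages=5):
        super().__init__()
        self.eeg = conv_branch(n_eeg)
        self.eog = conv_branch(n_eog)
        self.emg = conv_branch(n_emg)
        n_windows = 2 * context + 1              # current window plus `context` windows each side
        self.classifier = nn.Linear(3 * 16 * n_windows, n_stages)

    def features(self, eeg, eog, emg):
        # Concatenate the per-modality feature vectors of one 30 s window.
        return torch.cat([self.eeg(eeg), self.eog(eog), self.emg(emg)], dim=1)

    def forward(self, windows):
        # windows: list of (eeg, eog, emg) tuples, one per 30 s window in the temporal context
        feats = torch.cat([self.features(*w) for w in windows], dim=1)
        return self.classifier(feats)            # (batch, n_stages)

# Example: classify the central window using one window of context on each side.
win = 30 * 128                                   # 30 s at an assumed 128 Hz sampling rate
make = lambda: (torch.randn(4, 2, win), torch.randn(4, 2, win), torch.randn(4, 1, win))
model = MultimodalSleepStager(context=1)
logits = model([make(), make(), make()])
print(logits.shape)                              # torch.Size([4, 5])
```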

Classification, EEG +3
