Automated sleep scoring using unsupervised learning of meta-features


  1. Automated sleep scoring using unsupervised learning of meta-features
      DD221X: Degree project in Computer Science
      May 3rd, 2016
      Sebastian Olsson

  2. What is sleep scoring?
      ● Judgments about a sleeping individual
      ● Sleep stages: REM, N1, N2, N3, Awake

  3. Hypnogram
      ● Sleep stage graph

  4. Electroencephalogram (EEG)

  5. Electroencephalogram (EEG)

  6. Electroencephalogram (EEG): N3

  7. Electroencephalogram (EEG): N3, N1

  8. Electroencephalogram (EEG): N3, N1, REM

  9. Electroencephalogram (EEG): N3, N1, REM, Awake

  10. Automated sleep stage scoring

  11. Automated sleep stage scoring

  12. Automated sleep stage scoring
      ● Compare: 100 % agreement
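
The comparison in slides 10–12 is between an automatically produced hypnogram and a human scorer's. As an illustration only (the stage arrays and the agreement measure below are not taken from the thesis), epoch-by-epoch percent agreement can be computed like this:

```python
# Illustrative only: epoch-by-epoch agreement between two hypnograms.
# The stage labels and the example arrays are made up for this sketch.
import numpy as np

def percent_agreement(hypnogram_a, hypnogram_b):
    """Fraction of 30 s epochs scored identically by both scorers, in percent."""
    a = np.asarray(hypnogram_a)
    b = np.asarray(hypnogram_b)
    return np.mean(a == b) * 100

auto  = ["Awake", "N1", "N2", "N2", "N3", "REM"]
human = ["Awake", "N1", "N2", "N3", "N3", "REM"]
print(f"{percent_agreement(auto, human):.1f} % agreement")  # 83.3 %
```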

  13. Problem statement (A)

  14. Problem statement (B)

  15. Problem statement (B)
      ● Deep belief net (DBN)
      ● Compare approaches

  16. Method

  17. Data
      ● SHHS1
      ● 10 records
      ● Annotations

  18. Segmentation
      ● 30 s segments
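
As a minimal sketch of the segmentation step, the recording can be cut into non-overlapping 30 s windows. The sampling rate and the signal below are placeholders, not values taken from the slides:

```python
# A minimal sketch of the 30 s segmentation step. The sampling rate is an
# assumption, and the signal here is random noise standing in for a real recording.
import numpy as np

def segment(signal, fs, epoch_seconds=30):
    """Split a 1-D signal into non-overlapping epochs of epoch_seconds each."""
    samples_per_epoch = int(fs * epoch_seconds)
    n_epochs = len(signal) // samples_per_epoch          # drop any trailing remainder
    return signal[: n_epochs * samples_per_epoch].reshape(n_epochs, samples_per_epoch)

fs = 125                                   # Hz (assumed, not stated on the slide)
eeg = np.random.randn(fs * 60 * 5)         # 5 minutes of fake EEG
epochs = segment(eeg, fs)
print(epochs.shape)                        # (10, 3750): ten 30 s epochs
```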

  19. Feature extraction
      ● Mean, variance
      ● Example: (12, 0.8), (15, 0.3)

  20. Features
      ● Mean
      ● Variance
      ● Skewness
      ● Kurtosis
      ● Hjorth mobility
      ● Hjorth complexity
      ● Amplitude
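
A sketch of how the seven per-epoch features listed above could be computed. The exact definitions used in the thesis are not given on the slide; in particular, "amplitude" is assumed here to mean peak-to-peak amplitude, and the Hjorth parameters follow their textbook definitions:

```python
# Sketch of the seven per-epoch features named on the slide; definitions
# for "amplitude" and the Hjorth parameters are assumptions, not thesis details.
import numpy as np
from scipy.stats import skew, kurtosis

def hjorth_mobility(x):
    return np.sqrt(np.var(np.diff(x)) / np.var(x))

def extract_features(epoch):
    return np.array([
        np.mean(epoch),                                            # mean
        np.var(epoch),                                             # variance
        skew(epoch),                                               # skewness
        kurtosis(epoch),                                           # kurtosis
        hjorth_mobility(epoch),                                    # Hjorth mobility
        hjorth_mobility(np.diff(epoch)) / hjorth_mobility(epoch),  # Hjorth complexity
        np.ptp(epoch),                                             # amplitude (peak-to-peak, assumed)
    ])

epochs = np.random.randn(10, 3750)                    # stand-in for segmented EEG
X = np.vstack([extract_features(e) for e in epochs])
print(X.shape)                                        # (10, 7): one 7-D vector per epoch
```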

  21. Partitioning
      ● 75 % training set
      ● 25 % test set
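
A minimal sketch of the 75 % / 25 % partitioning using scikit-learn. Whether the thesis split randomly or per record is not stated on the slide; a stratified random split is shown here as one plausible choice:

```python
# Illustrative 75/25 split; the data and labels are placeholders.
import numpy as np
from sklearn.model_selection import train_test_split

X = np.random.randn(1000, 7)               # fake 7-dimensional feature vectors
y = np.random.randint(0, 5, size=1000)     # fake stage labels (5 classes)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)
print(len(X_train), len(X_test))           # 750 250
```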

  22. Feature selection
      ● Find a decent combination of features
      ● Strip away unwanted features
        ○ Curse of dimensionality
      ● Inspired by Löfhede [1]
      ● Genetic algorithm
        ○ Roulette-wheel selection
        ○ Mutation rate: 0.2
        ○ Crossover rate: 1.0
        ○ Number of generations: 5
        ○ Population size: 5
        ○ Chromosome length: 7
      ● “Cross-validation”
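
A sketch of genetic-algorithm feature selection with the parameters listed above (roulette-wheel selection, mutation rate 0.2, crossover rate 1.0, 5 generations, population size 5, chromosome length 7). The fitness function below, cross-validated SVM accuracy on the selected feature subset, is an assumption; the slide only mentions "cross-validation":

```python
# Sketch of GA-based feature subset selection with the slide's parameters.
# The fitness function and toy data are assumptions for illustration.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
MUTATION_RATE, CROSSOVER_RATE = 0.2, 1.0
GENERATIONS, POP_SIZE, CHROM_LEN = 5, 5, 7

def fitness(chromosome, X, y):
    """Cross-validated accuracy using only the features selected by the chromosome."""
    if not chromosome.any():
        return 0.0
    return cross_val_score(SVC(kernel="linear"), X[:, chromosome], y, cv=3).mean()

def roulette_select(population, scores):
    weights = np.asarray(scores, dtype=float) + 1e-12   # avoid division by zero
    probs = weights / weights.sum()
    return population[rng.choice(len(population), p=probs)]

def crossover(a, b):
    point = rng.integers(1, CHROM_LEN)                  # single-point crossover
    return np.concatenate([a[:point], b[point:]])

def mutate(chromosome):
    flips = rng.random(CHROM_LEN) < MUTATION_RATE       # flip each bit with p = 0.2
    return chromosome ^ flips

def select_features(X, y):
    population = rng.random((POP_SIZE, CHROM_LEN)) < 0.5   # boolean masks over the 7 features
    for _ in range(GENERATIONS):
        scores = [fitness(c, X, y) for c in population]
        children = []
        for _ in range(POP_SIZE):
            a, b = roulette_select(population, scores), roulette_select(population, scores)
            child = crossover(a, b) if rng.random() < CROSSOVER_RATE else a.copy()
            children.append(mutate(child))
        population = np.array(children)
    scores = [fitness(c, X, y) for c in population]
    return population[int(np.argmax(scores))]

X = rng.standard_normal((200, CHROM_LEN))   # stand-in for extracted feature vectors
y = rng.integers(0, 5, size=200)            # stand-in for stage labels
print(select_features(X, y))                # boolean mask of the selected features
```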

  23. Feature classification
      ● Support vector machine (SVM)
        ○ Linear kernel
        ○ Radial basis function (RBF) kernel
      ● Trained using the training set
      ● Evaluated using the test set
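
A minimal sketch of the classification step: one SVM with a linear kernel and one with an RBF kernel, trained on the training set and scored on the test set. The hyperparameters are scikit-learn defaults, not values from the thesis:

```python
# Illustrative SVM training/evaluation with linear and RBF kernels.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X = np.random.randn(1000, 7)                # placeholder features
y = np.random.randint(0, 5, size=1000)      # placeholder stage labels
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

for kernel in ("linear", "rbf"):
    clf = make_pipeline(StandardScaler(), SVC(kernel=kernel))
    clf.fit(X_train, y_train)                       # train on the training set
    print(kernel, clf.score(X_test, y_test))        # accuracy on the test set
```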

  24. Unsupervised DBN processing
      ● DBN
        ○ Two stacked Restricted Boltzmann machines (RBMs)
      ● Based on Längkvist [2]
      1. Pre-training
      2. Unsupervised fine-tuning with backpropagation
      3. Propagate the feature space through the network
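
A rough sketch of the unsupervised DBN step using scikit-learn's BernoulliRBM: two RBMs are pre-trained layer by layer and the feature space is then propagated through the network. The unsupervised fine-tuning with backpropagation (step 2 above) is omitted, and the layer sizes and learning rates are placeholders rather than the values of the thesis, which follows Längkvist [2]:

```python
# Sketch only: layer-wise pre-training of two stacked RBMs and forward
# propagation of the features. Fine-tuning with backpropagation is not shown.
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.preprocessing import MinMaxScaler

X = np.random.randn(1000, 7)                       # stand-in for the 7-D feature vectors
X01 = MinMaxScaler().fit_transform(X)              # BernoulliRBM expects values in [0, 1]

rbm1 = BernoulliRBM(n_components=20, learning_rate=0.05, n_iter=30, random_state=0)
rbm2 = BernoulliRBM(n_components=3,  learning_rate=0.05, n_iter=30, random_state=0)

H1 = rbm1.fit_transform(X01)                       # pre-train first RBM, propagate
meta = rbm2.fit_transform(H1)                      # pre-train second RBM on its output
print(meta.shape)                                  # (1000, 3): three meta-features
```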

  25. Unsupervised DBN processing
      ● Three meta-features
      ● Appended to the feature vector: (x₁, …, x₇) ↦ (x₁, …, x₇, m₁, m₂, m₃)
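
Appending the meta-features is a simple concatenation; the arrays below are stand-ins:

```python
# Append the three DBN meta-features to each 7-D feature vector, giving 10-D vectors.
import numpy as np

X = np.random.randn(1000, 7)        # original features (x1, ..., x7)
meta = np.random.randn(1000, 3)     # stand-in for the three DBN meta-features
X_extended = np.hstack([X, meta])   # (x1, ..., x7, m1, m2, m3)
print(X_extended.shape)             # (1000, 10)
```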

  26. Evaluation
      ● 10 ∙ 3 ∙ 2 ∙ 2 = 120 evaluations
        ○ # records: 10
        ○ # re-runs: 3
        ○ with/without DBN: 2
        ○ linear/RBF kernel: 2
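
The 120 runs follow directly from the grid above; a sketch of enumerating them:

```python
# Enumerate the evaluation grid: 10 records x 3 re-runs x with/without DBN x kernel.
from itertools import product

records = range(10)
reruns  = range(3)
use_dbn = (False, True)
kernels = ("linear", "rbf")

runs = list(product(records, reruns, use_dbn, kernels))
print(len(runs))   # 120
```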

  27. Results

  28. Results
      ● Scorer A, linear kernel
      ● Scorer B, linear kernel
      ● Scorer A, RBF kernel
      ● Scorer B, RBF kernel

  29. Results

  30. Results

  31. Conclusion
      ● Unsupervised DBN processing did not help
      ● Effect too small to be noticeable
      ● Approach too specific to arrive at a general conclusion

  32. Future work
      ● Simplify
        ○ Skip feature selection
      ● Append/replace
      ● Try different parameters, e.g.
        ○ Number of meta-features (output nodes)
        ○ Number of RBMs
        ○ Number of hidden layer units
        ○ Epochs
        ○ Initial biases

  33. References
      ● [1] J. Löfhede (2009). The EEG of the Neonatal Brain - Classification of Background Activity.
      ● [2] M. Längkvist (2012). Sleep Stage Classification Using Unsupervised Feature Learning.

  34. Images
      ● Licensed under CC BY-SA 3.0:
        ○ https://commons.wikimedia.org/wiki/File:HYPNOGRAM_created_by_Natasha_k.jpg
      ● Licensed under CC BY-SA 4.0:
        ○ https://commons.wikimedia.org/wiki/File:Sleep_scoring.png
      ● Public domain:
        ○ https://commons.wikimedia.org/wiki/File:1st-eeg.png
        ○ https://pixabay.com/en/scientist-professor-man-researcher-28748/
        ○ https://pixabay.com/en/artificial-intelligence-155161/
      ● Licensed under CC0 1.0:
        ○ All remaining images

  35. The End
