CCECE 2003: Signal Classification through Multifractal Analysis and Complex Domain Neural Networks

  1. Signal Classification through Multifractal Analysis and Complex Domain Neural Networks
     V. Cheung, K. Cannons, W. Kinsner, and J. Pear*
     Department of Electrical & Computer Engineering, Signal and Data Compression Laboratory
     *Department of Psychology
     University of Manitoba, Winnipeg, Manitoba, Canada
     May 5, 2003

  2. Outline
     ● Introduction
     ● Background
       ► Variance fractal dimension trajectory
       ► Kohonen self-organizing feature map
       ► Probabilistic neural network
       ► Complex domain neural network
     ● Experimental Results and Discussion
     ● Conclusion

  3. Introduction
     ● Classification of signals that are:
       ► Stochastic
       ► Self-affine
       ► Non-stationary
       ► Multivariate
       ► From non-linear systems
     ● E.g., multi-channel speech signals, multi-lead ECGs or EEGs

  4. Fish Dishabituation Signals
     [Figure: fish tank with a mirror; recording axes labelled X, Y, and Z]

  5. System Design
     [Block diagram: Signals → Feature Extraction (VFDT, SOFM) → Classification (PNN, CNN) → Signal Classification]
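
The block diagram corresponds to a short processing chain: extract a VFDT from each raw signal, optionally compress it with an SOFM, and hand the result to a PNN or CNN. The Python sketch below only wires these stages together; the name classify_signal and the lambda stand-ins are hypothetical placeholders, with the real stages sketched under the later slides.

```python
import numpy as np

def classify_signal(signal, extract_vfdt, reduce_features, classify):
    """Hypothetical wiring of the pipeline on this slide:
    Signals -> VFDT feature extraction -> (optional) SOFM reduction -> classifier."""
    features = extract_vfdt(signal)        # variance fractal dimension trajectory
    features = reduce_features(features)   # e.g. SOFM dimensionality reduction
    return classify(features)              # PNN or CNN decision

# Throwaway stand-ins, for illustration only:
label = classify_signal(
    np.random.randn(4096),
    extract_vfdt=lambda s: s[:32],             # placeholder feature extractor
    reduce_features=lambda f: f,               # identity (no reduction)
    classify=lambda f: int(f.mean() > 0),      # placeholder two-class decision
)
print(label)
```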

  6. Variance Fractal Dimension Trajectory (VFDT)
     ● Temporal multifractal characterization
       ► Calculate the variance fractal dimension of a small segment of the signal in a sliding-window fashion over the entire signal [Kins94]
       ► Reveals the underlying complexity of the signal
       ► Provides a normalizing effect
     ● Advantages of the variance fractal dimension
       ► Easy to compute
         ■ Measure the variance of amplitude increments at different scales
       ► Can be computed in real time
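
As a concrete reading of the bullets above, the sketch below estimates one variance fractal dimension per window by measuring the variance of amplitude increments at several time lags and fitting log-variance against log-lag; the Hurst exponent is half the slope and D = 2 - H for a 1-D signal, following the variance-dimension idea in [Kins94]. The window length, hop, and lag set are illustrative choices, not the parameters used in the paper.

```python
import numpy as np

def variance_fractal_dimension(x, lags=(1, 2, 4, 8, 16)):
    """Variance fractal dimension of one window: fit log Var[x(t+k) - x(t)]
    against log k; H = slope / 2 and D = 2 - H for a 1-D signal."""
    log_lag, log_var = [], []
    for k in lags:
        inc = x[k:] - x[:-k]            # amplitude increments at scale k
        v = np.var(inc)
        if v > 0:
            log_lag.append(np.log(k))
            log_var.append(np.log(v))
    slope, _ = np.polyfit(log_lag, log_var, 1)   # slope = 2H
    return 2.0 - slope / 2.0

def vfdt(signal, win=512, hop=128):
    """Variance fractal dimension trajectory: D computed over a sliding window."""
    return np.array([
        variance_fractal_dimension(signal[i:i + win])
        for i in range(0, len(signal) - win + 1, hop)
    ])

# Sanity check: ordinary Brownian motion (cumulative-summed white noise)
# has H ~= 0.5, so its VFDT should hover around D ~= 1.5.
brown = np.cumsum(np.random.default_rng(0).standard_normal(8192))
print(vfdt(brown).mean())
```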

  7. VFDT Plot
     [Figure: VFDT plot]

  8. Self-Organizing Feature Maps (SOFM)
     ● Topology-preserving neural networks using competitive unsupervised learning [Koho84]
     ● Two uses in this paper
       ► Clustering
         ■ Aid in constructing the training and testing sets
       ► Feature extraction
         ■ Dimensionality reduction
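
A compact numpy sketch of a Kohonen map in the spirit of [Koho84] follows; the grid size, learning-rate decay, and neighborhood schedule are arbitrary illustrative choices. Mapping a sample to its best-matching unit supports the clustering use, and taking that unit's 2-D grid coordinates as the new representation gives the dimensionality-reduction use mentioned above.

```python
import numpy as np

def train_sofm(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal 2-D Kohonen SOFM trained by competitive unsupervised learning."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    w = rng.standard_normal((rows, cols, data.shape[1]))   # weight grid
    gy, gx = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
    n_steps = epochs * len(data)
    t = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            lr = lr0 * np.exp(-t / n_steps)        # decaying learning rate
            sigma = sigma0 * np.exp(-t / n_steps)  # shrinking neighborhood
            d = np.linalg.norm(w - x, axis=2)      # distance to every unit
            by, bx = np.unravel_index(np.argmin(d), d.shape)   # best-matching unit
            h = np.exp(-((gy - by) ** 2 + (gx - bx) ** 2) / (2 * sigma ** 2))
            w += lr * h[..., None] * (x - w)       # pull BMU and neighbours toward x
            t += 1
    return w

def bmu_coords(w, x):
    """Dimensionality reduction: feature vector -> (row, col) of its BMU."""
    d = np.linalg.norm(w - x, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)

# Usage sketch: map 200 random 16-dimensional feature vectors onto a 6x6 grid.
data = np.random.default_rng(1).standard_normal((200, 16))
w = train_sofm(data, grid=(6, 6), epochs=5)
print(bmu_coords(w, data[0]))    # grid coordinates of the first sample's BMU
```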

  9. Probabilistic Neural Networks (PNN)
     ● Neural network implementation of the Bayes optimal decision rule [Spec88]
       ► E.g., spam filters
     ● Advantages
       ► Asymptotically Bayes optimal
         ■ Good classifiers
       ► Trains orders of magnitude faster than other NNs
     ● Disadvantages
       ► Slower execution than other NNs
       ► Requires large amounts of memory
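
The decision rule of a basic PNN is short enough to sketch: every stored training sample contributes a Gaussian Parzen kernel, the kernels are summed per class and weighted by the class prior, and the largest score wins (after [Spec88]). The smoothing width sigma and the toy data below are placeholders.

```python
import numpy as np

def pnn_classify(x, train_x, train_y, sigma=0.1):
    """Minimal PNN decision: argmax over classes of prior * Parzen density."""
    classes = np.unique(train_y)
    n = len(train_y)
    scores = []
    for c in classes:
        xc = train_x[train_y == c]
        # Pattern layer: one Gaussian unit per stored training sample.
        d2 = np.sum((xc - x) ** 2, axis=1)
        kernel = np.exp(-d2 / (2.0 * sigma ** 2))
        # Summation layer: mean activation weighted by the class prior.
        scores.append(kernel.mean() * (len(xc) / n))
    return classes[int(np.argmax(scores))]

# Usage sketch with two toy Gaussian classes in a 4-dimensional feature space.
rng = np.random.default_rng(1)
train_x = np.vstack([rng.normal(0.0, 0.1, (50, 4)), rng.normal(1.0, 0.1, (50, 4))])
train_y = np.array([0] * 50 + [1] * 50)
print(pnn_classify(np.full(4, 0.9), train_x, train_y))   # expected: class 1
```

Because "training" is just storing the samples, a PNN trains almost instantly, but every classification touches the whole training set, which is exactly the execution-speed and memory trade-off listed above.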

  10. Complex Domain Neural Networks (CNN)
      ● Advantages
        ► Works with inputs in their natural complex-valued form
        ► Faster training
        ► Better generalization
      ● Disadvantages
        ► More complexity
          ■ Convoluted partial derivatives involving complex analysis
      ● Ref: [Mast94]
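
A forward pass makes the "natural complex-valued form" point concrete. The sketch below keeps all inputs, weights, biases, and activations complex and uses a split tanh nonlinearity (tanh applied to the real and imaginary parts separately); that activation and the layer sizes are assumptions for illustration, not necessarily the choices in [Mast94], and training is omitted because it requires the complex-domain partial derivatives noted above.

```python
import numpy as np

def split_tanh(z):
    """One common complex activation: tanh applied separately to Re and Im."""
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

def complex_mlp_forward(x, w1, b1, w2, b2):
    """Forward pass of a small complex-domain feed-forward network."""
    h = split_tanh(w1 @ x + b1)      # hidden layer, complex weights and biases
    return split_tanh(w2 @ h + b2)   # output layer

# Usage sketch: 3 complex inputs -> 2 hidden units -> 1 complex output
# (sizes chosen arbitrarily for illustration).
rng = np.random.default_rng(2)
def cmat(shape):
    return rng.standard_normal(shape) + 1j * rng.standard_normal(shape)

x = cmat((3,))
print(complex_mlp_forward(x, cmat((2, 3)), cmat((2,)), cmat((1, 2)), cmat((1,))))
```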

  11. CNN Architecture
      [Figure: feed-forward network with an input layer, one hidden layer, and an output layer]

  12. Experiment #1
      [Block diagram: Signals → X-axis VFDT → PNN → Classification]

                             Experimental class           Correct
      Expected class      1     2     3     4       classification rate
            1            24     0     0     0            100.00%
            2             3   135     4     4             92.47%
            3             0    12   111    65             59.04%
            4             0    23    70    93             50.00%

      Average correct classification rate: 66.73%
      95% confidence interval: [62.77%, 70.69%]
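
The slide does not state how the 95% confidence interval was formed, but a normal-approximation (Wald) binomial interval over all 544 classifications reproduces both numbers, so that is what the sketch below assumes.

```python
import numpy as np

# Confusion matrix from the slide (rows: expected class, columns: experimental class).
conf = np.array([
    [24,   0,   0,  0],
    [ 3, 135,   4,  4],
    [ 0,  12, 111, 65],
    [ 0,  23,  70, 93],
])

n = conf.sum()                            # 544 classifications in total
p = np.trace(conf) / n                    # average correct classification rate
half = 1.96 * np.sqrt(p * (1 - p) / n)    # 95% half-width (normal approximation)

print(f"average rate: {100 * p:.2f}%")                                  # 66.73%
print(f"95% CI: [{100 * (p - half):.2f}%, {100 * (p + half):.2f}%]")    # [62.77%, 70.69%]
```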

  13. Experimental Results Summary
      [Block diagram: Signals → X-axis and Z-axis VFDTs → PNN / CNN]

                                Classification rate (%)      Average
      Signal     Classifier      1     2     3     4         rate (%)
      X          PNN           100    92    59    50            67
      Z          PNN            63    29    47    91            58
      X & Z      PNN           100    95    84    95            91
      X & Z      CNN            96    87    80    91            87

  14. SOFM Feature Extraction
      [Block diagram: Signals → X-axis and Z-axis VFDTs → SOFM → PNN / CNN]

                                    Average rate (%)
      Signal     Classifier      VFDT      VFDT + SOFM
      X          PNN              67            66
      Z          PNN              58            61
      X & Z      PNN              91            88
      X & Z      CNN              87            85

  15. Conclusions
      ● A system capable of classifying self-affine, stochastic, non-stationary, multivariate signals originating from non-linear processes was developed
      ● Feature extraction involving variance fractal dimensions and self-organizing feature maps was shown to be effective
      ● Probabilistic neural networks and complex domain neural networks were shown to be capable of performing the desired classification

  16. Acknowledgements
      ● Natural Sciences and Engineering Research Council (NSERC) of Canada
      ● University of Manitoba

  17. References
      [ChCa03] V. Cheung and K. Cannons, Signal Classification through Multifractal Analysis and Neural Networks. B.Sc. Thesis, Dept. of Electrical and Computer Engineering, University of Manitoba, Winnipeg, MB, 106 pp., 2003.
      [Kins94] W. Kinsner, "Batch and real-time computation of a fractal dimension based on variance of a time series," Technical Report DEL94-6, University of Manitoba, June 15, 1994, (v+17) 22 pp.
      [Koho84] T. Kohonen, Self-Organization and Associative Memory. Berlin: Springer-Verlag, 1984.
      [Mast94] T. Masters, Signal and Image Processing with Neural Networks: A C++ Sourcebook. New York, NY: John Wiley & Sons, 1994.
      [Spec88] D. F. Specht, "Probabilistic neural networks for classification, mapping, or associative memory," IEEE International Conference on Neural Networks, vol. 1, pp. 525-532, July 1988.
