Sparse Representation-Based Classification (SRC): A Novel Method to Improve the Performance of Convolutional Neural Networks in Detecting P300 Signals

Document Type: Article

Abstract

Introduction: A Brain-Computer Interface (BCI) offers a non-muscular pathway between the human brain and the outside world, improving quality of life for disabled people. The P300 signal plays an important role in BCI applications; therefore, distinguishing P300 from non-P300 components in the EEG signal (i.e., P300 detection) is a vital problem in BCI applications. Recently, Convolutional Neural Networks (CNNs) have been widely applied to P300 detection in the field of BCIs. However, the P300 signal has a low Signal-to-Noise Ratio (SNR), and the CNN detection rate is highly sensitive to SNR; consequently, the CNN detection rate drops dramatically when it faces P300 data. In this study, a novel structure is proposed to improve the performance of CNNs in P300 signal detection by improving their robustness to low-SNR signals.

Methods: In the proposed structure, Sparse Representation-based Classification (SRC) was used as the first substructure. This block is responsible for predicting the expected P300 signal among artifacts and noise. The second substructure, a CNN trained with the Adadelta algorithm, performed the P300 classification (an illustrative sketch of this pipeline follows the abstract). Thanks to this SNR-improvement scheme, the proposed structure is able to increase the accuracy of P300 signal detection.

Results: To evaluate the performance of the proposed structure, we applied it to the EPFL dataset for P300 detection, and the achieved results were compared with those obtained from the basic CNN structure. The comparisons revealed the superiority of the proposed structure over its alternative: its True Positive Rate (TPR) improved by about 19.66%, while the improvements in false detections and accuracy were 1.93% and 10.46%, respectively, which shows the effectiveness of the proposed structure in detecting P300 signals.

Conclusion: The better accuracy of the proposed algorithm compared to the basic CNN, together with its greater robustness, showed that Sparse Representation-based Classification (SRC) has considerable potential as an enhancement to CNN-based P300 detection.

Keywords: EEG, Neural Networks, Signal Detection, Machine Learning, Brain-Computer Interfaces, Brain, Neuroscience, P300, Convolutional Neural Networks, Deep Learning
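The following is a minimal sketch of the two-stage idea described in the Methods: a sparse-representation front end that reconstructs the expected evoked response from a noisy epoch, followed by a small CNN trained with Adadelta. The epoch dimensions, dictionary construction, network layout, and all hyper-parameters below are assumptions made for illustration only; they are not taken from the paper.

```python
# Illustrative two-stage pipeline: SRC-style sparse reconstruction (stage 1)
# followed by a small CNN classifier trained with Adadelta (stage 2).
# All shapes and hyper-parameters are assumed, not from the original study.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit
import tensorflow as tf

N_CHANNELS, N_SAMPLES = 32, 256          # assumed EEG epoch shape


def src_reconstruct(epoch, dictionary, n_nonzero=8):
    """Stage 1: approximate a noisy epoch as a sparse combination of
    dictionary atoms (e.g. averaged P300 / non-P300 templates). The
    reconstruction acts as the SNR-improvement step placed before the CNN."""
    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero,
                                    fit_intercept=False)
    omp.fit(dictionary, epoch.ravel())
    return (dictionary @ omp.coef_).reshape(epoch.shape)


def build_cnn(input_shape=(N_CHANNELS, N_SAMPLES, 1)):
    """Stage 2: a compact CNN (spatial filter, then temporal convolution),
    compiled with the Adadelta optimizer as stated in the abstract."""
    model = tf.keras.Sequential([
        tf.keras.Input(shape=input_shape),
        tf.keras.layers.Conv2D(10, (N_CHANNELS, 1), activation="tanh"),
        tf.keras.layers.Conv2D(50, (1, 13), strides=(1, 13), activation="tanh"),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(100, activation="sigmoid"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer=tf.keras.optimizers.Adadelta(),
                  loss="binary_crossentropy", metrics=["accuracy"])
    return model


# Usage with random stand-in data (real experiments would use EPFL epochs).
rng = np.random.default_rng(0)
dictionary = rng.standard_normal((N_CHANNELS * N_SAMPLES, 40))   # 40 atoms
eeg_epochs = rng.standard_normal((16, N_CHANNELS, N_SAMPLES))
labels = rng.integers(0, 2, size=16).astype("float32")

denoised = np.stack([src_reconstruct(e, dictionary) for e in eeg_epochs])
cnn = build_cnn()
cnn.fit(denoised[..., None], labels, epochs=1, batch_size=4, verbose=0)
```

In this sketch the sparse code is used only to rebuild a cleaner epoch before classification; the hypothetical dictionary of 40 random atoms stands in for whatever template set the authors actually learned.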
