Document Type
Article
Publication Date
8-2022
Publisher
Elsevier
Source Publication
Heliyon
Source ISSN
2405-8440
Original Item ID
DOI: 10.1016/j.heliyon.2022.e10240
Abstract
The wide use of motor imagery as a paradigm for brain-computer interfacing (BCI) points to its characteristic ability to generate discriminatory signals for communication and control. In recent times, deep learning techniques have increasingly been explored in motor imagery decoding. While deep learning techniques are promising, a major challenge limiting their wide adoption is the amount of data available for decoding. To combat this challenge, data augmentation can be performed to enhance decoding performance. In this study, we performed data augmentation by synthesizing motor imagery (MI) electroencephalography (EEG) trials, following six approaches. Data generated using these methods were evaluated on four criteria: the accuracy of prediction, the Fréchet inception distance (FID), t-distributed Stochastic Neighbour Embedding (t-SNE) plots, and topographic head plots. We show, on this basis, that the synthesized data exhibit characteristics similar to real data, yielding gains of up to 3% and 12% in mean accuracy across two public datasets. Finally, we believe these approaches should be adopted when applying deep learning techniques, as they have the potential not only to improve prediction performance but also to reduce the time spent on subject data collection.
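One of the evaluation criteria named above, the t-SNE plot, projects real and synthesized trials into two dimensions so their distributions can be compared visually. A minimal sketch of that comparison is shown below; the arrays here are random stand-ins for flattened MI-EEG trial features (the trial counts, feature dimension, and distributions are illustrative assumptions, not the paper's data or method).

```python
# Hedged sketch: t-SNE comparison of real vs. synthesized trials.
# The data below are random placeholders, NOT real EEG recordings.
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)

# Stand-ins for flattened trial features: 30 "real" and 30 "synthetic"
# trials with 64 features each (all sizes chosen only for illustration).
real = rng.normal(loc=0.0, scale=1.0, size=(30, 64))
synthetic = rng.normal(loc=0.1, scale=1.0, size=(30, 64))

X = np.vstack([real, synthetic])
labels = np.array([0] * 30 + [1] * 30)  # 0 = real, 1 = synthetic

# Project all trials to 2-D; heavily overlapping clusters in the resulting
# scatter plot would suggest the two sets share similar characteristics.
emb = TSNE(n_components=2, perplexity=15, init="pca",
           random_state=0).fit_transform(X)
print(emb.shape)  # one 2-D point per trial
```

In practice the two groups in `emb` would be scattered with different colors (e.g. via matplotlib) and inspected for overlap, alongside the quantitative FID score.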
Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.
Recommended Citation
George, Olawunmi; Smith, Roger; Madiraju, Praveen; Yahyasoltani, Nasim; and Ahamed, Sheikh Iqbal, "Data Augmentation Strategies for EEG-Based Motor Imagery Decoding" (2022). Computer Science Faculty Research and Publications. 86.
https://epublications.marquette.edu/comp_fac/86
Comments
Published version. Heliyon, Vol. 8, No. 8 (August 2022). DOI. © The Authors. Used with permission.
This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).