Date of Award

Summer 2021

Document Type


Degree Name

Doctor of Philosophy (PhD)

Department

Computer Science

First Advisor

Ahamed, Sheikh

Second Advisor

Smith, Roger

Third Advisor

Madiraju, Praveen

Fourth Advisor

Yahyasoltani, Nasim

Abstract

Motor imagery (MI) is one of the most widely used paradigms for building brain-computer interfaces (BCIs), with broad application in neurorehabilitation, where it helps restore motor function to people with neurological impairments. Existing motor imagery decoding pipelines have largely relied on feature extraction techniques such as power spectral density (PSD) and common spatial patterns (CSP) before classification with traditional machine learning algorithms such as support vector machines (SVMs) and linear discriminant analysis (LDA). These algorithms are limited in their ability to generate rich feature representations for certain types of signals, constraining potential improvements in the decoding process. Moreover, the signal non-stationarity inherent in many neurological studies typically requires calibrating a classifier for each subject and each session of the study, which lengthens wait times and can render the BCI impractical to use. A further challenge with current decoding techniques is the difficulty of interpreting and verifying what the classifiers have learned. This work applies more recent deep learning-based techniques with data augmentation to the decoding process, uses transfer learning to handle the non-stationarity across subjects and sessions, and provides interpretability analyses to validate the model's learning. Using public motor imagery electroencephalography (EEG) datasets, we demonstrate that task-specific feature engineering is not needed with deep learning-based techniques and that transfer learning can handle the intra- and inter-subject non-stationarity of the signals. Furthermore, we provide interpretations using region contribution analysis, a framework based on occlusion sensitivity, to show that the model learnt task-relevant features.
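To make the occlusion-sensitivity idea behind region contribution analysis concrete, the sketch below zeroes out each electrode region in turn and measures the drop in a classifier's score; a larger drop suggests the region contributed more to the decision. This is a minimal illustration under our own assumptions, not the dissertation's actual implementation: the function names, the region grouping, and the toy stand-in "model" are all hypothetical.

```python
import numpy as np

def region_contributions(score_fn, trials, regions):
    """Occlusion sensitivity over channel regions.

    trials:  array of shape (n_trials, n_channels, n_samples)
    regions: dict mapping region name -> list of channel indices
    Returns the score drop observed when each region is occluded.
    """
    baseline = score_fn(trials)
    drops = {}
    for name, channels in regions.items():
        occluded = trials.copy()
        occluded[:, channels, :] = 0.0  # occlude (zero out) this region
        drops[name] = baseline - score_fn(occluded)
    return drops

# Toy score function standing in for a trained classifier's accuracy:
# mean signal power over two "central" channels (a deliberate simplification).
def toy_score(x):
    return float(np.mean(x[:, [2, 3], :] ** 2))

rng = np.random.default_rng(0)
eeg = rng.standard_normal((8, 6, 128))  # 8 trials, 6 channels, 128 samples
regions = {"frontal": [0, 1], "central": [2, 3], "parietal": [4, 5]}
print(region_contributions(toy_score, eeg, regions))
```

Because the toy score depends only on the central channels, occluding the central region produces the largest drop, mirroring how a real MI model should lose performance when task-relevant sensorimotor regions are masked.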