Classification of Autism Spectrum Disorder Using Supervised Learning of Brain Connectivity Measures Extracted from Synchrostates

Objective. The paper investigates the detection of autism using functional brain connectivity measures derived from the electroencephalogram (EEG) of children during face-perception tasks. Approach. Phase-synchronized patterns are obtained from 128-channel EEG signals of typical children and children with autism spectrum disorder (ASD). These phase-synchronized states, or synchrostates, temporally switch among themselves as an underlying process during the completion of a particular cognitive task. We used 12 subjects in each group (ASD and typical) and analyzed their EEG while they processed fearful, happy, and neutral faces. The minimally and maximally occurring synchrostates for each subject are chosen for the extraction of brain connectivity features, which are then used to classify between the two groups. Among various supervised learning techniques, we explored discriminant analysis and the support vector machine, both with polynomial kernels, for the classification task. Main results. Leave-one-out cross-validation of the classification algorithm gives 94.7% accuracy as the best performance, with corresponding sensitivity and specificity values of 85.7% and 100%, respectively. Significance. The proposed method gives high classification accuracy and outperforms other contemporary research results. Its effectiveness in classifying autistic and typical children suggests the possibility of applying it to a larger population to validate it for clinical practice.
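The synchrostate idea described above can be sketched in a few lines of Python. This is an illustrative reconstruction, not the authors' code: the function name `extract_synchrostates`, the fixed number of clusters, and the choice of analysis band are all assumptions made for the example.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from sklearn.cluster import KMeans

def extract_synchrostates(eeg, fs, band=(30.0, 45.0), n_states=3):
    """Cluster time-varying phase-difference topographies into synchrostates.

    eeg : (n_channels, n_samples) array of EEG samples.
    Hypothetical sketch: the real pipeline determines the number of
    stable states from the data rather than fixing ``n_states``.
    """
    # Band-pass filter to the analysis band (gamma here, as an assumption).
    nyq = fs / 2.0
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    filtered = filtfilt(b, a, eeg, axis=1)

    # Instantaneous phase of each channel via the Hilbert transform.
    phase = np.angle(hilbert(filtered, axis=1))  # (C, T)

    # Pairwise phase differences at every sample; cosine removes the
    # 2*pi wrap ambiguity before clustering.
    diff = phase[:, None, :] - phase[None, :, :]   # (C, C, T)
    iu = np.triu_indices(eeg.shape[0], k=1)
    features = np.cos(diff[iu]).T                  # (T, C*(C-1)/2)

    # k-means over time points: each cluster is one candidate synchrostate,
    # and the label sequence shows how the states switch over time.
    km = KMeans(n_clusters=n_states, n_init=10, random_state=0).fit(features)
    return km.labels_, km.cluster_centers_
```

The label sequence returned here is what lets one count state occurrences per subject, which is how the minimally and maximally occurring synchrostates would be identified.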


💡 Research Summary

The study proposes a novel EEG‑based framework for distinguishing children with autism spectrum disorder (ASD) from typically developing (TD) peers by exploiting brain connectivity patterns derived from "synchrostates." Twenty‑four participants (12 ASD, 12 TD) performed a face‑perception task involving fearful, happy, and neutral expressions while their neural activity was recorded with a 128‑channel EEG system.

After band‑pass filtering to the high‑gamma range (30–45 Hz), the authors computed instantaneous phase relationships across channels and applied clustering to identify recurrent phase‑synchronization configurations, termed synchrostates. For each subject the least frequent and the most frequent synchrostate were selected, and functional connectivity matrices were constructed from the phase‑difference values at those moments.

Graph‑theoretic metrics (node strength, clustering coefficient, global efficiency) together with channel‑wise power spectral features formed a high‑dimensional feature vector. No dimensionality reduction was performed; the vectors were fed directly into two supervised classifiers: Linear Discriminant Analysis (LDA) and a Support Vector Machine (SVM) with a polynomial kernel.

Model performance was assessed using leave‑one‑out cross‑validation. The SVM achieved the best results: 94.7% overall accuracy, 85.7% sensitivity, and 100% specificity, surpassing previously reported EEG‑based ASD classification rates (typically 70–85%). The authors argue that synchrostate‑based connectivity captures richer spatio‑temporal information than conventional power‑only approaches, thereby improving discriminative power.

Limitations include the modest sample size, the manual selection of synchrostates, and the lack of separate analysis for each emotional condition. Future work is suggested to involve larger, more diverse cohorts, automated synchrostate detection, and integration with real‑time brain‑computer interfaces to move toward clinically viable early‑diagnostic tools.
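As a sketch of the final classification stage, the snippet below derives simple graph measures from a weighted connectivity matrix and scores a polynomial‑kernel SVM with leave‑one‑out cross‑validation using scikit‑learn. The feature set is a simplified stand‑in for the node‑strength, clustering‑coefficient, and efficiency features named in the summary, and the function names are illustrative.

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def connectivity_features(adj):
    """Simple graph measures from a symmetric (C, C) connectivity matrix.

    Node strength plus a mean-inverse-weight term used here as a crude
    proxy for global efficiency (true global efficiency averages inverse
    shortest-path lengths).
    """
    strength = adj.sum(axis=1)
    iu = np.triu_indices_from(adj, k=1)
    w = adj[iu]
    eff_proxy = np.mean(1.0 / w[w > 0]) if np.any(w > 0) else 0.0
    return np.concatenate([strength, [eff_proxy]])

def classify_loo(X, y, degree=3):
    """Polynomial-kernel SVM evaluated with leave-one-out cross-validation.

    Standardization is fit inside each fold via the pipeline, so no
    information leaks from the held-out subject.
    """
    clf = make_pipeline(StandardScaler(), SVC(kernel="poly", degree=degree))
    scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
    return scores.mean()
```

With 24 subjects, leave‑one‑out yields 24 folds; the mean fold score is the overall accuracy figure comparable to the 94.7% reported in the paper.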