Sci Rep. 2025 Apr 3;15(1):11379. doi: 10.1038/s41598-025-96052-0.
ABSTRACT
Classical diagnostic approaches frequently rely on self-reported symptoms or clinician observations, which makes mental health disorders difficult to assess because of their subjective and complex nature. In this work, we propose a methodology for predicting disorders such as epilepsy, sleep disorders, bipolar disorder, eating disorders, and depression using a multimodal deep learning framework built on neurocardiac data fusion. The proposed framework combines MEG, EEG, and ECG signals to form a more comprehensive picture of brain and cardiac function in individuals with mental disorders. The multimodal deep learning approach uses an integrated CNN-Bi-Transformer, CardioNeuroFusionNet, which processes multiple input types simultaneously, allowing the fusion of modalities and improving the quality of the predictive representation. The framework was tested on data from the Deep BCI Scalp Database and further validated on the Kymata Atlas dataset to assess its generalizability. The model achieved promising results, with high accuracy (98.54%) and sensitivity (97.77%) in predicting mental disorders, including neurological and psychiatric conditions. Neurocardiac data fusion was found to provide additional insight into the relationship between brain and cardiac function in neurological conditions, which could lead to more accurate diagnosis and personalized treatment options. The proposed method addresses the shortcomings of earlier studies, which tended to focus on single-modality data, lacked thorough neurocardiac data fusion, and relied on less advanced machine learning algorithms. Comprehensive experimental results, showing an average accuracy improvement of 2.72%, demonstrate that the proposed approach outperforms other state-of-the-art AI techniques and generalizes effectively across diverse datasets.
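The fusion pipeline the abstract describes (per-modality CNN-style encoders whose feature tokens are combined by an attention-based Transformer stage before classification) can be sketched roughly as below. This is a minimal structural illustration only: the toy convolutional encoder, single attention head, filter counts, and five-class head are stand-in assumptions, not the authors' CardioNeuroFusionNet.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(signal, n_filters=4, kernel_size=7):
    """Toy CNN encoder: random 1-D conv filters -> ReLU -> global average pooling."""
    kernels = rng.standard_normal((n_filters, kernel_size)) * 0.1
    feats = []
    for k in kernels:
        conv = np.convolve(signal, k, mode="valid")
        feats.append(np.maximum(conv, 0.0).mean())  # ReLU then pool over time
    return np.array(feats)  # shape: (n_filters,)

def self_attention(tokens):
    """Single-head scaled dot-product attention over modality tokens (Q=K=V)."""
    d = tokens.shape[1]
    scores = tokens @ tokens.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ tokens

# Synthetic single-trial signals for each modality (lengths are arbitrary).
meg = rng.standard_normal(256)
eeg = rng.standard_normal(256)
ecg = rng.standard_normal(512)

# One feature token per modality, fused by attention, pooled, then classified
# into a hypothetical set of 5 diagnostic classes.
tokens = np.stack([encode(meg), encode(eeg), encode(ecg)])  # (3, 4)
fused = self_attention(tokens).mean(axis=0)                 # (4,)
logits = rng.standard_normal((5, fused.shape[0])) @ fused
probs = np.exp(logits - logits.max())
probs /= probs.sum()
```

Note the design point the abstract emphasizes: fusion happens at the feature level, so the attention stage can weigh cardiac (ECG) evidence against neural (MEG/EEG) evidence per sample rather than averaging fixed per-modality predictions.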
PMID:40181122 | DOI:10.1038/s41598-025-96052-0