Researchers have shown that combining infrared spectroscopy with artificial intelligence (AI) could potentially improve myasthenia gravis (MG) diagnosis by reducing diagnostic delays and helping connect patients with the right treatment. If integrated into clinical practice, the approach could fill a need for better diagnostic options for MG, explained the researchers, whose findings were published in Scientific Reports.1
Current approaches to diagnosis are stymied by low accuracy, false positives, low sensitivity and specificity, and issues with availability and reliability.2
“Despite evaluations like the edrophonium test, ice pack test, serum autoantibody testing, electrophysiological assessments including repetitive nerve stimulation and single fiber electromyography (SFEMG), as well as computed tomography (CT) or magnetic resonance imaging (MRI) of the thymus, MG could still be difficult to differentiate from other clinical disorders,” explained the researchers.1 “The fluctuations in symptoms and the subtleness of clinical findings make diagnosing MG difficult. While 13% of the patients experience a delay of more than 5 years in diagnosing the disease, 26% are given nonspecific diagnoses. Particularly among the elderly, the symptoms such as dysphagia, muscle fatigue, slurred speech, and ptosis (droopy eyelid) might be considered age-related and overlooked.”
By coupling attenuated total reflectance-Fourier transform infrared (ATR-FTIR) spectroscopy with machine learning, the researchers say MG diagnosis could be made more rapid, sensitive, and inexpensive. FTIR spectroscopy, which uses various fingerprint spectral windows across a range of variables potentially related to MG onset, has been used in other disease states, including bladder cancer and multiple sclerosis. The serum-based test is less invasive and allows simultaneous acquisition of information on the composition, concentration, and structure of macromolecules. ATR is the most commonly used sampling method for FTIR spectroscopy.3
Using blood samples from 24 patients with newly diagnosed, treatment-naïve MG and 42 sex-matched healthy controls, the researchers applied FTIR spectroscopy together with multivariate analysis methods, including principal component analysis, support vector machine, discriminant analysis, and a neural network classifier, diagnosing MG with 100% accuracy, sensitivity, and specificity.1
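To illustrate the kind of analysis described above, the sketch below shows a dimensionality-reduction-plus-classifier pipeline (PCA followed by a support vector machine) applied to spectra. This is a hypothetical example using synthetic stand-in data, not the study's actual ATR-FTIR measurements or code; the group sizes (42 controls, 24 patients) mirror the article, but the simulated "band change" is invented for demonstration.

```python
# Hypothetical sketch: PCA + SVM classification of spectra,
# in the spirit of the multivariate methods the study names.
# Data are synthetic stand-ins, not real ATR-FTIR spectra.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_wavenumbers = 400  # absorbance values across a fingerprint region

# Synthetic "spectra": healthy controls vs. a group with a shifted band
controls = rng.normal(1.0, 0.05, size=(42, n_wavenumbers))
patients = rng.normal(1.0, 0.05, size=(24, n_wavenumbers))
patients[:, 150:170] += 0.3  # simulated disease-related band change

X = np.vstack([controls, patients])
y = np.array([0] * 42 + [1] * 24)  # 0 = healthy control, 1 = MG

# Reduce the high-dimensional spectra to 10 principal components,
# then classify with a linear-kernel SVM
model = make_pipeline(PCA(n_components=10), SVC(kernel="linear"))
scores = cross_val_score(model, X, y, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```

Cross-validation, as used here, matters for claims like the reported 100% accuracy: it estimates how the classifier performs on spectra it has not seen, rather than on the training set alone.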
Using spectroscopy on the same samples, the researchers identified potential biomarkers for the disease, finding a significant decrease in the amide A band (P < .0001), a significant decrease in the unsaturation index (P < .0001), significant increases in saturated lipid (P < .001) and protein (P < .0001) concentrations, and a significant increase in DNA concentration (P < .0001).
The researchers also observed a significant decrease (P < .0001) in the RNA/DNA ratio, suggesting an alteration in nucleic acid metabolism in MG and that DNA is more affected than RNA in the disease, as the increase in DNA concentration was more pronounced than that in RNA concentration.
Other identified biomarkers included the protein phosphorylation parameter, protein structural changes, the PO2− antisymmetrical (antisym.) + symmetrical (sym.) stretching/protein and PO2− sym. stretching/total lipid ratios, the amide I protein band’s position and bandwidth, and the lipid dynamics parameter.
“Moreover, these potential biomarker molecules and/or metabolites are valuable messengers of early disease-related events,” described the researchers. “Principal Component Analysis (PCA), Support Vector Machine (SVM), discriminant analysis and Neural Network Classifier were applied to the infrared spectra for rapid MG diagnosis which revealed very successful discrimination of the MG disease group from the healthy control group.”
References: