Atypical neural connectivity has been linked to autism spectrum disorder (ASD), yet connectivity has been difficult to assess empirically. Recent findings from network theory and time-series analysis suggest that electroencephalography (EEG) can characterize neural network architecture, an index of brain activity. This systematic review evaluates functional connectivity and spectral power derived from EEG signals. EEG records an individual's brain activity as waveforms that reflect the electrical impulses through which brain cells communicate, and it is used to diagnose various brain disorders, including epilepsy and related seizure disorders, brain dysfunction, tumors, and injury. We identified 21 studies using the two most common EEG analysis methods: functional connectivity and spectral power. All selected papers reported significant differences between individuals with and without ASD. However, high heterogeneity in the outcomes precludes generalization, and no single method currently serves as a diagnostic tool. For delineating ASD subtypes, the lack of research prevented evaluation of these techniques for diagnostic use. These findings confirm the presence of EEG abnormalities in ASD, but they remain insufficient for diagnosis. Our review suggests that EEG may aid ASD diagnosis by evaluating entropy in brain signals. More extensive studies with larger samples and more rigorous designs may enable researchers to develop new diagnostic methods for ASD that focus on particular stimuli and brainwaves.
Textual comprehension is often not adequately acquired despite intense didactic efforts, and its quality is mostly evaluated using subjective criteria. Starting from the assumption that word usage statistics can be used to infer the probability of successful semantic representation, we hypothesized that textual comprehension depends on words with high occurrence probability (a high degree of familiarity), which is typically inversely proportional to their information entropy. We tested this hypothesis by quantifying word occurrences in a bank of words drawn from Portuguese-language academic theses and using information theory tools to infer degrees of textual familiarity. We found that the lower and upper bounds of the database were delimited by low-entropy words with the highest probabilities of causing incomprehension (i.e., nouns and adjectives) or of facilitating semantic decoding (i.e., prepositions and conjunctions). We developed an openly available software suite called CalcuLetra that implements these algorithms and tested it on publicly available denotative text samples (e.g., articles, essays, and abstracts). We propose that the quantitative model presented here may apply to other languages and could support automated evaluation of textual comprehension, potentially assisting the development of teaching materials or the diagnosis of learning disorders.
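The link between a word's occurrence probability and its information content can be illustrated with a minimal sketch. The snippet below is not CalcuLetra's implementation; it simply estimates each word's self-information, −log₂ p, from relative frequencies in a toy corpus (the corpus words and function names are illustrative assumptions), showing that frequent function words carry fewer bits than rare content words:

```python
import math
from collections import Counter

def word_surprisal(corpus_words):
    """Estimate each word's self-information (-log2 of its relative
    frequency). Frequent words yield low surprisal (high familiarity);
    rare words yield high surprisal."""
    counts = Counter(corpus_words)
    total = sum(counts.values())
    return {w: -math.log2(c / total) for w, c in counts.items()}

# Toy corpus: the preposition "de" occurs far more often than the
# content word "entropia", mimicking real word-frequency skew.
corpus = ["de"] * 8 + ["a"] * 4 + ["entropia"] * 2 + ["semantica"] * 2
surprisal = word_surprisal(corpus)
# "de" (p = 8/16) carries 1 bit; "entropia" (p = 2/16) carries 3 bits.
```

Under this measure, a text dominated by high-surprisal words would be predicted to be harder to decode semantically, which is the intuition the quantitative model formalizes.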