Autism and emotions: artificial intelligence reveals their neural encodings


A study shows that facial emotions are accurately encoded in the brains of people with autism spectrum disorder (ASD). Published in the journal Biological Psychiatry, the joint University of Trento and Stony Brook University study dismantles long-held beliefs about brain functioning in people with ASD and opens up new scenarios for improving their relational life. Using machine learning, the researchers created a representation of the neural patterns that each brain applies to decode emotions. Riccardi (University of Trento): "An interdisciplinary approach is essential."

Emotions are a universal language and can usually be recognized easily and naturally. This is not the case for people with Autism Spectrum Disorder (ASD), for whom even this simple activity can be severely limited. The reason for this difficulty has for years been the focus of scientific studies trying to shed light on how the brain works in individuals affected by these disorders. A study by the University of Trento and Stony Brook University of New York, recently published in pre-print form in the journal Biological Psychiatry: Cognitive Neuroscience and Neuroimaging, questions many of these beliefs and opens up new scenarios for improving the living conditions and social relationships of people with ASD.

Reading facial expressions and decoding emotions is indeed difficult for those with autism spectrum disorders. But the reason lies not in the brain's ability to encode the neural signals, as has long been thought, but rather in problems translating that information. This problem is currently exacerbated by the containment measures of the pandemic. "The constant use of protective masks," explains Matthew D. Lerner, co-author of the study and professor of Psychology, Psychiatry and Pediatrics at Stony Brook University, "limits the expressiveness of the face, and this leaves less information available about our emotions. This is why it is important to understand how, when and for whom comprehension difficulties arise, and what mechanisms underlie the misunderstanding."

The study's conclusions are the result of a long analytical effort that used machine learning techniques, and they could be useful for rethinking the approach with which people with ASD are helped to read the emotions of others. "At the moment there is a tendency to use prosthetic aids for emotion recognition that support the visual perception of biological movement. Our results suggest that we should instead focus on how to help the brain transmit an intact encoding of the message that conveys the correctly perceived emotion."

Reading emotions with machine learning

The study was conducted jointly by a group of researchers from Stony Brook University in New York and the University of Trento (Department of Information Engineering and Computer Science) on 192 people of different ages, with and without autism spectrum disorders. Their neural signals were recorded while they viewed many facial emotions and were subsequently analyzed. To do this, the research team employed a new facial-emotion classification system that leverages machine learning, based on deep convolutional neural networks. This approach uses an algorithm that analyzes and classifies the activity of the brain, recorded by electroencephalography (EEG), while participants observe faces. The result is a very accurate map of the neural patterns that each person's brain applies to decode emotions.

"Technologies derived from machine learning are generally considered an engine of innovation in processes and products across all industrial sectors," comments Giuseppe Riccardi, co-author of the study and professor of Information Processing Systems at the University of Trento (Department of Information Engineering and Computer Science). "And that is evident in this case as well. Machine learning techniques can help us interpret brain signals in the context of emotions. First of all, they can be decisive in supporting the early stages of basic scientific research. But they can also be used directly for clinical interventions. The study we conducted shows how strong an integration of interdisciplinary skills is necessary for artificial intelligence to have a measurable impact on people's lives."
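To make the pipeline described above concrete, here is a minimal, purely illustrative sketch in Python: multichannel EEG epochs pass through convolutional filters and are mapped to emotion-class probabilities. The channel count, epoch length, filter sizes and emotion labels are all assumptions chosen for illustration; the study's actual architecture, preprocessing and training procedure are not reproduced here, and the weights below are random rather than learned.

```python
import numpy as np

# Illustrative sketch only: EEG epoch -> convolutional features -> emotion
# probabilities. All shapes and labels are assumptions, not the study's model.
rng = np.random.default_rng(0)

N_CHANNELS, N_SAMPLES = 32, 256                   # hypothetical montage / epoch length
EMOTIONS = ["happy", "sad", "angry", "fearful"]   # hypothetical label set

def conv1d(x, w):
    """Valid 1-D convolution of each filter in w over multichannel signal x."""
    n_filters, _, k = w.shape
    out_len = x.shape[1] - k + 1
    out = np.zeros((n_filters, out_len))
    for f in range(n_filters):
        for t in range(out_len):
            out[f, t] = np.sum(x[:, t:t + k] * w[f])
    return np.maximum(out, 0.0)                   # ReLU nonlinearity

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def classify_epoch(epoch, conv_w, fc_w):
    feats = conv1d(epoch, conv_w).mean(axis=1)    # global average pooling
    return softmax(fc_w @ feats)                  # class probabilities

# Untrained random weights, just to show the data flow end to end.
conv_w = rng.normal(scale=0.1, size=(8, N_CHANNELS, 9))   # 8 temporal filters
fc_w = rng.normal(scale=0.1, size=(len(EMOTIONS), 8))
epoch = rng.normal(size=(N_CHANNELS, N_SAMPLES))          # one simulated EEG epoch

probs = classify_epoch(epoch, conv_w, fc_w)
print(dict(zip(EMOTIONS, probs.round(3))))
```

In the actual study, a model of this general family would be trained on the recorded EEG epochs so that the learned filters pick out the temporal patterns that distinguish the displayed emotions.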






Mayor Torres, J.M., Clarkson, T., Hauschild, K., Luhmann, C.C., Lerner, M.D., Riccardi, G., "Facial emotions are accurately encoded in the brains of those with autism: A deep learning approach", Biological Psychiatry: Cognitive Neuroscience and Neuroimaging, 2021.

Mayor Torres, J.M., Ravanelli, M., Medina-DeVilliers, S., Lerner, M.D., Riccardi, G., "Interpretable SincNet-based Deep Learning for Emotion Recognition in Individuals with Autism", Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), 2021.

Mayor Torres, J.M., Clarkson, T., Luhmann, C.C., Riccardi, G., Lerner, M.D., "Distinct but Effective Neural Networks for Facial Emotion Recognition in Individuals with Autism: A Deep Learning Approach", Annual Meeting of the International Society for Autism Research, Montreal, 2019.

Mayor Torres, J.M., Libsack, E.J., Clarkson, T., Keifer, C.M., Riccardi, G., Lerner, M.D., "EEG-based Single-trial Classification Emotion Recognition: A Comparative Analysis in Individuals with and without Autism Spectrum Disorder", Annual Meeting of the International Society for Autism Research, Rotterdam, 2018.