Neurodevelopmental disorders (NDD) such as autism spectrum disorder (ASD) and intellectual disability (ID) affect up to 15% of the population and produce disabilities that last a person's whole life. In many cases, people with NDD are unable to communicate about their subjective state, their feelings, or their wishes, either because they are non-verbal or because their communication is non-functional. Impaired communication greatly hinders the ability of families, caretakers, educators, and clinicians to help these patients.
Wearable technology has gained relevance in health and wellness because of its potential to continuously monitor physiological parameters, and such devices are increasingly used to collect information that could facilitate and improve therapy. In parallel, autocomplete methods, the technology phones use to finish a user's sentences, have advanced rapidly: in their more sophisticated implementations, models such as OpenAI’s GPT-3 or DALL-E can produce, starting from a short prompt, full newspaper articles or whole images that are difficult to distinguish from those produced by a creative human. I propose to explore the utility of these artificial intelligence methods, combined with physiological data from wearables such as heart rate variability (HRV), body temperature, accelerometry, and respiration rate, to help people with NDD communicate about themselves, in a sense autocompleting the sentences that they are unable to produce.
My working hypothesis is that, although the communication skills of a person with NDD may be atypical, the physiological signatures associated with different states and feelings are comparable to those of neurotypical individuals. I propose to use dense, multimodal data captured by wearable devices from both neurotypical and neuroatypical subjects to prompt an artificial intelligence into completing the reports about feelings and wishes that a person with NDD would have produced were they able to communicate.
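The proposed pipeline, from wearable signals to a generative "autocomplete" of a subjective report, can be sketched as follows. This is a minimal illustrative sketch, not the project's implementation: the threshold values, state labels, and field names (`hrv_rmssd_ms`, `resp_rate_bpm`, etc.) are hypothetical placeholders, and the final prompt would be handed to a generative language model rather than printed.

```python
# Hypothetical sketch: summarize wearable signals into a text prompt that a
# generative language model could complete into a first-person report.
# All thresholds and labels below are illustrative, not validated clinical values.

from dataclasses import dataclass


@dataclass
class WearableSample:
    hrv_rmssd_ms: float       # heart rate variability (RMSSD, ms)
    skin_temp_c: float        # body temperature (degrees C)
    accel_magnitude_g: float  # mean accelerometer magnitude (g)
    resp_rate_bpm: float      # respiration rate (breaths/min)


def describe_state(s: WearableSample) -> str:
    """Map raw signals to coarse state labels (illustrative thresholds)."""
    arousal = ("elevated arousal"
               if s.hrv_rmssd_ms < 20 or s.resp_rate_bpm > 22
               else "calm")
    activity = "moving" if s.accel_magnitude_g > 1.2 else "at rest"
    return f"{arousal}, {activity}"


def build_prompt(s: WearableSample) -> str:
    """Turn the state summary into a prompt for a generative model."""
    return (f"Physiological summary: {describe_state(s)}. "
            "Complete the sentence as a first-person report: 'Right now I feel")


sample = WearableSample(hrv_rmssd_ms=15.0, skin_temp_c=37.1,
                        accel_magnitude_g=0.9, resp_rate_bpm=25.0)
print(build_prompt(sample))
```

In this sketch, low HRV and a high respiration rate map to "elevated arousal", and the resulting prompt invites the model to finish the sentence the person could not produce; in the actual project, the mapping from signals to prompts would be learned from data collected on both neurotypical and neuroatypical subjects.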