Investigating the interplay between affective, phonatory and motoric subsystems in Autism Spectrum Disorder using an audiovisual dialogue agent
(3-minute introduction)
Hardik Kothare (Modality.AI, USA), Vikram Ramanarayanan (Modality.AI, USA), Oliver Roesler (Modality.AI, USA), Michael Neumann (Modality.AI, USA), Jackson Liscombe (Modality.AI, USA), William Burke (Modality.AI, USA), Andrew Cornish (Modality.AI, USA), Doug Habberstad (Modality.AI, USA), Alaa Sakallah (University of California at San Francisco, USA), Sara Markuson (University of California at San Francisco, USA), Seemran Kansara (University of California at San Francisco, USA), Afik Faerman (University of California at San Francisco, USA), Yasmine Bensidi-Slimane (University of California at San Francisco, USA), Laura Fry (University of California at San Francisco, USA), Saige Portera (University of California at San Francisco, USA), David Suendermann-Oeft (Modality.AI, USA), David Pautler (Modality.AI, USA), Carly Demopoulos (University of California at San Francisco, USA)
We explore the utility of an on-demand multimodal conversational platform for extracting speech and facial metrics in children with Autism Spectrum Disorder (ASD). We investigate the extent to which these metrics correlate with objective clinical measures, particularly as they pertain to the interplay between the affective, phonatory and motoric subsystems. Twenty-two participants diagnosed with ASD engaged with a virtual agent in conversational affect production tasks designed to elicit facial and vocal affect. We found significant correlations between vocal pitch and loudness, as extracted by our platform during these tasks, and accuracy in recognition of facial and vocal affect, assessed via the Diagnostic Analysis of Nonverbal Accuracy-2 (DANVA-2) neuropsychological task. We also found significant correlations between jaw kinematic metrics extracted by our platform and motor speed of the dominant hand assessed via a standardised neuropsychological finger tapping task. These findings offer preliminary evidence for the usefulness of these audiovisual analytic metrics and could help us better model the interplay between different physiological subsystems in individuals with ASD.
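As a rough illustration only (not the paper's actual analysis pipeline), associations of this kind between platform-derived metrics and clinical scores can be screened with a rank correlation such as Spearman's, which makes few distributional assumptions and is reasonable for a small sample (n = 22). The file name and column names below are hypothetical placeholders.

```python
# Illustrative sketch only: the file path, column names and the choice of
# Spearman correlation are assumptions, not the authors' reported method.
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical per-participant table: one row per participant, with
# platform-derived speech/facial metrics and clinical measures.
df = pd.read_csv("participant_metrics.csv")

metric_cols = ["mean_pitch_hz", "mean_loudness_db", "jaw_speed_mm_s"]   # assumed metric names
clinical_cols = ["danva2_accuracy", "finger_tapping_dominant_hand"]     # assumed clinical measures

# Rank correlation avoids assuming normally distributed metrics.
for m in metric_cols:
    for c in clinical_cols:
        rho, p = spearmanr(df[m], df[c], nan_policy="omit")
        print(f"{m} vs {c}: rho = {rho:.2f}, p = {p:.3f}")
```

With many metric-by-measure pairs, a multiple-comparison correction (e.g. false discovery rate) would typically accompany such a screen.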