Entrainment Analysis for Assessment of Autistic Speech Prosody Using Bottleneck Features of Deep Neural Network

Citation Author(s):
Keiko Ochi, Nobutaka Ono, Keiho Owada, Miho Kuroda, Shigeki Sagayama, Hidenori Yamasue
Submitted by:
Keiko Ochi
Last updated:
13 May 2022 - 1:58am
Document Type:
Poster
Document Year:
2022
Event:
Presenters:
Keiko Ochi
Paper Code:
3110
 

In the present study, we quantify entrainment characteristics of conversation with the aim of automatically assessing the severity of autism spectrum disorder (ASD). We focus on pairs of utterances immediately before and after turn-taking, which exhibit prosodic/acoustic similarities.
The clinical severity of ASD is estimated from bottleneck features extracted by the hourglass-shaped deep neural network (DNN) of the neural entrainment distance (NED) method, which measures the degree of entrainment. The DNN is first pre-trained on a large corpus of conversations in various daily situations and then fine-tuned on conversations recorded during the Autism Diagnostic Observation Schedule (ADOS) assessment. Absolute difference vectors are computed from the bottleneck feature vectors of each utterance pair. The centroid and variance of these absolute difference vectors are combined with the speech features identified in our previous study to estimate ASD severity scores.
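The following is a minimal sketch, not the authors' code, of how bottleneck features from an hourglass-shaped DNN and the per-pair statistics described above could be computed. The layer sizes, the reconstruction-style training head, the feature dimension, and all function and variable names are illustrative assumptions; the actual training objective and architecture may differ.

```python
# Hedged sketch: hourglass-shaped DNN with a narrow bottleneck layer,
# plus centroid/variance of absolute difference vectors over utterance pairs.
import torch
import torch.nn as nn

class HourglassDNN(nn.Module):
    """Hourglass (encoder-bottleneck-decoder) network; sizes are assumptions."""
    def __init__(self, in_dim=40, bottleneck_dim=8):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, 64), nn.ReLU(),
            nn.Linear(64, bottleneck_dim),           # bottleneck layer
        )
        self.decoder = nn.Sequential(
            nn.Linear(bottleneck_dim, 64), nn.ReLU(),
            nn.Linear(64, 256), nn.ReLU(),
            nn.Linear(256, in_dim),
        )

    def forward(self, x):
        z = self.encoder(x)                          # bottleneck features
        return self.decoder(z), z

def pair_statistics(model, pre_turn, post_turn):
    """Centroid and variance of |z_pre - z_post| over utterance pairs.

    pre_turn, post_turn: [n_pairs, in_dim] tensors of prosodic/acoustic
    feature vectors for the utterances before and after each turn-taking.
    """
    with torch.no_grad():
        _, z_pre = model(pre_turn)
        _, z_post = model(post_turn)
    diff = (z_pre - z_post).abs()                    # absolute difference vectors
    centroid = diff.mean(dim=0)                      # per-dimension centroid
    variance = diff.var(dim=0)                       # per-dimension variance
    return torch.cat([centroid, variance])           # input to the score regressor
```

In this sketch, pre-training on a large daily-conversation corpus and fine-tuning on ADOS recordings would both use the same network; only the training data changes between the two stages.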
Consequently, the estimated scores correlate significantly with the observed ADOS 'Reciprocity' scores, with a correlation coefficient of 0.70. This result demonstrates the effectiveness of fine-tuning with data from typically developing individuals and, furthermore, indicates that the social communication deficits of individuals with ASD are reflected in utterances adjacent to turn-taking.
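As a rough illustration of the evaluation step, the sketch below regresses ADOS 'Reciprocity' scores from the combined feature vectors and reports the Pearson correlation between estimated and observed scores. The regressor, the leave-one-out cross-validation scheme, and all names are assumptions for illustration, not the authors' exact setup, and the placeholder data does not reproduce the reported result.

```python
# Hedged sketch of severity-score estimation and correlation analysis.
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

def evaluate_severity_estimation(X, y):
    """X: per-participant feature matrix (entrainment + other speech features),
    y: observed ADOS 'Reciprocity' scores."""
    estimated = cross_val_predict(LinearRegression(), X, y, cv=LeaveOneOut())
    r, p = pearsonr(estimated, y)                    # correlation with observed scores
    return r, p

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(30, 16))                    # placeholder features
    y = rng.integers(0, 14, size=30).astype(float)   # placeholder scores
    print(evaluate_severity_estimation(X, y))
```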
