Children with autism spectrum conditions often have trouble identifying the emotional states of people around them, struggling to distinguish a happy face from a sad one, for example. Now, researchers at MIT have developed a deep-learning method to help teach autistic children to read emotions more effectively, using data unique to each individual child.
The researchers used SoftBank Robotics' NAO, a humanoid robot that's nearly 60cm tall. NAO can convey different emotions by changing the colour of its eyes, the motion of its limbs and the tone of its voice. The robot interacted with 35 children aged 3 to 13, gauging their responses to displays of several emotions, such as happiness, anger and fear.
While doing this, the team captured video of each child’s facial expressions, head and body movements and gestures, and recorded data on their heart rates and body temperatures. They then fed this data into a deep learning system that analysed the children’s behaviour and engagement.
They found that the deep-learning system agreed with the analyses of five human experts 60 per cent of the time. Typically, human experts agree with one another only around 50 per cent of the time.
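The study doesn't spell out how agreement was scored, but figures like these are often reported as simple percentage agreement: the share of cases on which two raters assign the same label. A minimal sketch, with invented ratings purely for illustration:

```python
def percent_agreement(ratings_a, ratings_b):
    """Share of items on which two raters give the same label, as a percentage."""
    matches = sum(1 for a, b in zip(ratings_a, ratings_b) if a == b)
    return 100.0 * matches / len(ratings_a)

# Hypothetical engagement labels (0 = disengaged, 1 = engaged) for ten video clips.
# These numbers are made up for the example, not taken from the study.
expert = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
model  = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]

print(percent_agreement(expert, model))  # → 80.0
```

Researchers often go further and use chance-corrected measures such as Cohen's kappa, since two raters labelling at random will still agree some of the time.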
“The long-term goal is not to create robots that will replace human therapists, but to augment them with key information that the therapists can use to personalise the therapy content and also make more engaging and naturalistic interactions between the robots and children with autism,” said study leader Dr Oggi Rudovic.
What is deep learning?
Deep learning is the name for computer programs built from artificial neural networks, layered structures loosely modelled on the networks of neurons in our own brains. They're hugely simplified and don't work in quite the same way real neurons do, but they nevertheless enable computers to learn from examples.
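At its simplest, each artificial "neuron" weights its inputs, sums them and squashes the result; learning means repeatedly nudging those weights to reduce prediction errors. A toy single-neuron sketch in Python (the two-feature data and labels here are invented for illustration, and a real deep network stacks many layers of such units):

```python
import math
import random

def sigmoid(z):
    """Squash any number into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

# Toy data: two features per sample; the label depends only on feature 0.
data = [([0.0, 0.0], 0), ([0.0, 1.0], 0), ([1.0, 0.0], 1), ([1.0, 1.0], 1)]

random.seed(0)
weights = [random.uniform(-0.5, 0.5) for _ in range(2)]
bias = 0.0
learning_rate = 0.5

# Training loop: predict, measure the error, nudge weights to shrink it.
for _ in range(2000):
    for features, label in data:
        prediction = sigmoid(sum(w * x for w, x in zip(weights, features)) + bias)
        error = prediction - label
        for i in range(2):
            weights[i] -= learning_rate * error * features[i]
        bias -= learning_rate * error

def predict(features):
    return sigmoid(sum(w * x for w, x in zip(weights, features)) + bias)
```

After training, `predict` returns a value close to 1 for samples whose first feature is "on" and close to 0 otherwise; the network has learned the rule from examples rather than being programmed with it.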