06. 11. 2007
Warn me if I'm boring
A device that warns a speaker when they're in danger of losing the crowd's attention would be useful to a professor, but absolutely necessary for an autistic adult unable to read body language and emotion. MIT grad student Rana El Kaliouby uses intelligent software to interpret emotions from body language and facial expressions captured by a wearable computer and video camera. The video data is used to decide whether the listener is agreeing, disagreeing, thinking, concentrating, interested, or unsure. The result is the Emotional Social Intelligence Prosthetic, which vibrates whenever the listener's attention veers off topic. The eventual goal is a system that gives mildly autistic users an ongoing, everyday auxiliary readout of other people's emotions.
To train her software, El Kaliouby used footage of actors portraying clearly defined emotions on film. The system now picks out the right emotion 90% of the time on actor footage, and 64% of the time on video clips of everyday people. The recognition rate should improve substantially as more footage is added to the database; the next round of training footage is coming from popular movies and webcams.
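The article doesn't say how the classifier actually works, but the train-on-actors, test-on-everyone pattern it describes can be sketched generically. The toy below (entirely hypothetical: the emotion labels come from the article, but the nearest-centroid method, feature vectors, and noise levels are invented for illustration) shows why accuracy drops from acted clips to everyday ones: real-world expressions sit further from the clean prototypes the model was trained on.

```python
# Hypothetical sketch, NOT the ESP system's actual algorithm: a
# nearest-centroid classifier trained on clean "acted" feature vectors,
# then evaluated on noisier "everyday" vectors.
import random

# The six listener states named in the article.
EMOTIONS = ["agreeing", "disagreeing", "thinking",
            "concentrating", "interested", "unsure"]

def make_clip(emotion, noise):
    """Fabricate a feature vector for one video clip. Acted clips (low
    noise) land near their emotion's prototype; everyday clips (high
    noise) drift further away."""
    base = [1.0 if e == emotion else 0.0 for e in EMOTIONS]
    return [x + random.gauss(0, noise) for x in base]

def train(clips):
    """Average the feature vectors per label into one centroid each."""
    grouped = {}
    for label, vec in clips:
        grouped.setdefault(label, []).append(vec)
    return {label: [sum(col) / len(vecs) for col in zip(*vecs)]
            for label, vecs in grouped.items()}

def classify(centroids, vec):
    """Return the label whose centroid is nearest (squared Euclidean)."""
    return min(centroids, key=lambda lbl: sum(
        (a - b) ** 2 for a, b in zip(centroids[lbl], vec)))

def accuracy(centroids, clips):
    hits = sum(classify(centroids, vec) == label for label, vec in clips)
    return hits / len(clips)

random.seed(0)
train_set = [(e, make_clip(e, 0.2)) for e in EMOTIONS for _ in range(50)]
acted     = [(e, make_clip(e, 0.2)) for e in EMOTIONS for _ in range(50)]
everyday  = [(e, make_clip(e, 0.9)) for e in EMOTIONS for _ in range(50)]

model = train(train_set)
print(f"acted: {accuracy(model, acted):.0%}, "
      f"everyday: {accuracy(model, everyday):.0%}")
```

Run as-is, the acted test set scores markedly higher than the everyday one, mirroring the article's 90%-vs-64% gap; the numbers themselves are artifacts of the invented noise levels, not the real system's performance.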
Illustration above from your friends and mine at the NYT