Cambridge Emotional Computer: Mind-Reading GPS System

The 'emotionally sensitive' satnav: Cambridge scientist develops device that reacts to driver's moods

Professor Peter Robinson has created a satnav that recognises facial expressions and tone of voice and reacts to the driver's mood.

His device uses sensors to detect facial expressions, such as frowns, and voice recognition software to pick up rising irritation in the tone of a driver’s voice.

The prototype feeds this information into software attached to a robotic human head that sits alongside the driver.

When it detects a driver’s anger, it responds with sympathetic expressions.
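The pipeline described above — facial and vocal cues fused into a mood estimate that drives the robotic head's reaction — can be sketched in a few lines. This is a purely illustrative mock-up: the scores, weights, thresholds, and function names are all invented here, not taken from Robinson's actual prototype.

```python
# Hypothetical sketch of the satnav's affect pipeline: a facial frown score
# and a vocal irritation score are fused into one anger estimate, which is
# then mapped to an expression for the robotic head. All values are invented.

from dataclasses import dataclass

@dataclass
class DriverState:
    frown_intensity: float   # 0.0 (relaxed) to 1.0 (deep frown), from face sensors
    vocal_irritation: float  # 0.0 (calm) to 1.0 (shouting), from voice analysis

def infer_anger(state: DriverState) -> float:
    """Fuse the two cues into a single anger estimate (simple weighted average)."""
    return 0.6 * state.frown_intensity + 0.4 * state.vocal_irritation

def choose_expression(anger: float) -> str:
    """Pick the robotic head's reaction to the inferred mood."""
    if anger > 0.7:
        return "sympathetic"  # mirror concern when the driver is clearly angry
    if anger > 0.4:
        return "attentive"
    return "neutral"

angry_driver = DriverState(frown_intensity=0.9, vocal_irritation=0.8)
print(choose_expression(infer_anger(angry_driver)))  # -> sympathetic
```

A real system would of course replace the weighted average with trained models over video and audio streams; the point is only the shape of the loop: sense, infer, respond.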

The emotional computer

When people talk to each other, they express their feelings through facial expressions, tone of voice and body postures. They even do this when they are interacting with machines. These hidden signals are an important part of human communication, but computers ignore them.

Professor Peter Robinson is leading a team in the Computer Laboratory at the University of Cambridge that is exploring the role of emotions in human-computer interaction. His research is the subject of the film The Emotional Computer, released on the University's YouTube channel (http://www.youtube.com/cambridgeuniversity).

Computer Laboratory: Emotionally intelligent interfaces

With rapid advances in key computing technologies and heightened user expectations of computers, the development of socially and emotionally adept technologies is becoming a necessity. This project is investigating the inference of people's mental states from facial expressions, vocal nuances, body posture and gesture, and other physiological signals, and is also considering the expression of emotions by robots and cartoon avatars.
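One simple way to picture multi-modal inference of this kind is as a classifier over features extracted from each modality. The sketch below uses a nearest-centroid classifier as a stand-in for the project's real inference machinery; the mental states, feature names, and numbers are all invented for illustration.

```python
# A minimal, hypothetical sketch of multi-modal mental-state inference,
# assuming each modality (face, voice, posture) has already been reduced
# to a single numeric feature. A nearest-centroid classifier stands in
# for the project's actual models; the states and values are invented.

import math

# Per-state centroids over the features (smile, vocal_energy, lean_forward).
CENTROIDS = {
    "interested": (0.7, 0.6, 0.8),
    "bored":      (0.1, 0.2, 0.1),
    "confused":   (0.2, 0.4, 0.6),
}

def infer_state(features: tuple[float, float, float]) -> str:
    """Return the mental state whose centroid is closest to the observed cues."""
    return min(CENTROIDS, key=lambda s: math.dist(CENTROIDS[s], features))

print(infer_state((0.6, 0.5, 0.9)))  # -> interested
```

The appeal of framing it this way is that adding a modality (say, a physiological signal) is just another dimension in the feature vector.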

Further information

Facial affect inference

Mind-reading machines

Body movement analysis

Vocal affect inference

Recording multi-modal cues for HCI

Affective robotics

Learning and emotions

Automatic inference of affect

In the future, computers will be emotional - SmartPlanet

More Human, Less Machine
Would it help if the GPS system looked a little more like a person than a box? Robinson thinks people would respond better to the machine if it looked more human. To test the idea, he designed a nearly human robot named Charles.

Charles has motors in his face and cameras in his eyes. While he may be friendlier than a standard GPS system, he also looks a little creepy.

Robinson sees it differently: “The way that Charles and I can communicate shows us the future of how people are going to interact with machines.”

Robinson turns to Charles in the car. “Hey Charles, I think this is the beginning of a beautiful friendship,” he says.