The 'emotionally sensitive' satnav: Cambridge scientist develops device that reacts to driver's moods
Professor Peter Robinson has created a satnav that recognises facial expressions and tone of voice and responds to the driver's mood.
His device uses sensors to detect facial expressions, such as frowns, and voice recognition software to pick up rising irritation in the tone of a driver’s voice.
The prototype feeds this information into software attached to a robotic human head that sits alongside the driver.
When it detects a driver’s anger, it responds with sympathetic expressions.
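The pipeline described above — detect a frown, detect irritation in the voice, fuse the two, and drive a sympathetic expression on the robotic head — can be sketched as a toy program. This is a minimal illustration only: the scores, weights, thresholds, and function names are all hypothetical, and the article does not describe the real system's models or interfaces.

```python
# Toy sketch of the mood-sensing pipeline described in the article: fuse a
# facial cue score and a vocal cue score into an anger estimate, then pick
# a sympathetic response for the robotic head. All names, weights, and
# thresholds here are hypothetical assumptions, not the real system.

def estimate_anger(frown_score: float, vocal_irritation: float) -> float:
    """Combine two cue scores (each in [0, 1]) into one anger estimate."""
    # Simple weighted average; a real system would use a trained classifier.
    return 0.6 * frown_score + 0.4 * vocal_irritation

def head_response(anger: float) -> str:
    """Map the anger estimate to an expression for the robotic head."""
    if anger > 0.7:
        return "sympathetic frown"
    if anger > 0.4:
        return "concerned look"
    return "neutral smile"

# Example: a strong frown combined with a raised, irritated voice.
print(head_response(estimate_anger(0.9, 0.8)))  # sympathetic frown
```

The point of the sketch is only the shape of the loop: two independent cue detectors feed a fusion step, and the fused estimate selects the head's expression.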
Collected from: Sat-nav that's 'emotionally sensitive': Cambridge scientist develops device that reacts to driver's moods | Mail Online
The emotional computer
[...]When people talk to each other, they express their feelings through facial expressions, tone of voice and body postures. They even do this when they are interacting with machines. These hidden signals are an important part of human communication, but computers ignore them.
Professor Peter Robinson is leading a team in the Computer Laboratory at the University of Cambridge that is exploring the role of emotions in human-computer interaction. His research is examined in the film The Emotional Computer, released on the University's YouTube channel (http://www.youtube.com/cambridgeuniversity).
Collected from: The emotional computer
Collected from: YouTube - Cambridge Ideas - The Emotional Computer
Computer Laboratory: Emotionally intelligent interfaces
With rapid advances in key computing technologies and rising user expectations of computers, the development of socially and emotionally adept technologies is becoming a necessity. This project is investigating the inference of people's mental states from facial expressions, vocal nuances, body posture and gesture, and other physiological signals, and also considering the expression of emotions by robots and cartoon avatars.[...]
Further information
Collected from: Computer Laboratory: Emotionally intelligent interfaces
Collected from: Computer Laboratory: Facial affect inference
Collected from: Computer Laboratory: Mind-reading machines
Collected from: Computer Laboratory: Body movement analysis
Collected from: Computer Laboratory: Vocal affect inference
Collected from: Computer Laboratory: Affective robotics
Collected from: Computer Laboratory: Learning and emotions
In the future, computers will be emotional - SmartPlanet
More Human, Less Machine
Would it help if the GPS system looked a little more like a person than a box? Robinson thinks people would respond better to a machine that looked more human, so he designed a nearly lifelike robot named Charles.
Charles has motors in his face and cameras in his eyes. While he may be friendlier than a standard GPS unit, he also looks a bit creepier.
The Cambridge professor disagrees: “The way that Charles and I can communicate shows us the future of how people are going to interact with machines.”
Robinson turns to Charles in the car. “Hey Charles, I think this is the beginning of a beautiful friendship,” he says.
Collected from: In the future, computers will be emotional - SmartPlanet