The clinical, emotionless computer or robot is a staple of science fiction, but science fact is starting to catch up: computers are getting much better at understanding emotions.
As we turn to computers, smart devices and robots to take on more and more functions that were once the exclusive domain of humans, this emotion-detecting technology will become increasingly important. Automated customer service “bots” will be better able to know whether a customer is getting the help they need. Robot caregivers involved with telemedicine may be able to detect pain or depression even if the patient doesn’t explicitly talk about it. One insurance company I am working with is even experimenting with call voice analytics that can detect when someone is lying to their claims handlers.
IBM has developed a ‘Tone Analyzer’ for Watson that can detect sarcasm and a multitude of other emotions in your writing. Watson also offers an Emotion Analysis API to help users understand the emotions of the people they’re chatting with.
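To give a flavour of how a developer might use such a service, here is a minimal sketch of calling the Tone Analyzer’s REST endpoint from Python. The service URL and API key are placeholders for your own IBM Cloud credentials, and the response fields shown reflect the v3 API; details may differ by version.

```python
import requests

# Placeholders: substitute the URL and API key from your own
# IBM Cloud Tone Analyzer service instance.
SERVICE_URL = "https://api.us-south.tone-analyzer.watson.cloud.ibm.com/instances/YOUR_INSTANCE_ID"
API_KEY = "YOUR_API_KEY"

def analyze_tone(text):
    """Send a piece of writing to Tone Analyzer v3 and return detected tones."""
    response = requests.post(
        f"{SERVICE_URL}/v3/tone",
        params={"version": "2017-09-21"},  # API version date
        auth=("apikey", API_KEY),          # IBM Cloud basic auth convention
        headers={"Content-Type": "application/json"},
        json={"text": text},
    )
    response.raise_for_status()
    result = response.json()
    # document_tone.tones lists each detected tone with a 0-1 confidence score
    return [(t["tone_name"], t["score"])
            for t in result["document_tone"]["tones"]]

print(analyze_tone("I can't believe the flight was delayed again. Fantastic."))
```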
These sorts of advancements are important for computers and robots that are meant to interact seamlessly with humans. They may not yet pass the Turing test, but recognizing emotions gets them a step closer.
This particular branch of computer science is known as affective computing: the study and development of systems and devices that can recognize, interpret, process, and simulate human feelings and emotions.
But it’s also closely tied to deep learning, because complicated algorithms are required for the computer to recognize faces, detect emotional speech, read body gestures, and interpret other signals. The computer compares the incoming data — in this case, the human with whom it is interacting — against what it has learned from its training data to make a judgement about the person’s emotions.
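To make that concrete, here is an illustrative PyTorch sketch of the kind of model commonly used for this task: a small convolutional network that maps a 48x48 grayscale face crop to one of seven basic emotion labels (the convention used by the popular FER-2013 dataset). This is a toy architecture for illustration, not the design of any particular commercial system.

```python
import torch
import torch.nn as nn

# Seven basic emotion labels, following the common FER-2013 convention.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

class EmotionCNN(nn.Module):
    """Toy convolutional classifier: 48x48 grayscale face -> emotion logits."""
    def __init__(self, num_classes=len(EMOTIONS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                 # 48x48 -> 24x24
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                 # 24x24 -> 12x12
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 12 * 12, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = EmotionCNN()
face = torch.randn(1, 1, 48, 48)        # stand-in for a detected face crop
probs = torch.softmax(model(face), dim=1)
print(EMOTIONS[probs.argmax().item()])  # untrained, so the label is random
```

In a real system this network would be trained on thousands of labelled face images, and the face crop would come from a separate face-detection step rather than random noise.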
An Ohio State University team programmed a computer to recognize 21 ‘compound’ emotions, including ‘happily surprised’ and ‘sadly disgusted’. And, in tests, the computer recognized these subtle emotions more accurately than the human subjects did.
That’s because the computer’s pattern-recognition capabilities are superior to a human’s, and because most people tend to use the same facial muscle movements — the same combinations of ‘action units’, in facial coding terms — to express the same emotions.
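As a rough illustration of how that consistency makes compound emotions machine-readable, the sketch below matches a set of detected facial action units (AUs, from the Facial Action Coding System) against prototype combinations. The AU sets here are simplified approximations for illustration only, not the coding used in the Ohio State study.

```python
# Illustrative only: simplified action-unit (AU) prototypes for a few
# basic and compound expressions. Real FACS codings are more nuanced.
PROTOTYPES = {
    "happy":             {6, 12},        # cheek raiser + lip-corner puller
    "surprised":         {1, 2, 5, 26},  # brow raisers + upper lid + jaw drop
    "happily surprised": {1, 2, 6, 12, 25},
    "sad":               {1, 4, 15},
    "disgusted":         {9, 10, 17},
    "sadly disgusted":   {1, 4, 9, 10},
}

def classify(detected_aus):
    """Return the prototype whose AU set best overlaps the detected AUs,
    scored by Jaccard similarity (intersection over union)."""
    def jaccard(a, b):
        return len(a & b) / len(a | b)
    return max(PROTOTYPES, key=lambda label: jaccard(detected_aus, PROTOTYPES[label]))

# A face showing raised brows plus a smile maps to the compound label.
print(classify({1, 2, 6, 12}))  # -> "happily surprised"
```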
Beyond avoiding a HAL 9000 scenario, the potential for these applications is enormous.
Clearly this technology could have many benefits. But, as with any technological advancement, there could also be pitfalls. Woe to the person who seems nervous in an airport when he or she is simply running late. And don’t let your computer catch you making angry or mocking expressions just after a meeting with your boss.
(If you’re not sure how you feel about this, why not check out one of the many emotion recognition apps that will tell you how you’re feeling?)
Bernard Marr is a bestselling author, keynote speaker, and advisor to companies and governments. He has worked with and advised many of the world's best-known organisations. LinkedIn has recently ranked Bernard as one of the top 10 Business Influencers in the world (in fact, No. 5 — just behind Bill Gates and Richard Branson). He writes on the topic of intelligent business performance for various publications including Forbes, HuffPost, and LinkedIn Pulse. His blogs and SlideShare presentations have millions of readers.