Find out how new emotion analytics algorithms are bringing feeling, emotional connection, and engagement to modern apps.
Most app development focuses on what an application is supposed to do. However, there is a growing interest in crafting applications that can respond to how we feel by weaving in emotion-aware signals gleaned from faces, voices, and text. At the O’Reilly Design Conference in San Francisco, Pamela Pavliscak, CEO of SoundingBox, explored how applications that sense our emotions are already changing the way we relate to technology and to other people.
Already, cameras are being used to capture emotional expressions in faces, microphones can analyze the emotional tone of conversations, and sentiment analysis techniques are used to make sense of how people are feeling on social media. In addition, new edge devices with sensors gather data about our heart rate, brain waves, and the electrical conductivity of our skin to make even more nuanced assessments of emotions. This was once the realm of quantified-self enthusiasts and high-budget marketing teams, but it is now starting to go mainstream. Pavliscak said, “With Feelings 2.0, we are told we will get a new wave of technology that can read emotions and this will lead to a more emotional connection.”
Apple, for instance, recently acquired Emotient, which analyzes emotional expression in the face; the acquisition could help bring emotion recognition capabilities to iOS. In addition, both Microsoft and IBM now offer a suite of emotion analytics capabilities as part of their cloud offerings.
How does emotional analytics work?
Text analytics is probably the most widely used technique today. It analyzes word choices to associate sentiment with what people are writing. Companies can use these techniques to understand how people perceive their brand or service, and more intimate, personalized implementations can improve chatbot interactions. One simple example is the Hedonometer, which measures the happiness of Twitter users and correlates it with news and trends. But the current crop of tools still struggles to interpret sarcasm, said Pavliscak.
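The core idea behind this kind of text analytics can be sketched with a simple lexicon-based scorer: count emotionally loaded words and combine the counts into a score. The tiny word lists below are illustrative stand-ins for the large, curated sentiment lexicons that real tools such as the Hedonometer rely on.

```python
# Minimal sketch of lexicon-based sentiment scoring.
# The word sets are hypothetical placeholders for a real sentiment lexicon.
POSITIVE = {"love", "great", "happy", "excellent", "enjoy"}
NEGATIVE = {"hate", "terrible", "sad", "awful", "broken"}

def sentiment_score(text: str) -> float:
    """Return a score in [-1, 1]: >0 positive, <0 negative, 0 neutral."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

print(sentiment_score("I love this great app"))   # 1.0
print(sentiment_score("The update is terrible"))  # -1.0
```

Because the approach only counts words, it also illustrates the sarcasm problem Pavliscak described: "Oh great, it crashed again" scores as positive.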
Vocal intonation algorithms analyze speech to determine the emotional tone of a voice. Companies like Beyond Verbal offer an API that can be integrated into call centers or used to improve spoken chat interactions. A call center could, for example, build a business process for escalating calls from frustrated customers. Beyond Verbal has also developed a smartphone app called Moodcall that can measure how phone calls affect your mood.
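A call-escalation rule of the kind described above might look like the sketch below. It assumes an upstream voice-analytics service (such as Beyond Verbal's API) has already returned per-emotion scores for a call segment; the field names and threshold are illustrative assumptions, not any vendor's actual schema.

```python
# Hedged sketch of a call-center escalation rule driven by voice analytics.
# The emotion labels and threshold below are hypothetical.
def should_escalate(emotion_scores: dict, frustration_threshold: float = 0.7) -> bool:
    """Escalate when anger or frustration dominates the caller's tone."""
    negative = max(emotion_scores.get("anger", 0.0),
                   emotion_scores.get("frustration", 0.0))
    return negative >= frustration_threshold

# Example scores a voice-analytics service might return for one segment:
print(should_escalate({"calm": 0.2, "anger": 0.85}))       # True
print(should_escalate({"calm": 0.9, "frustration": 0.1}))  # False
```

In practice the threshold would be tuned against labeled calls, and a real process would likely require sustained negative tone across several segments before routing to a supervisor.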
Who is using emotional analytics?
A number of companies have developed APIs for recognizing emotions expressed in faces. These techniques are based on the research of Paul Ekman, who identified a set of basic emotion patterns expressed universally across cultures, a model popularized in the movie Inside Out. The first generation of these tools used cameras in stores to anonymously analyze the emotional impact of new products. Now developers are starting to use facial expression analytics to improve game play.
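One way a game might use such an API is to adapt difficulty to the player's expression. The sketch below assumes a face-analysis SDK supplies per-frame probabilities for basic emotions; the labels and adjustment rules are hypothetical, chosen only to illustrate the feedback loop.

```python
# Illustrative sketch of emotion-driven difficulty adjustment in a game.
# Assumes an external face-analysis SDK produces the `emotions` dict;
# the labels and thresholds here are hypothetical.
def adjust_difficulty(current: int, emotions: dict) -> int:
    """Nudge difficulty (1-10) based on the player's dominant emotion."""
    dominant = max(emotions, key=emotions.get)
    if dominant in ("anger", "fear") and emotions[dominant] > 0.6:
        return max(1, current - 1)   # player struggling: ease off
    if dominant == "happiness" and emotions[dominant] > 0.6:
        return min(10, current + 1)  # player cruising: ramp up
    return current                   # mixed or weak signal: no change

print(adjust_difficulty(5, {"happiness": 0.8, "anger": 0.1}))  # 6
print(adjust_difficulty(5, {"anger": 0.7, "happiness": 0.2}))  # 4
```

Smoothing the scores over many frames, rather than reacting to a single one, would keep a momentary grimace from swinging the game.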