Emotions are what give flavor to any sentence. The way we talk, the pitch of our voice, the movement of our eyes, and the speed of our breath all paint a picture of the emotions we feel. Over time, we have taught AI how to communicate and interact with humans, but it still feels rough around the edges, and its answers are mostly logic-based.
To solve this problem, researchers have now built a new kind of AI that analyzes and interacts with humans with emotional intelligence and understanding. It builds on the decade-old concept of affective computing, which focuses on understanding emotions expressed in speech, written text, and video.
Inspired by this older technique, researchers developed a new branch of AI known as behavioral signal processing, which enables computers to understand human emotions and turn them into actionable insights.
Behavioral signal processing can characterize typical, atypical, and distressed human behavior, with a specific focus on communicative, affective, and social behavior. It automatically detects the information encoded in the human voice from audio and measures the quality of human interaction.
Behavioral signal processing algorithms can determine which behaviors and emotions caused a reaction and when the turning points occurred. These insights can then be transformed into analytic reports and, eventually, teaching tools, deriving real value from often-disregarded unstructured data.
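To make the idea of detecting turning points concrete, here is a minimal sketch in Python. It is not Behavioral Signals' actual method; it assumes a crude proxy for vocal arousal (short-time energy) and flags frames where that trace abruptly changes direction, standing in for "when the interaction shifted."

```python
import numpy as np

def short_time_energy(signal, frame_len=400, hop=200):
    """Frame-level energy: a crude proxy for vocal arousal/intensity."""
    frames = [signal[i:i + frame_len]
              for i in range(0, len(signal) - frame_len + 1, hop)]
    return np.array([np.mean(f ** 2) for f in frames])

def turning_points(trace, threshold=0.05):
    """Indices where the trace reverses direction by more than `threshold`,
    a toy stand-in for detecting shifts in an interaction."""
    diffs = np.diff(trace)
    points = []
    for i in range(1, len(diffs)):
        if diffs[i - 1] * diffs[i] < 0 and abs(diffs[i] - diffs[i - 1]) > threshold:
            points.append(i)
    return points

# Synthetic "call audio": quiet start, loud agitated middle, calm end.
rng = np.random.default_rng(0)
sig = np.concatenate([
    0.1 * rng.standard_normal(4000),  # calm
    0.8 * rng.standard_normal(4000),  # agitated
    0.1 * rng.standard_normal(4000),  # calm again
])
energy = short_time_energy(sig)
print(turning_points(energy))
```

A production system would of course use learned acoustic models rather than raw energy, but the shape of the pipeline is the same: extract a behavioral trace over time, then locate the moments where it changes.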
Using examples drawn from different application domains such as autism, addiction, and metabolic health monitoring, the talk will also illustrate behavioral informatics applications of these processing techniques, which help quantify higher-level, often subjectively described human behavior in a domain-sensitive fashion.
The Behavioral Signals Emotion Recognition software brings emotional intelligence to the digital world, transforming not only how humans interact with technology, but especially how humans interact with each other. This is emotion-aware computing.
In addition to processing objectively specified behavioral content in richer ways (e.g., what someone said and did), behavioral signal processing entails automating the assessment of a host of subjectively specified constructs, such as people's socio-emotional states (e.g., how negative or frustrated a person is, politeness, engagement, etc.).
Alex Potamianos, CEO of Behavioral Signals, has spearheaded new “speech-to-emotion” AI technology, which “[analyzes] emotions and behaviors in speech, things like if you’re happy, angry, sad.”
Potamianos says that much of Behavioral Signals’ AI research comes from data shared by call centers and mentions the example of identifying a customer’s emotion upon calling in and correlating it with what is being said.
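The call-center example above can be sketched with a toy correlation between detected emotions and transcripts. The data and emotion labels below are hypothetical; in a real pipeline both would come from the same audio stream, with the labels produced by a speech-emotion recognizer.

```python
from collections import Counter, defaultdict

# Hypothetical (transcript, detected emotion) pairs for call-center utterances.
calls = [
    ("my order never arrived", "angry"),
    ("the refund never arrived", "angry"),
    ("thanks that fixed it", "happy"),
    ("great thanks for the help", "happy"),
]

# Count which words co-occur with each detected emotion.
words_by_emotion = defaultdict(Counter)
for text, emotion in calls:
    words_by_emotion[emotion].update(text.split())

# Which words are most associated with anger in this sample?
print(words_by_emotion["angry"].most_common(2))
```

Even this trivial co-occurrence count hints at the value of the correlation: topics that reliably coincide with negative emotion point to the issues most worth fixing.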
Advances in behavioral signal processing can enable not only new possibilities for gathering data in a variety of settings, from laboratories and clinics to free-living conditions, but also computational models that advance evidence-driven theory and practice.
As for future applications of the software, Potamianos believes it could be useful in health care, as the AI could learn to recognize the onset of certain mental illnesses (e.g., depression) through analysis of a person's voice and speech.