WMT Group experts explain how emotional AI technologies, from virtual assistants to adaptive advertising, are becoming more human and helping to solve a wide range of business problems.
Communication is at the core of many business processes: more and more companies are looking to improve communication culture, engagement, and productivity. That’s why the market for AI-powered emotion recognition is growing: in 2019 it was valued at $19 billion, and it is expected to reach $37.1 billion by 2026.
How to teach AI to understand emotions
Many business processes depend on communication between people, and the emotional context changes how facts and arguments are interpreted. Therefore, automating and optimizing communication-related tasks calls for a specialized kind of artificial intelligence: emotional AI (Emotion AI). It can recognize positive and negative human reactions and analyze facial expressions, body language, voice tone, and even word choice.
Emotion AI is based on Machine Learning (ML), neural networks, and Natural Language Processing (NLP) to analyze and interpret human emotions. Training requires significant amounts of data and complex algorithms. Large data sets are used, including images of faces and facial expressions, audio recordings of people in various emotional states, and text from chats and social networks. The larger and more diverse the data, the more accurate and reliable the model.
Next, deep neural networks are brought into the process:
Convolutional Neural Networks (CNNs) analyze images and video: they recognize patterns in facial expressions that correspond to specific emotions (for example, smiling or frowning); a minimal sketch follows this list.
Recurrent Neural Networks (RNNs) and their variants are used to analyze time-series data (such as audio recordings) to detect changes in voice tone and timbre.
Language models such as GPT help determine the emotional content of text based on context and word order.
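To make the CNN branch concrete, here is a minimal sketch in PyTorch, assuming 48×48 grayscale face crops labeled with seven basic emotions (the layout popularized by the public FER-2013 dataset). The class list and model shape are illustrative assumptions, not a description of any specific production system.

```python
import torch
import torch.nn as nn

# Illustrative emotion classes (FER-2013-style labels; an assumption, not a spec).
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

class EmotionCNN(nn.Module):
    """Tiny CNN mapping a 48x48 grayscale face crop to emotion logits."""
    def __init__(self, num_classes: int = len(EMOTIONS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 48x48 -> 24x24
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 24x24 -> 12x12
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 12 * 12, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

model = EmotionCNN()
face = torch.randn(1, 1, 48, 48)                  # stand-in for a real face crop
probs = torch.softmax(model(face), dim=1)
print(EMOTIONS[int(probs.argmax())])              # predicted emotion label
```

In production, such a model would of course be trained on the large labeled corpora described above rather than used with random weights.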
To improve the accuracy of an emotion recognition system, multimodal integration is often used, where data from different sources, such as video, audio, and text, are combined.
Once trained, Emotion AI can classify new data in real time, determining what emotions a person is experiencing. For example, when analyzing a video, the model can simultaneously consider facial expression, tone of voice, and text to reach a more accurate conclusion. Reinforcement learning is then used to improve the AI’s performance based on feedback from users or external systems.
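As a toy illustration of such multimodal fusion, the sketch below combines per-modality emotion scores by weighted averaging (so-called late fusion). The labels, weights, and input numbers are all hypothetical.

```python
# A minimal late-fusion sketch: combine per-modality emotion scores by
# weighted averaging. All numbers and weights here are hypothetical.
EMOTIONS = ["positive", "neutral", "negative"]

def fuse(face: list[float], voice: list[float], text: list[float],
         weights: tuple[float, float, float] = (0.4, 0.3, 0.3)) -> list[float]:
    """Weighted average of per-modality probability distributions."""
    fused = [
        weights[0] * f + weights[1] * v + weights[2] * t
        for f, v, t in zip(face, voice, text)
    ]
    total = sum(fused)                     # renormalize to sum to 1
    return [x / total for x in fused]

# Example: the face looks neutral, but voice and text lean negative.
scores = fuse(face=[0.2, 0.6, 0.2],
              voice=[0.1, 0.3, 0.6],
              text=[0.1, 0.2, 0.7])
print(EMOTIONS[scores.index(max(scores))])  # -> "negative"
```

Late fusion is only one option; systems can also be trained jointly on all modalities at once, at the cost of needing aligned multimodal training data.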
Who needs it and when
AI-powered emotion recognition systems help understand how people are feeling, improving communication and service, making digital products more adaptive, and making interactions with them more effective.
Corporate sector
Imagine a manager reporting on the previous month’s tasks during a video conference, using figures and facts, but feeling nervous because the entire top management is present on the call. AI reads the speaker’s emotional state and promptly suggests phrases to the meeting participants, highlighted on screen, which reduces the manager’s stress and improves the presentation.
Such an AI solution has recently appeared on the Russian market: our company has developed an online AI-based communications assistant (ACM). It consists of two products: AC Meet reads communication during video calls, provides tips in real time, and highlights emotion levels and stress phases; AC HRM collects analytics across all video meetings and helps the HR department understand the psychological state of the team.
AI can analyze employees’ emotional reactions to management decisions, identify potential conflicts at an early stage, and prevent them from escalating. Technologies also help prevent employee burnout by adapting processes depending on the team’s state: suggesting changes to the schedule, providing additional resources for support, or recommending breaks for recovery.
Emotion recognition is also effective in recruiting. For example, the international consumer goods manufacturer Unilever actively uses it: AI analyzes candidates’ video interviews, evaluating facial expressions, tone of voice, and body language to determine how well a person fits a particular role.
Sales area
During a telephone or video conversation, AI analyzes the tone and content of speech, as well as facial expressions if the camera is on. Zoom IQ for Sales has this option. The system can prompt the manager with a suitable script for changing strategy, or with special offers that can increase customer loyalty, and can also identify during the conversation what is negatively affecting sales figures.
These technologies allow companies to better understand how customers react to products or services and to tailor offers in real time. In an online store where communication happens via text chat, AI analyzes the words and tone of the customer’s messages; if the customer seems dissatisfied, the system can offer a discount or help from a manager, as in the sketch below. Marketplaces can already implement such technologies to improve service and increase sales.
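Here is a rough sketch of how such a chat flow might be wired up. The `sentiment_score` function is a placeholder for any real sentiment model returning a value in [-1, 1]; the keyword heuristic and thresholds are purely hypothetical.

```python
# Hypothetical routing logic for a support chat. `sentiment_score` stands in
# for any real sentiment model returning a value in [-1, 1].
def sentiment_score(message: str) -> float:
    negative_words = {"refund", "broken", "terrible", "late", "angry"}
    hits = sum(word in message.lower() for word in negative_words)
    return max(-1.0, -0.4 * hits)  # crude keyword heuristic, illustration only

def route(message: str) -> str:
    score = sentiment_score(message)
    if score <= -0.7:
        return "escalate_to_manager"   # strong negativity: hand off to a human
    if score <= -0.3:
        return "offer_discount"        # mild dissatisfaction: soften with an offer
    return "continue_bot"              # neutral or positive: keep automating

print(route("My order is late and the box arrived broken"))  # escalate_to_manager
```

A real deployment would swap the keyword heuristic for a trained sentiment model and tune the thresholds on actual conversation data.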
Support Services
AI helps operators evaluate customers’ reactions and mood, which is especially important in stressful situations: it reduces the number of conflicts. Increasingly, the work of contact centers is fully automated: Emotion AI does not just read out answers, it uses human-like intonation and emphasis. It also understands when it is worth redirecting a client to an employee to resolve a difficult situation and reduce negativity. Such systems have already been implemented in large Russian banks and communications companies, for example at MTS.
Medicine
In healthcare, AI can be used to assess patients’ emotional states, monitor mental health, or support therapy. For example, systems can analyze facial expressions and speech to detect signs of depression or anxiety disorders.
Another application is helping to care for people with autism: Emotion AI helps them better understand the emotions of others, improving their social skills.
Education
Emotion recognition makes it possible to adapt teaching methods to the emotional state of students. The rapid development of online schools and distance education has increased the need for such solutions. AI reads reactions to various tasks and topics, which allows teachers to better understand which approaches work and which do not. For example, if the system detects that students are bored or do not understand the material, the teacher can offer a simpler or more interactive exercise.
In Russia, a similar solution for schoolchildren was presented by one of the Rostec structures. The development includes a monitor equipped with cameras and special software: it reads the student’s psychophysical data and emotional state at the beginning of the lesson and tracks changes as the lesson progresses. Based on this data, the system generates recommendations for the teacher, which improves how well the material is mastered.
Marketing
Emotion AI is used to analyze reactions to advertising, content or products, allowing brands to better understand their audience and tailor their offerings.
For example, before launching commercials, Coca-Cola tests them on focus groups: artificial intelligence recognizes viewers’ emotions in real time as they watch. AI systems evaluate microexpressions on viewers’ faces, which helps the company understand which aspects of an ad evoke joy, surprise, or other emotions. The video platform YouTube uses AI emotion recognition technologies to customize recommendations and optimize ad placement. Russian companies, such as Neurobotics, also analyze reactions to video.
Weaknesses of Emotional AI
There are a number of limitations when using AI technologies to recognize emotions.
Legislative restrictions
User consent is required when working with Emotion AI. Recording confidential conversations without permission may result in a lawsuit.
Experienced developers of such products usually take the specifics of the legislation into account so as not to expose clients to risk. For example, to measure employee stress, ACM collects ordinary personal data, consent for the processing of which is signed upon employment. The employer’s right to analyze emotions is secured by internal regulations as part of oversight of work activity, so employees must be familiarized with ACM and confirm this with a signature.
Risks of unreliable and inaccurate predictions
At the moment, AI cannot read human mood with 100% accuracy. Improving quality requires systematically processing large volumes of data and patterns, analyzing voice tone, conversational context, and non-verbal cues. For now, unfortunately, misrecognized emotions can still lead to incorrect suggestions.
Responsibility for the performance of an AI product
Who bears it: developers, users, or the project manager? The question remains open. If the product was implemented by a contractor, the customer is not always willing to share full details of how it is used internally, which complicates further refinement. There are also risks of algorithmic inaccuracy if the system was deployed in unusual conditions.
Prospects for the development of technology
Improvements in emotion recognition accuracy are expected as ML algorithms develop and training data volumes grow. This will make it possible to create more reliable and effective tools for emotion analysis that can take into account not only explicit but also hidden emotional states.
Emotion AI is likely to find applications in new areas. For example, in sports analytics, emotion recognition technologies can be used to assess the state of athletes during competitions, which can help coaches optimize training strategies and team management.
In the future, it may be possible to create artificial intelligence that can not only recognize but also synthesize emotional responses. This will lead to more natural and intuitive interfaces. Such systems can be used in virtual assistants to emotionally support users, taking the “humanity” of AI to a new level.
But the success of Emotion AI depends on more than just technological advances. It also depends on how we integrate new capabilities into everyday life, balancing convenience and ethics.