Companies are leveraging artificial intelligence, or AI, to aid communication. Most of us have encountered this kind of AI through products like Siri and Alexa, but did you know that bots are being used on video and audio calls to assess your nonverbals?


How call centers are using AI

Call centers are among the largest consumers of AI for audio analysis. During a call with a customer, the AI analyzes vocal signals like changes in pitch and volume, then reports back to the call center representative in real time. This type of technology has been around for a while, so it has become better at distinguishing between emotions. Demos show the AI coaching the representative with phrases like, “Customer is frustrated – show empathy.” You may think signs of frustration are pretty obvious, but some people aren’t as intuitive as others, or they don’t know how to handle the situation. That’s where an AI coach can help: it sends the representative real-time suggestions on how to respond to customers and de-escalate tensions.
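To make the idea concrete, here is a toy sketch (my own illustration, not any vendor’s product) of the simplest version of this kind of signal analysis: flagging the moments in an audio stream where volume jumps sharply relative to the rest of the call – the kind of cue a coaching system might react to.

```python
import numpy as np

def volume_spikes(signal, frame_len=1000, ratio=2.0):
    """Return indices of frames whose RMS volume exceeds `ratio` times the call's average."""
    n_frames = len(signal) // frame_len
    frames = signal[: n_frames * frame_len].reshape(n_frames, frame_len)
    rms = np.sqrt((frames ** 2).mean(axis=1))   # loudness per frame
    baseline = rms.mean()                        # average loudness of the whole call
    return [i for i, r in enumerate(rms) if r > ratio * baseline]

# Synthetic example: quiet speech with one loud burst in the middle.
rng = np.random.default_rng(0)
audio = 0.1 * rng.standard_normal(10_000)
audio[5000:6000] *= 8  # the "raised voice" moment
print(volume_spikes(audio))  # the burst lands in frame 5
```

A real product would track pitch as well as volume and map those signals onto emotion labels, but the core loop – frame the audio, measure, compare against a baseline, alert – looks like this.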


How AI is being used on video calls

This type of AI technology is much newer than AI for analyzing audio. Businesses, such as sales organizations, have recently tried out AI that reads facial expressions and body language over video calls. This is usually done by recording the virtual meeting, which the AI then analyzes; a report on the nonverbals of all participants is emailed to the sales team about 20 minutes after the call. How accurate is the analysis? The demo we tested had some major errors. The product took snapshots throughout the meeting to analyze participants’ nonverbals, and it mistakenly reported our test participant’s laughter as disgust. These are nearly opposite emotions, and while they may look similar in still photos, a human seeing them in context would not confuse them. Another issue is that a participant’s video would have to be on at all times during the meeting. If you showed a smiling profile photo instead of turning on your camera, the AI would report how happy you were for the entire meeting. How helpful is that?
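The profile-photo failure is easy to guard against, which makes it all the more striking that the demo didn’t. Here is a toy sketch (my own suggestion, not a feature of any product we tested) of the obvious sanity check: before trusting per-frame emotion labels, test whether a participant’s video tile is actually changing. A static photo produces identical frames, so frame-to-frame difference stays near zero.

```python
import numpy as np

def is_static_feed(frames, threshold=1.0):
    """True if consecutive frames barely differ (e.g. a profile photo, not live video)."""
    diffs = [np.abs(a.astype(float) - b.astype(float)).mean()
             for a, b in zip(frames, frames[1:])]
    return max(diffs) < threshold

# A "participant" who is really just one smiling photo, sampled five times.
photo = np.full((48, 48), 128, dtype=np.uint8)
rng = np.random.default_rng(1)
live = [rng.integers(0, 256, (48, 48), dtype=np.uint8) for _ in range(5)]

print(is_static_feed([photo] * 5))  # True  -> don't report "happy all meeting"
print(is_static_feed(live))         # False -> real video worth analyzing
```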


Why many people find this technology creepy

I know of cases where participants in virtual meetings were not aware that their body language and nonverbals were being recorded by an AI bot. When I told them what to look for, they realized they had been in sales meetings where no one disclosed that they were being recorded and their nonverbals analyzed. Be aware that this kind of AI can be added to a Zoom meeting without triggering the “This meeting is being recorded” message. One of the AI companies I spoke with told me they recommend that clients inform participants before the meeting that it will be recorded. But while recording people without their knowledge is illegal in some states, the AI companies are not enforcing disclosure.


How to recognize if an AI bot is recording your meeting

For future reference, here’s your clue: an unnamed participant shows up as a black box (no video) in your virtual meeting with a tag like “note taker.” That is a common way to “name” the AI bot, although it could be tagged with another role, such as “partner,” or, if they are really trying to fool you, given a human name. If you see that, assume you are being recorded, whether or not it has been disclosed. This is where many executives feel the use of AI crosses the line into “creepy.” If it bothers you, say something and ask that the AI bot be removed from the meeting.
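The heuristic above can be sketched in a few lines. This is purely illustrative (my own toy example, not a Zoom API – the participant list here is a made-up data structure), and as noted, a bot hiding behind a human name will slip past a name check like this.

```python
# Names commonly used to tag recording bots (assumed list for illustration).
BOT_TAGS = {"note taker", "notetaker", "assistant", "recorder"}

def likely_ai_bots(participants):
    """Flag participants with no video whose name matches a known bot tag.

    `participants` is a list of {"name": str, "video_on": bool} dicts.
    """
    return [p["name"] for p in participants
            if not p["video_on"] and p["name"].strip().lower() in BOT_TAGS]

meeting = [
    {"name": "Dana", "video_on": True},
    {"name": "Note Taker", "video_on": False},
    {"name": "Sam", "video_on": False},  # camera off, but a human-looking name
]
print(likely_ai_bots(meeting))  # ['Note Taker']
```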


When it comes to “reading the room,” I believe that people with training are more reliable than AI

To prepare for media interviews on this topic, I spoke with the CEO of one of the few companies offering body-language-reading AI. Their approach has the AI note body movements, such as when someone leans in or out; these movements usually indicate levels of engagement. However, leaning back can mean disengagement, or it can simply mean that someone’s back is bothering them. This is where human judgment has an advantage over AI. Another area is recognizing microexpressions: split-second expressions or body-language movements that AI often misses. I’ve trained many executives to pick up these fleeting nonverbals, and it has helped them gain more influence in meetings. If you’d like to learn more about how to read people in virtual meetings or in person, visit ReadTheZoom.com.