
Companies Already Use Facial Recognition to Analyze Their Customers’ Emotions – When Will Yours?

Going beyond biometrics.

The global facial recognition market was valued at more than $3.8 billion in 2020 and is expected to grow to $8.5 billion by 2025, an astounding 17.2% CAGR.

You may know this technology as a means of surveillance, but it’s much more than that. The market spans fields such as security, marketing, and TV and film production, among others.

Here’s the basic idea: our face reflects our internal emotional states, so by reading our facial expressions, we can tell what’s happening inside. Our face is the gateway to our emotions and desires.

Most of the time, we don’t consciously control our facial movements; we simply react, without filtering our emotional state. By measuring the changes on the face’s surface, we can tap into a more genuine reaction to the products and services being tested.

We want to know how our customers feel moment by moment. Instead of just running a survey after they experience our content, we can know precisely what they feel and when they feel it.
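
To make the “moment by moment” idea concrete, here’s a rough sketch of how a per-second emotion timeline could be built from a recorded session. The classify_expression function is a hypothetical stand-in for whatever facial-coding model is used; only the frame reading relies on OpenCV’s standard VideoCapture API.

```python
import cv2  # OpenCV, used here only to read frames from a recorded session


def classify_expression(frame):
    """Hypothetical stand-in for a facial-coding model.

    A real model would return per-emotion scores for the face in `frame`,
    e.g. {"joy": 0.7, "surprise": 0.1, "neutral": 0.2}.
    """
    return {"neutral": 1.0}  # placeholder so the sketch runs end to end


def emotion_timeline(video_path, samples_per_second=1):
    """Build a list of (timestamp_in_seconds, emotion_scores) pairs."""
    capture = cv2.VideoCapture(video_path)
    fps = capture.get(cv2.CAP_PROP_FPS) or 30.0  # fall back if FPS metadata is missing
    step = max(int(fps // samples_per_second), 1)

    timeline = []
    frame_index = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        if frame_index % step == 0:
            timeline.append((frame_index / fps, classify_expression(frame)))
        frame_index += 1

    capture.release()
    return timeline
```

The result is exactly the kind of data a post-experience survey can’t give you: a reaction score for every second of the content.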

Which companies are already using this tech to get ahead of their competitors?

Snapchat: in-house software development

The multimedia messaging company (images, video, and augmented reality filters) has positioned itself as the preferred social platform for young people between 15 and 28 years old, with users sharing around 400 million image and video messages every day.

With this much traffic, Snapchat is exploring its customers’ emotions with facial recognition tech, starting with the emotions of crowds gathered at different types of events.

Whether it’s a concert or a political speech, Snapchat’s software can track the audience’s emotional level throughout the whole event, moment by moment. Not only can it measure the public’s general feeling about the event, it can also pinpoint how the audience felt at any key moment.

Cross-reference that with what the speaker was saying at that moment, or with the specific song being played, and you get valuable information about your audience.

In other words, Snapchat is able to get a live feed of people’s emotions during an event.
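
As a rough illustration of that cross-referencing (not Snapchat’s actual pipeline), the sketch below averages the crowd’s emotion scores over labeled segments of an event timeline, such as individual songs or sections of a speech. The data formats are assumptions made for the example.

```python
from collections import defaultdict


def average_by_segment(emotion_timeline, segments):
    """Average per-moment emotion scores over labeled event segments.

    emotion_timeline: list of (timestamp_s, {"joy": 0.7, ...}) pairs
    segments:         list of (start_s, end_s, label), e.g. (0, 240, "Opening song")
    """
    sums = defaultdict(lambda: defaultdict(float))
    counts = defaultdict(int)

    for timestamp, scores in emotion_timeline:
        for start, end, label in segments:
            if start <= timestamp < end:
                counts[label] += 1
                for emotion, value in scores.items():
                    sums[label][emotion] += value
                break  # each sample belongs to at most one segment

    return {
        label: {emotion: total / counts[label] for emotion, total in emotions.items()}
        for label, emotions in sums.items()
    }
```

Feed it a per-second emotion timeline plus the event’s setlist or speech outline, and you get an average emotional reading per song or per section.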

Think of the producers, managers, or hosts at these gatherings. They would certainly want this information at hand, right? Imagine how they could adapt to their audience, guided by the emotional feedback they receive at any given moment.

They could make the changes needed to evoke a certain feeling in their audience. In a way, they would be communicating with one big organism and adapting to it over time.

Does your company have access to this kind of data?

Hyundai: human-machine interactions

The South Korean automaker is incorporating facial recognition technology to improve its customers’ experience of its cars. Using an in-cabin camera, the system can interpret the driver’s different emotional states and react accordingly.

For instance, if the software detects that the driver’s visual attention is fading and that they are showing signs of fatigue, it can play a specific sound to wake them up.
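
Under the hood, this kind of alert can come down to a simple rule on top of an attention signal. The sketch below is a generic illustration rather than Hyundai’s implementation: it fires when an assumed attention score (say, derived from eyelid closure or gaze tracking) stays below a threshold for several consecutive samples.

```python
from collections import deque


class FatigueMonitor:
    """Flag driver fatigue when attention stays low for several consecutive samples.

    Generic illustration only; the threshold, window size, and the attention
    score itself are assumed inputs, not values from any real system.
    """

    def __init__(self, threshold=0.4, window_size=5):
        self.threshold = threshold
        self.recent = deque(maxlen=window_size)

    def update(self, attention_score):
        """Feed one attention sample (0.0-1.0); return True when an alert should fire."""
        self.recent.append(attention_score)
        window_full = len(self.recent) == self.recent.maxlen
        return window_full and all(score < self.threshold for score in self.recent)


# Example: fire a wake-up sound once attention has been low for a sustained stretch
monitor = FatigueMonitor()
for score in [0.8, 0.5, 0.35, 0.3, 0.3, 0.25, 0.2]:
    if monitor.update(score):
        print("Attention low for several seconds: play wake-up sound")
```

Requiring a full window of low readings, rather than reacting to a single dip, keeps the system from nagging the driver every time they glance at a mirror.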

This type of tech allows for a more empathetic interaction with the driver. It enhances the driving experience by creating an AI system that responds more naturally to the driver’s personality and behavior.

For example, when the driver’s face shows joy, the AI can tell them a joke or talk to them in a more casual tone, adapting its message to the driver’s mood at the time.

Tech is helping AI to become more human, and drivers feel it.

To achieve this, Hyundai is working to make the AI recognize the driver’s emotional state more accurately; otherwise, it could misread the driver’s emotions and try to communicate without taking into account how they actually feel. This requires adding more sensors, especially around the driver’s seat, so the system can also measure physiological signals such as heart rate and skin conductance. That data should give a more accurate picture of the driver’s real state and, therefore, a more coherent interaction with the AI system.
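
One simple way to combine those signals is a weighted fusion of the camera-based estimate with the physiological readings. The sketch below is a generic illustration, not Hyundai’s design; the weights and normalization ranges are assumptions chosen for the example.

```python
def fused_stress_estimate(facial_stress, heart_rate_bpm, skin_conductance_us,
                          weights=(0.5, 0.3, 0.2)):
    """Combine camera and physiological signals into one stress score in [0, 1].

    facial_stress:       0.0-1.0 score from the facial-coding model
    heart_rate_bpm:      beats per minute, roughly normalized over 50-120 bpm
    skin_conductance_us: microsiemens, roughly normalized over 1-20 uS

    The weights and normalization ranges are illustrative assumptions.
    """
    def clamp01(x):
        return max(0.0, min(1.0, x))

    hr_norm = clamp01((heart_rate_bpm - 50) / (120 - 50))
    sc_norm = clamp01((skin_conductance_us - 1) / (20 - 1))

    w_face, w_hr, w_sc = weights
    return w_face * clamp01(facial_stress) + w_hr * hr_norm + w_sc * sc_norm


# Example: a moderately stressed expression, elevated heart rate, mild skin response
print(round(fused_stress_estimate(0.6, 95, 6), 2))
```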

These technological innovations are ultimately designed to make the driving experience more enjoyable.

Walmart: improving customer experience

The retail giant conducted a pilot program where researchers placed cameras at the checkout counters of several stores to track customer emotions. The cameras were able to detect facial expressions such as happiness, frustration, and anger, and the data was used to improve the customer experience.

For example, if the cameras detected that a customer looked frustrated, store managers could take steps to address the issue, such as sending an employee to help with bagging or providing extra assistance with a purchase. The data was also used to identify patterns and trends, such as peak times when customers were more likely to be frustrated, allowing managers to take proactive measures to improve the shopping experience.
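
The “patterns and trends” part is essentially an aggregation job: bucket the detections by hour and see where frustration peaks. Here’s a minimal sketch with a made-up event format standing in for whatever the checkout cameras would actually log.

```python
from collections import Counter
from datetime import datetime


def peak_frustration_hours(events, top_n=3):
    """Return the hours of day with the most 'frustrated' detections.

    events: list of (iso_timestamp, dominant_emotion) pairs, a made-up format
            standing in for whatever the checkout cameras log.
    """
    frustrated_by_hour = Counter(
        datetime.fromisoformat(timestamp).hour
        for timestamp, emotion in events
        if emotion == "frustrated"
    )
    return frustrated_by_hour.most_common(top_n)


# Example with toy data: frustration clusters around the evening rush
events = [
    ("2024-03-01T10:12:00", "happy"),
    ("2024-03-01T17:05:00", "frustrated"),
    ("2024-03-01T17:40:00", "frustrated"),
    ("2024-03-01T18:02:00", "frustrated"),
    ("2024-03-01T19:15:00", "neutral"),
]
print(peak_frustration_hours(events))  # [(17, 2), (18, 1)]
```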

The pilot program was conducted in several stores in the US, and according to Walmart, it led to a significant improvement in customer satisfaction and a reduction in customer complaints.

Facial recognition tech in Spain

In Spain, facial recognition technology is currently used mainly in the field of security.

CaixaBank lets customers withdraw money from an ATM just by showing their face, with no need to enter a PIN. BBVA lets you open a bank account with a selfie. In banking, facial recognition is mainly used to make it easier for customers to manage their assets.

In the airline industry, Aena, together with Air Europa, is letting passengers at Menorca airport board just by showing their face, without any additional documents: no boarding pass and no manual identity check.

The system is still in a trial period, but it is already showing clear benefits in its initial rollout.

It might lead to a much more comfortable flying experience.

The future of emotional analysis through facial coding

The most visible advances in this technology are happening in the area of biometrics. Few people know about the use of facial coding for emotional analysis, so we need to show that there are other possibilities for this technology as well.

At Alyze, we’ve developed an online platform that lets you create studies and measure people’s emotional reactions to your content. Clients can design and launch their own tests, or ask us to do it for them. We can measure how people react to different types of stimuli (visual, auditory, or audiovisual material) and map those reactions out moment by moment.

Once companies know how participants react to their products and services, they can target their consumers more effectively. We want to bring emotional measurement closer to our clients and help them get a real feel for their audience.

Think about it. How can you create a stronger bond with your customer if you don’t know how they feel about you?

Marketing departments should be running this kind of research continuously, and both small and large product launches should be part of that process.

Whether it’s an ad, a magazine cover, a movie trailer, or any other product or service a company delivers, people react to all of it.

Why not change the way we do research and make facial recognition a common practice across these industries?

Creative and financial decisions based on evidence can have a more lasting effect. 

Facial recognition software is already a reality, but the question remains: are you prepared for it?
