We analyze what people feel while they are watching an image, a video, or even listening to a soundtrack, without having to ask any questions – all by tracking their facial expressions.
We use panels of volunteers who opt in, just as they would for a quiz or a survey, and simply enable their webcam while they watch a commercial, a movie, a TV format or a series of images. This enables us to capture viewers' subconscious responses at scale and report the data online. The approach is a win for everyone: companies know where to optimize their video content and which audiences will be most engaged, meaning customers don't see boring or irrelevant content.
How does it work?
Our solution is based on artificial intelligence: we teach machines to interpret, quantify and visualize human emotions. We have the largest emotional database in the world that is linked to real-world outcomes: over the last 10 years we have collected billions of data points in more than 75 countries. When we talk about facial tracking, we must remember that, for example, a Korean face does not show the same expressions as a French or South American face. Yet we all feel universal emotions like joy, surprise and sadness, however we show them. What can be difficult for a human eye to interpret is no problem for a computer: the machines are self-learning, interpreting micro-expressions and mapping them to emotional states. Humans validate the data, so over time the machines become more accurate at decoding expressions.
Our approach involves two stages: data collection, then reporting and analysis. For the data collection, we show millions of people videos or pictures, after first asking them about things like their age, level of education or income. All of this is collated completely anonymously: only the data points mapped to their faces are recorded in our database.
This data collection is not the most complex part. What is complex is translating these sets of points into emotions. When a vector between two points on a face changes, it can be a tiny fold of the eyelid, a pout, a movement of the head, or an asymmetry. To determine which points mean joy, sadness, confusion, disgust and so on, we work with a team of fifty neuroscientists from around the world. They use our data to fuel their research and, in return, help us develop our algorithms. It's true collaboration on an epic scale: a synergy between the scientific and academic communities and our own business of emotional analytics.
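The idea of "vectors between two points" on a face can be made concrete with a small sketch. The landmark pairs, feature layout and the smile example below are all illustrative assumptions, not Realeyes' actual scheme; the point is only that displacements between tracked points become numeric features that a classifier can map to emotions.

```python
import math

def landmark_vectors(points, pairs):
    """For each chosen pair of landmarks, record the displacement
    vector (dx, dy) and its length as features."""
    feats = []
    for i, j in pairs:
        dx = points[j][0] - points[i][0]
        dy = points[j][1] - points[i][1]
        feats.extend([dx, dy, math.hypot(dx, dy)])
    return feats

# Toy example: two hypothetical mouth-corner landmarks.
neutral = [(0.0, 0.0), (1.0, 0.0)]
smiling = [(-0.1, -0.05), (1.1, -0.05)]  # corners pulled apart and up
pairs = [(0, 1)]

f_neutral = landmark_vectors(neutral, pairs)
f_smiling = landmark_vectors(smiling, pairs)
# The mouth-width feature grows when the corners pull apart,
# which a trained model could associate with "joy".
print(f_smiling[2] > f_neutral[2])  # True
```

In a real system these features would feed a trained model rather than a hand-written rule; the sketch only shows the geometry-to-feature step.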
Can you tell us about the innovative aspects of your solution?
What’s innovative in our solution is that it is based on the audience, on what people really feel.
When you advertise, what counts, in general? Typically, it's what the brand, artistic director, marketing director or agency thinks about it... but they aren't the targets! We try to take into account what the target audience thinks, but we rarely ask them for their opinion. And even when we do, that opinion is often expressed in subjective or vague terms: I like it, I don't like it... We go further: instead of relying on what people declare, we look for subconscious responses and can visualize, second by second, the curves of the different emotions as they are felt throughout the viewing.
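A second-by-second emotion curve is essentially per-frame scores averaged into time buckets. The frame timestamps, emotion names and score values below are invented for illustration; only the bucketing logic is the point.

```python
from collections import defaultdict

# Hypothetical per-frame model output: (timestamp in seconds, scores).
frames = [
    (0.2, {"joy": 0.1, "surprise": 0.0}),
    (0.7, {"joy": 0.3, "surprise": 0.1}),
    (1.1, {"joy": 0.6, "surprise": 0.2}),
    (1.8, {"joy": 0.8, "surprise": 0.1}),
]

def emotion_curves(frames):
    """Average per-frame emotion scores into one point per second,
    giving a curve that can be plotted against the video timeline."""
    buckets = defaultdict(list)
    for t, scores in frames:
        buckets[int(t)].append(scores)
    curves = {}
    for sec in sorted(buckets):
        group = buckets[sec]
        curves[sec] = {emo: sum(s[emo] for s in group) / len(group)
                       for emo in group[0]}
    return curves

print(emotion_curves(frames))
# second 0: joy averages 0.2; second 1: joy averages 0.7
```

Aggregating the same curves across a whole panel of viewers, rather than one face, is what turns this into an audience-level engagement measure.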
Around 90% of our behavior is driven by our emotions, so for a brand it is important to establish an emotional connection with its customers. The Realeyes solution measures emotional engagement: we can measure the various emotions that drive people when they are exposed to a message, an advertising video or something else. By pushing better-targeted, more relevant content to the right audiences, we make it better accepted and more effective.
It is therefore a virtuous circle.
And you will be demonstrating this at the booth at Vivatech?
At Viva Technology, we will present a real-time facial tracking demonstration with a kind of emotional photo booth. Visitors will be able to position themselves and test their own expressions. We will capture the emotions expressed and will display them on the screen as a ratio of happiness, surprise, etc.
Even if it seems a bit of a gimmick, our philosophy is that complex things can be explained simply. We want to spread the word among the public and the market, showing emotional intelligence and what we do in a totally intuitive way.
In the same spirit, we will also present a video that explains our solution.
Do you already work with ENGIE?
Yes, and we also want to show a case study of our work with ENGIE.
We analyzed several videos for ENGIE to see whether the messages were understood and appreciated by the audience, and to measure metrics such as favorability or memorization. For a humorous video, for example, it is important to see whether audiences perceive the humor: by measuring the level of "joy" we can confirm that it is the dominant emotion and that confusion or negativity do not rise to the top. In the same way, we can visualize whether the target audiences are more engaged and whether the communication is successful. It's absolutely spectacular to visualize the emotions throughout the video; you don't have to be a data scientist to relate a particular scene, character or situation to the emotions it generates.
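The check described for a humorous video can be sketched as a small rule over averaged emotion scores. The emotion names, the example scores and the "negative emotions stay below half of joy" threshold are all assumptions made for illustration, not Realeyes' actual criteria.

```python
def dominant_emotion(scores):
    """Return the emotion with the highest average score."""
    return max(scores, key=scores.get)

def humour_lands(scores, negative=("confusion", "disgust", "sadness")):
    """A humorous clip 'works' if joy dominates and negative signals
    stay low (here: below half of the joy score, an arbitrary cutoff)."""
    return (dominant_emotion(scores) == "joy"
            and all(scores.get(e, 0.0) < scores["joy"] / 2 for e in negative))

clip = {"joy": 0.7, "surprise": 0.2, "confusion": 0.1, "sadness": 0.05}
print(humour_lands(clip))  # True
```

Running the same rule per scene, rather than over the whole clip, is what lets a non-specialist point at the exact moment where the humor failed to land.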
Applying emotional analytics is, in my opinion, a very good demonstration of both ENGIE's interest in verifying that its communication works well and the benefits of our technology.
What do you expect to get from participating in Vivatech?
We are counting on VivaTech to get exposure for Realeyes, to give a tangible demonstration of real-life applications of our solution, and to meet companies beyond ENGIE that might be interested in our work. ENGIE is already using Realeyes to better understand its customers' expectations around communication, using computer vision, machine learning and AI.
Emotions are essential.
And for you the future will be ...?
The future will be focused on the human factor and emotional intelligence, and communication will be primarily based on video.