Introduction
We know what Artificial Intelligence is, but do we know what Artificial Emotional Intelligence, or Emotional AI, is all about? The sci-fi movie Ex Machina gives us a glimpse of its future: robots programmed to react to and analyse human emotions, and to learn from them to become more humanlike. Depending on how you view the ending, that can be scary, but we are not there yet. Right now we are still at the beginning, only scratching the surface of a huge potential. So what can marketers learn from it to leverage their brands?
What does Emotional AI achieve?
Firstly, what is Emotional AI? It refers to a broad range of technologies that aim to automate the objective measurement of opinions, feelings and behaviours. They rely on natural language processing (NLP) and natural language understanding (NLU), combined with modern psychology, to draw information about what people think and feel. Increasingly, it also involves face- and voice-recognition technologies to analyse the tone, facial expressions and mood of the speaker.
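To make the NLP side more concrete, here is a minimal sketch of text-based emotion analysis. It assumes the Hugging Face transformers library and uses one publicly shared emotion-classification model as an example; the model name and the sample reviews are illustrative choices, not a specific vendor's product.

```python
# Minimal sketch: inferring the emotion expressed in short customer texts.
# Assumes the `transformers` package (plus a backend such as PyTorch) is installed;
# the model name below is one example of an openly shared emotion classifier.
from transformers import pipeline

emotion_classifier = pipeline(
    "text-classification",
    model="j-hartmann/emotion-english-distilroberta-base",
)

reviews = [
    "I absolutely love this new phone, the camera is stunning!",
    "The delivery was late again and nobody answered my emails.",
]

for text in reviews:
    top = emotion_classifier(text)[0]  # top predicted emotion and its confidence
    print(f"{top['label']:>8} ({top['score']:.2f})  {text}")
```

In a marketing setting, the same idea would be applied to reviews, support tickets or social posts, and combined with voice and facial signals.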
In a market estimated to grow to $41 billion by 2022, giants like Google, Apple, Facebook and Amazon, commonly known as GAFA, are already working actively on emotional AI. If your devices know you, they can speak to you, describe you and influence you better than you realise. This is the kind of humanisation of machines we constantly see in sci-fi movies. Imagine what a virtual influencer or chatbot could achieve if it knew what you FEEL, what you THINK and how you REACT to an event, simply by using your microphone, your webcam and deep learning. Even the best salesman could never hold such intimate data about you.
But there are still many dangers to using emotional AI in marketing:
- Firstly, collecting emotions is a highly intrusive experience, since they are among the most personal data anyone could gather. Do my customers know, and agree, that I am capturing their facial movements through their camera, or recording their voice through connected speakers like Alexa or Google Home? And if you also cross that data with geolocation (someone goes to a shrink every Friday, to the hospital every Saturday, to the cinema to watch comedies, and so on), easily collected from smartphones, connected watches and many other devices, you could truly know and understand what your targets have in mind. That is frightening, especially from a political, ethical or religious perspective. It could become a powerful tool of manipulation: deployed just after shocking news such as a terrorist attack, an earthquake or a serious accident, it could genuinely sway people's minds.
- Another sensitive point: if your system does not interpret emotions well enough, it can be a disaster for your brand, especially for a new product. First impressions are so important for a new brand that you can imagine the impact of serving a humorous advertisement to a grieving prospect, or a dark, intense message to a happy one, at the wrong time; you ruin both your image and your message.
- And lastly, are systems robust enough to analyse and react instantly to a detected emotion? Given the previous two points, is it not too dangerous to interfere with a customer's mind with so little data collected about them? That is why many companies are already working on less sensitive applications, probably to learn and test the best ways to use emotions. Some examples include:
- Companies that want to measure how people react to a new advertisement
- Conversational tools like chatbots that speak, translate and interact with people in an intelligible, human way
- Emotional AI that detects emotions such as anger in drivers to help prevent accidents (see the sketch after this list)
- The British and Chinese governments, with different goals in mind, using tools to collect, understand and analyse their citizens' sentiments about policies and information
- Companies using connected bracelets to monitor and reduce employee stress, helping to prevent poor decisions
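As an illustration of the in-car use case above, here is a hedged sketch of how sustained driver anger might trigger a safety response. OpenCV is used only to grab camera frames; classify_emotion() is a hypothetical placeholder for whatever facial-expression model would actually be deployed, and the threshold values are assumptions.

```python
# Hedged sketch of the driver anger-detection idea.
# `cv2` (OpenCV) grabs frames from an in-cabin camera; classify_emotion()
# is a HYPOTHETICAL stand-in for a trained facial-expression model.
import cv2

ANGER_THRESHOLD = 0.7       # assumed confidence above which a frame counts as "angry"
CONSECUTIVE_FRAMES = 30     # roughly one second at 30 fps, to avoid one-off false alarms


def classify_emotion(frame):
    """Hypothetical helper: return (label, confidence) for the driver's face.
    A real system would call a trained facial-expression model here."""
    return "neutral", 0.0  # placeholder result so the sketch runs end to end


def monitor_driver():
    capture = cv2.VideoCapture(0)   # in-cabin camera
    angry_streak = 0
    try:
        while True:
            ok, frame = capture.read()
            if not ok:
                break
            label, confidence = classify_emotion(frame)
            if label == "angry" and confidence >= ANGER_THRESHOLD:
                angry_streak += 1
            else:
                angry_streak = 0
            if angry_streak >= CONSECUTIVE_FRAMES:
                # The reaction is up to the manufacturer: suggest a break, soften the music, etc.
                print("Sustained anger detected: suggesting a break to the driver")
                angry_streak = 0
    finally:
        capture.release()
```

Requiring several consecutive angry frames is a simple way to avoid reacting to a single misread expression, which echoes the robustness concern raised above.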
Conclusion
Emotional AI can feel scary and intrusive, and we may not be ready to embrace it fully at this stage. But the technology can also bring real benefits. Safety applications that help people in their daily occupations will lead them to accept systems that collect data about them, and such devices can also improve human aptitude and accuracy. AI friendship is an ongoing area of research aimed at creating a bond between systems, services, devices and brands on one side and users on the other, even if it also risks deepening technology addiction. Since people are wary of technology, why not have an AI friend that knows how you feel and can help you navigate your life? Right now, the most important thing is to lead people to trust emotional AI and to find natural uses for it. In that sense, empathy will probably be the key to this revolution.
Resources
https://medium.com/@cauraco/emotional-ai-what-is-it-ee3e63678679
https://hbr.org/2018/07/3-ways-ai-is-getting-more-emotional
https://www.forbes.com/video/5850986784001/#5a5892d54067
https://atelier.bnpparibas/en/prospective/article/emotional-ai-coming-soon
https://emerj.com/ai-podcast-interviews/can-businesses-use-emotional-intelligence/
https://www.bbvaopenmind.com/en/ai-systems-dealing-with-human-emotions/
https://www.gartner.com/smarterwithgartner/13-surprising-uses-for-emotion-ai-technology/