by Tomoko Yokoi Published 28 April 2021 in Technology • 4 min read
The simple technology that reads facial expressions may be great for business, but there are ethical concerns as well as question marks about its reliability
My facial muscles were getting a good workout as I contorted them into different expressions. I thought I was putting on my "happy" face, but my expression was interpreted as fearful. I next tried to hiss like a cat, showing my teeth and scrunching up my nose. This expression was interpreted as angry. I spent the next five minutes trying to express the six basic emotional states: anger, contempt, happiness, surprise, disgust, and fear.
I was exploring a website, emojify.info, which was launched this month by researchers hoping to raise awareness and encourage dialogue about emotion AI, an emerging technology that is raising alarm amongst ethicists while whetting the appetite of businesses and investors. "We need to be having a much wider public conversation and deliberation about these technologies," Dr Alexa Hagerty, project lead and researcher at the University of Cambridge, told The Guardian newspaper. "It claims to read our emotions and our inner feelings from our faces."
While humans might currently have the upper hand on reading emotions, machines are gaining ground using their own strengths. Emotion AI seeks to learn and read emotions by decoding facial expressions, analyzing voice patterns, monitoring eye movements, and measuring neurological immersion levels. The field dates back to at least 1995, with the work of MIT professor Rosalind Picard, and is starting to show promise with real use-cases.
Businesses are starting to explore emotion AI to improve customer and employee experiences, and analysts project that the technology will generate an $87bn market. Its ease of use, given that it works at a distance with nothing but a camera, could help businesses anticipate which products and services to offer consumers. One obvious use is in market research, to assess how consumers react to advertising.
Market research agency Kantar Millward Brown records footage of consumers' facial expressions and analyzes it frame by frame to assess their mood. "You can see exactly which part of an advert is working well and the emotional response triggered," Graham Page, managing director of offer and innovation at Kantar Millward Brown, told the BBC.
Increasingly, emotion AI has moved beyond advertising and is being used in situations such as job hiring, airport security, and even education. During the COVID-19 pandemic, some schools in Hong Kong leveraged emotion AI to gauge whether students remained engaged in doing their homework. While they were studying at home, the AI measured students' facial muscle points using their computer cameras to identify their emotions. Combined with other data points, such as how long students took to answer questions and their prior academic history, the AI program generated a report that included a forecast of their grades.
"The paradigm is not human versus machine, it's really machine augmenting human," explained Rana el Kaliouby, CEO of Affectiva, in an interview with MIT. But experts are skeptical as to who is really in control.
Take the example of Cogito, an artificial intelligence program designed to help customer service agents by listening to tone, pitch, word frequency, and hundreds of other factors. When it hears a strained tone, the program sends the agent a message marked with a pink heart: "Empathy cue: Think about how the customer is feeling. Try to relate." Machines using emotion AI are no longer just attempting to interpret human emotion; they are also starting to simulate those emotions and, in turn, to prod humans to be more empathetic.
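The mechanism described above, audio features feeding a rule that triggers a coaching prompt, can be illustrated with a toy sketch. This is not Cogito's actual system; the feature names and thresholds are invented purely for illustration.

```python
from typing import Optional

# Toy illustration (not Cogito's real logic): score a call segment for
# "strain" from a few hypothetical audio features and, if enough signals
# fire, emit the empathy prompt quoted in the article.

def empathy_cue(pitch_variance: float, speech_rate_wpm: float,
                interruptions: int) -> Optional[str]:
    """Return a coaching prompt when the features suggest caller strain.

    All thresholds below are made up for the sake of the example.
    """
    strain_score = 0
    if pitch_variance > 40.0:       # unusually variable pitch
        strain_score += 1
    if speech_rate_wpm > 180:       # fast, pressured speech
        strain_score += 1
    if interruptions >= 3:          # frequent talk-overs
        strain_score += 1
    if strain_score >= 2:           # two or more signals -> prompt the agent
        return "Empathy cue: Think about how the customer is feeling. Try to relate."
    return None

print(empathy_cue(55.0, 200, 1))   # strained call -> prompt
print(empathy_cue(10.0, 120, 0))   # calm call -> None
```

Real systems replace the hand-set thresholds with models trained on labeled calls, but the human-facing output is the same kind of nudge.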
Cogito executives describe their software as a type of coaching software, which TIME journalist Alejandro de la Garza describes as "unsettling." As de la Garza points out: "If AI can 'coach' the way we speak, how much more of our lives may soon be shaped by AI input? And what do we really understand about the way that influence works?"
As much excitement as there is regarding the technologyâs potential, there exists an equal amount of concern.
Emotions are complex, and scientists say it is impossible to judge how someone feels from their facial expressions alone. After reviewing more than 1,000 studies, researchers concluded that the relationship between facial expressions and emotions is nebulous. In their report, the scientists identified three key shortcomings in the science of tracking emotions on the face:
1. Limited reliability: someone may be scowling because they are tired rather than feeling angry.
2. Lack of specificity: there is no unique mapping between a facial expression and a category of emotion.
3. Lack of generalizability: the effects of culture and context are not taken into account.
The science of emotions and facial expressions, on which many emerging commercial applications are based, is controversial. In addition, like other forms of facial recognition technology, emotion AI is encoded with bias: a study revealed that emotion AI assigns more negative emotions to black men's faces than to white men's. It also raises questions about privacy and mass surveillance, as well as about its appropriate use in high-stakes situations such as law enforcement.
Emotion AI has the potential to impact our lives in profound ways. With the technology still in its relatively nascent stages, many are advocating for the public to get informed and involved to help steer the discussion. Should this technology be used, and how? Dr Hagerty, who launched the citizen science project emojify.info, shares a simple call to action from digital activist Joy Buolamwini: "If you have a face, you have a place in the conversation."
Researcher, TONOMUS Global Center for Digital and AI Transformation
Tomoko Yokoi is an IMD researcher and senior business executive with expertise in digital business transformations, women in tech, and digital innovation. With 20 years of experience in B2B and B2C industries, her insights are regularly published in outlets such as Forbes and MIT Sloan Management Review.