Empathic AI: Becoming Human

One of the most distinctive traits humans possess is the ability to empathize with each other. Actions, objects, devices, conversations… all carry different layers of importance for us according to the meaning we give them. What we feel about something shapes how we understand it and, therefore, how attached to it we become.

That characteristic is inherent to who we are. We function thanks to, and in spite of, our emotions. Take IKEA. The Swedish giant owes much of its success to a distinctive strategy: instead of handing us a finished product, it lets us take control of it, from the first piece to the final result. Everybody struggles assembling IKEA furniture, but in that process we also grow attached to what we built. We feel proud that we managed it, and so we give it meaning.

IKEA is a good example of how we become more attached to things that are truly ours, customized by and for us. That is what gives them a particular meaning.

Technology and new devices tell the same story. There are thousands of phones out there sharing specifications, features, and hardware; without the human element, they are essentially identical. Yet I can be sure that every phone has its owner, and for him or her it is the only one, it is different.

So far, technology has been developed following human patterns, trying to become as human as it can. The desktop layout mirrors how our eyes scan a workspace; files and folders echo the way our brains organize information; and software interfaces are built from shapes we understand at a glance.

But there is one thing technology hasn’t achieved yet: empathy. That might be about to change in the next few years…

Machines will read and respond to human cognitive and emotional states, just the way humans do

AI is the technology to watch

Today, an emerging category of AI—artificial emotional intelligence, or emotion AI—is focused on developing algorithms that can identify not only basic human emotions such as happiness, sadness, and anger but also more complex cognitive states such as fatigue, attention, interest, confusion, distraction, and more.

According to Affectiva CEO Rana el Kaliouby, “the field is progressing so fast that I expect the technologies that surround us to become emotion-aware in the next five years. They will read and respond to human cognitive and emotional states, just the way humans do.”

She adds, “Emotion AI will be ingrained in the technologies we use every day, running in the background, making our tech interactions more personalized, relevant, authentic, and interactive. It’s hard to remember now what it was like before we had touch interfaces and speech recognition. Eventually we’ll feel the same way about our emotion-aware devices.”

She and her team at Affectiva are building this emotion-driven AI technology. They have compiled a vast corpus of six million face videos collected in 87 countries, allowing their AI engine to be tuned to real expressions of emotion in the wild and to account for cultural differences in emotional expression. Using computer vision, speech analysis, and deep learning, they classify facial and vocal expressions of emotion.
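
Affectiva’s models are proprietary, but the general recipe described above, deep learning applied to detected face crops, can be sketched in a few lines. The PyTorch snippet below is purely illustrative: the tiny architecture, the 48x48 grayscale input (a size common in public expression datasets such as FER2013), and the seven-emotion label set are assumptions of mine, not Affectiva’s actual pipeline.

```python
# Illustrative sketch of a facial-expression classifier, showing the general
# "computer vision + deep learning" approach described above. This is NOT
# Affectiva's model; the architecture, input size, and labels are assumptions.
import torch
import torch.nn as nn

EMOTIONS = ["happiness", "sadness", "anger", "surprise", "fear", "disgust", "neutral"]

class EmotionCNN(nn.Module):
    def __init__(self, num_classes: int = len(EMOTIONS)):
        super().__init__()
        # Two small convolutional blocks over a 48x48 grayscale face crop.
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # 48x48 -> 24x24
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # 24x24 -> 12x12
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 12 * 12, 128), nn.ReLU(),
            nn.Linear(128, num_classes),  # one score per emotion label
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = EmotionCNN()
    face = torch.randn(1, 1, 48, 48)  # stand-in for a detected face crop
    probs = torch.softmax(model(face), dim=1)
    print(dict(zip(EMOTIONS, probs.squeeze().tolist())))
```

In a real system, a face detector would supply the crop and the network would be trained on labeled video frames; the point here is only how compact the core classification step is once a corpus like Affectiva’s exists.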

Devices with emotions

The question here is which devices will be the first to ship with these new emotion-aware AI technologies, and how. Rana el Kaliouby lists the applications she is most excited about:

  • Automotive: An occupant-aware vehicle could monitor the driver for fatigue, distraction, and frustration. Beyond safety, your car might personalize the in-cab experience, changing the music or ergonomic settings according to who’s in the car.
  • Education: In online learning environments, it is often hard to tell whether a student is struggling. By the time test scores are lagging, it’s often too late—the student has already quit. But what if intelligent learning systems could provide a personalized learning experience? These systems would offer a different explanation when the student is frustrated, slow down in times of confusion, or just tell a joke when it’s time to have some fun.
  • Health care: Just as we can track our fitness and physical health, we could track our mental state, sending alerts to a doctor if we chose to share this data. Researchers are looking into emotion AI for the early diagnosis of disorders such as Parkinson’s and coronary artery disease, as well as suicide prevention and autism support.
  • Communication: There’s a lot of evidence that we already treat our devices, especially conversational interfaces, the way we treat each other. People name their social robots, they confide in Siri that they were physically abused, and they ask a chatbot for moral support as they head out for chemotherapy. And that’s before we’ve even added empathy. On the other hand, we know that younger generations are losing some ability to empathize because they grow up with digital interfaces in which emotion, the main dimension of what makes us human, is missing. So emotion AI just might bring us closer together.

Our personality, exposed

As sci-fi movies show, there is also a dark side to emotional AI technologies. Companies could use them in abusive ways, since they would know almost everything about people: they could learn from customers and then use that information for marketing purposes.

Rana el Kaliouby thinks it is hard to get more personal than data about your emotions: people should have to opt in to any kind of data sharing, and they should know what the data is being used for.

And she sends a warning to those experimenting with this AI technology. “We’ll also need to figure out if certain applications cross moral lines. We’ll have to figure out the rules around privacy and ethics. And then we have to work to avoid building bias into these applications. But I’m a strong believer that the potential for good far outweighs the bad.”