June 26, 2018
Emotion tracking: Driving trust in AI
By: Naomi Nishihara

Following several high-profile crashes, 73 percent of American drivers now say they would be too afraid to ride in a self-driving car, up from 63 percent in 2017. Additionally, 63 percent of U.S. adults say they would feel less safe walking or biking with self-driving vehicles on the road. Even though autonomous cars are ultimately expected to make roads safer and more efficient, each accident brings the technology under scrutiny and shakes public confidence.

With the commercial success of self-driving cars dependent on widespread acceptance, companies must work to rebuild public trust. One of our 2018 Tech Vision trends, “Citizen AI,” highlights the need for transparency and explainability in AI systems (like those in self-driving cars), recognizing that without trust, even the best technologies can’t succeed. In this year’s Technology Vision survey, 72 percent of executives reported that their organizations seek to gain customer trust and confidence by being transparent in their AI-based decisions and actions.

One Silicon Valley startup is adding another facet to its effort to build trust: Renovo Auto has partnered with Affectiva, an AI startup, to build a self-driving car system that can identify human emotions. In the 1970s, Paul Ekman, a pioneer in the study of facial expressions, linked thousands of muscle movements, like the wrinkle of your nose or the corners of your eyebrows, to emotions. Affectiva now monitors passenger and pedestrian emotions by using computer vision algorithms to detect those movements; deep learning models then classify the resulting facial expressions, which are mapped to emotions.
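To make that three-stage pipeline concrete, here is a minimal sketch in Python. The face detector is OpenCV’s bundled Haar cascade; the expression classifier is a stub standing in for a trained deep learning model, and the expression-to-emotion table is a hypothetical illustration, not Affectiva’s actual system.

```python
# Illustrative sketch of an emotion-tracking pipeline (not Affectiva's system).
# Stage 1: locate faces. Stage 2: classify the facial expression.
# Stage 3: map the expression to an emotion, per Ekman's idea that
# specific muscle movements correspond to emotions.
import cv2

# Hypothetical mapping from classified muscle movements to emotion labels.
EXPRESSION_TO_EMOTION = {
    "brow_furrow": "anger",
    "nose_wrinkle": "disgust",
    "lip_corner_pull": "joy",
    "inner_brow_raise": "sadness",
}

def detect_faces(frame):
    """Stage 1: return bounding boxes for faces in a BGR video frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

def classify_expression(face_crop):
    """Stage 2 (stub): a real system would run a trained deep model here."""
    return "lip_corner_pull"  # placeholder prediction

def track_emotions(frame):
    """Run the full pipeline on one frame and return emotion labels."""
    emotions = []
    for (x, y, w, h) in detect_faces(frame):
        expression = classify_expression(frame[y:y + h, x:x + w])
        emotions.append(EXPRESSION_TO_EMOTION.get(expression, "neutral"))
    return emotions
```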

Much of human communication happens nonverbally, through facial expressions, eye contact, posture, and other body language. Emotion tracking could improve the autonomous or semi-autonomous driving experience by letting the car respond to those nonverbal cues: reminding inattentive drivers to watch the road, slowing down if passengers look scared, or scanning pedestrian expressions to better gauge what is happening outside the car. Insight into human emotions as people interact with the technology could also help companies figure out what makes people comfortable with self-driving cars, and design vehicles that maximize consumer trust.
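As a thought experiment, those in-car responses could be written as a simple decision policy. Everything below, from the state names to the actions, is invented for illustration; a real vehicle stack would be vastly more involved.

```python
# Illustrative policy mapping detected emotional states to vehicle responses.
# States and responses are hypothetical examples drawn from the scenarios
# above, not a real autonomous-driving control system.

def respond_to_cues(driver_attentive: bool, passenger_emotion: str,
                    pedestrian_emotion: str) -> list[str]:
    actions = []
    if not driver_attentive:
        # Semi-autonomous mode: nudge an inattentive driver.
        actions.append("issue attention reminder")
    if passenger_emotion == "fear":
        # Scared passengers may signal the car is driving too aggressively.
        actions.append("reduce speed")
    if pedestrian_emotion in ("fear", "surprise"):
        # A startled pedestrian hints at something happening outside the car.
        actions.append("rescan surroundings and increase following distance")
    return actions

print(respond_to_cues(False, "fear", "neutral"))
# ['issue attention reminder', 'reduce speed']
```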

And emotion tracking could be used for much more than self-driving cars. MindMaze, a virtual reality (VR) startup, has developed MASK, a device that can read the emotions on your face while you’re in VR. According to TechCrunch, the company hopes to commercialize the interface between brain and VR, and believes MASK could also be a valuable medical tool. Several tech giants are exploring the technology as well: Apple acquired Emotient, a company that developed emotion-tracking AI; Microsoft offers an API for tracking emotions; and Facebook holds several emotion-tracking patents, including one titled “Techniques for emotion detection and content delivery,” which includes a flowchart for tracking user emotions through smartphone cameras. Tying emotions to the content a user is viewing could influence what Facebook chooses to show that user in the future; for instance, it could steer PTSD patients away from triggering videos or images.
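For a sense of how such a cloud API is typically consumed, here is a minimal sketch of a REST call in the style of Microsoft’s Cognitive Services. The endpoint URL, the credential, and the response shape shown are assumptions for illustration; the vendor’s current documentation defines the real contract.

```python
# Generic sketch of calling a cloud emotion-recognition API. The endpoint,
# key, and response shape are illustrative assumptions, not a verified API.
import requests

ENDPOINT = "https://example.cognitive.azure.example/emotion/recognize"  # hypothetical
API_KEY = "your-subscription-key"  # hypothetical credential

def recognize_emotions(image_url: str) -> dict:
    """Submit an image URL and return per-face emotion scores (assumed shape)."""
    response = requests.post(
        ENDPOINT,
        headers={
            "Ocp-Apim-Subscription-Key": API_KEY,  # common Azure auth header
            "Content-Type": "application/json",
        },
        json={"url": image_url},
        timeout=10,
    )
    response.raise_for_status()
    # Assumed response: a list of faces, each with per-emotion scores, e.g.
    # [{"scores": {"happiness": 0.93, "sadness": 0.01, ...}}]
    return response.json()
```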

Of course, companies will need to balance the use of emotion-tracking technology against privacy concerns, as consumers are increasingly wary about how their data is used—and this is data at a very personal level. But with the right balance of access and innovation, emotion tracking can make products more effective and efficient—and help companies build trust in their technologies.

To learn more about “Citizen AI” and other trends, check out the 2018 Technology Vision.
