Your Next Purchase Could Have a Personality - OZY | A Modern Media Company

Your Next Purchase Could Have a Personality

By Renee Morad


Because the objects around you will soon be able to read your emotions. 

Imagine a world where technology and emotional intelligence collide. When you log off at the end of the workday, the computer sends a message: “You seem a bit sluggish. Need an espresso?” You nod and climb into your car, which draws on a database of your driving habits and traffic patterns to find the most efficient route home in a matter of minutes. Once home, that espresso is already brewing: your computer sent the command, and your car relayed your ETA.

Hold on tight as we take a joyride into the future of emotional intelligence, the newest consumer trend that’s picking up momentum and leaving basic artificial intelligence in a cloud of dust.

“We’re going to see a continual evolution of products that try to react more appropriately with people,” says Richard Korf, a professor of computer science at UCLA. This new wave of technology, he explains, will specialize in “remembering your actions and responding in a way that’s most helpful to you.” Envision cars equipped with cameras that can sense road rage and play calming music to soothe the agitated driver, or robot-like personal assistants that can tell by tone of voice when “the boss” is irritable — and promptly offer suggestions to remedy a bad mood. In time, these empathic consumer products will know us better than our closest friends.

“We’ve spent the past 20 years digitizing our devices and our physical environments with smart home features and a connected lifestyle,” says Shawn DuBravac, chief economist of the Consumer Technology Association. “The next logical transition is to digitize our senses and our emotions.”

In a keynote at the Consumer Electronics Show in Las Vegas this January, Jen-Hsun Huang outlined the main real-world applications that will rely on artificial intelligence.

“We are going through, unquestionably, the most exciting time in tech,” said Huang, CEO of Nvidia, a tech company partnering with Audi to introduce artificially intelligent cars to the roads. “What we thought was science fiction is coming true as we speak.”


With the adoption of graphics processing unit acceleration and deep learning, Huang anticipates a future where autonomous cars respond to the driver’s mood, playing the driver’s favorite classical music, say, when they detect road rage. Soon, cars will also be capable of checking the family calendar, picking up the kids from after-school activities and returning to the office so the whole family can ride home together.

Automobiles aren’t the only products in the driver’s seat of this futuristic trend. Emotech’s robotic personal assistant Olly, created by a group of neuroscience and machine learning scientists, evolves to become more like its user over time. Meanwhile, the developers behind Amazon’s smart speaker Alexa say they’re striving to make the device even smarter by potentially tackling emotions and intonation, according to Geekwire.

Amazon’s Alexa may one day be able to read users’ emotions from their intonation. (Source: Amazon)

There are also emotionally attuned products designed to keep stress levels down. The Muse brain-sensing headband, for example, measures electrical signals from the brain and, through carefully selected audio prompts, uses that information to help users practice mindfulness. The watch-like Doppel emits a vibration against the inside of the wrist that feels like a heartbeat, exploiting the body’s natural response to rhythm to augment mood. The concept is fairly simple: A slower, simulated vibration should slow a user’s heartbeat and calm her, while a faster rhythm should energize.

While emotional intelligence is moving quickly to the forefront, some obvious challenges stand in the way of full-scale implementation. For starters, companies need to home in on accuracy of measurement to ensure the technology is correctly interpreting human emotion, which, as DuBravac notes, can be pretty complex. He also says people might be turned off by emotional tech that’s “on” all the time; someday we might crave alone time, away from the empathic products that surround us. For now, though, many companies are betting big that consumers won’t be put off and will value tech that truly “gets” them.  
