Article | 9 min read

Pamela Pavliscak on the danger and promise of emotional technology

By Susan Lahey

Last updated September 21, 2021

Silicon Valley is trying to crack the code on emotion. The tech world knows what research reveals: humans are more controlled by emotion than we like to think. We throw up a bunch of logical-sounding rationalizations for why we do what we do, but often those rationalizations are mere camouflage for the fact that we were moved by emotion. So Silicon Valley is getting serious about programming technologies to understand, interpret, and respond to our emotions—either to help us out, or to get us to buy stuff, or both. And while that has the potential to increase human empathy across the board, it can also be really scary.

As Pamela Pavliscak, author of Emotionally Intelligent Design: Rethinking How We Create Products, told me, “We have seen companies who already have access to our personal data not being especially responsible and respectful to that data. Now the same people want access to our emotions. How can we trust that that’s serving our interest and not theirs?”

Pavliscak (pronounced Pavlishock) is an expert on the relationship between tech and emotion. The founder of Change Sciences and a faculty member at Pratt Institute School of Information, she began studying the relationship years ago when she was consulting on the design of tech products and noticed a direct correlation between the proliferation of tech and a growth in depression. Her research has revealed that the emotional relationship between humans and technology is both complicated and largely unacknowledged.

For example, in her talks, she calls attention to how people feel when—in the midst of an emotionally charged conversation—they watch their phone while those three little dots undulate, communicating that the other person is typing. What is the name for that emotion? And what about the emotion that arises when the dots suddenly disappear? We feel something when a bot tells us we haven’t passed the test it gave us and so we can’t access our own data. And there are more ambiguous and complex emotions that can manifest when you’re taking a picture of something magnificent and suddenly realize it has been photographed a million times before. We don’t have labels for many of these emotions, much less strategies to cope with them. Nonetheless, companies are racing to exploit the connection without fully understanding it.

Pavliscak doesn’t fear that this will necessarily create a dystopian nightmare. Nor does she lean too far in the other direction and accept the tech world’s claim to be the champion of universal empathy. Rather, she stresses that emotion is multifaceted, layered, and nuanced, and humans don’t even fully understand the nature and origins of the spectrum of emotion that we experience. And artificial intelligence (AI), while great at recognizing patterns, is not so great at navigating nuance and ambiguity.

[Related read: 5 ways to bring your human(ity) to work]

“Emotion is probably one of the first things we know about as a person and it’s one we never master,” she said. There’s the physical aspect of emotion: how it registers in our faces and manifests in our brain chemistry, pulse, and perspiration. That physical layer is much of what emotional AI focuses on. But emotion is much bigger than that. “We’ve been looking at the history of emotion, how it changes over time and how that affects our language and our perception of emotion,” Pavliscak said. “Then there are the other layers of emotion: your personal history, your culture…. Then there’s constructed emotion: how emotion accrues meaning as you go through life and make connections. AI is getting at one tiny piece of one aspect of emotion and it does it very imperfectly.”

“We’ve been looking at the history of emotion, how it changes over time and how that affects our language and our perception of emotion.” – Pamela Pavliscak

Research backs her up. One recent study pointed out that a lot of assumptions are programmed into the AI that registers facial expressions. But a smile doesn’t always mean happiness. Sometimes it means submission. Sometimes it’s hiding anger. AI often can’t tell the difference.

Using tech to “Tell me who I am”

There’s a cultural narrative around the omnipotence of tech. Without acknowledging the limitations of AI, many articles promise that emotional tech will “understand us better than we understand ourselves.” Elon Musk claims humans are a “biological boot loader” for AI. “It’s as if humans are the problem and technology is the solution,” said Douglas Rushkoff, author of Team Human—one voice warning of the risk of deifying tech.

“There’s this belief that ‘machines will be able to understand this and sort it out,’” Pavliscak said. But while humans teach tech to understand [us], and tech can reflect back the patterns it’s seeing, tech’s reflection is incomplete and leaves out “a huge part of what makes us human.”

A lot of technology is created or designed to solve problems, Pavliscak said, but she noted that emotion and the meaning behind our emotions are not problems to be solved. “If we take the approach that they are, tech is going to fail us in a big way.” In fact, emotion can often flag a problem that does need to be solved. “One thing AI can do is understand patterns better than humans do, and tech has something to teach us in that,” she said.

But assuming tech can understand our inner lives at large and serve us up a definition of who we are is reductive, Pavliscak said, noting that each engineer, designer, investor, and programmer comes to emotional tech and AI projects with individual agendas and goals.

[Related read: Build a strong company culture by leading with EQ]

Emotional tech can be awesome

The salient question is, what is the goal of adding an emotional component to technology? Emotional tech can be enormously helpful. Sophisticated technologies are being developed to let your car help manage your road rage or fatigue to reduce accidents; help trauma victims reprocess their experiences; help people on the autism spectrum understand the emotions of those around them; and alert people wearing specific devices about potential mood shifts that they would want to head off—like stress or depression. More commonly, AI in customer support bots or IVRs can detect when a customer is getting emotional and decide whether the issue should move to an actual human. AI can detect the tone of an email you’re composing and let you know what it communicates before you hit send. Emotional tech could serve to reduce conflict and increase empathy.
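To make that escalation pattern concrete, here is a minimal Python sketch of sentiment-based routing. Everything in it is illustrative: the keyword weights, the score_frustration helper, and the 0.6 threshold are hypothetical stand-ins for the trained sentiment model and tuned cutoff a real support bot or IVR would use.

```python
# Illustrative sketch of sentiment-based escalation in a support bot.
# The markers, weights, and threshold below are hypothetical; a real
# system would score tone with a trained model over the conversation.

ESCALATION_THRESHOLD = 0.6

FRUSTRATION_MARKERS = {
    "ridiculous": 0.4,
    "unacceptable": 0.5,
    "cancel": 0.3,
    "!!": 0.3,
}

def score_frustration(message: str) -> float:
    """Crude keyword-based frustration score, clamped to [0, 1]."""
    text = message.lower()
    score = sum(weight for marker, weight in FRUSTRATION_MARKERS.items()
                if marker in text)
    return min(score, 1.0)

def route(message: str) -> str:
    """Keep the bot on routine messages; hand off emotional ones."""
    if score_frustration(message) >= ESCALATION_THRESHOLD:
        return "human_agent"  # customer is getting emotional: escalate
    return "bot"              # routine issue: the bot continues

print(route("How do I reset my password?"))              # -> bot
print(route("This is ridiculous, cancel my account!!"))  # -> human_agent
```

However the score is produced, the routing decision itself tends to be this simple: a score, a threshold, and a handoff.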

But what about tech designed primarily to use our emotional data to get us to buy stuff?

“A lot of tech was designed for engagement, to capture time and attention,” Pavliscak said. “It was designed with the questions in mind: ‘What causes people to spend all their time doing this? What draws the most attention?’” she said. “And it turns out that most of what draws our engagement is outrage, and schadenfreude, and loneliness. And that puts design into the mix. If designing to keep people engaged means designing for negative emotions, well, I think that’s going to be the next tide turn.”

By tide turn, Pavliscak means a new approach to how technology is designed to engage with our emotions. The world is just beginning to recognize and wrestle with the fact that our growing problem with loneliness and depression may be directly related to the impact of social media on humans’ brain chemistry.

Chamath Palihapitiya, former vice president of User Growth at Facebook, put it in no uncertain terms to an audience of Stanford students: “The short-term, dopamine-driven feedback loops that we have created are destroying how society works.” Social media taps the same neural circuitry that slot machines and cocaine use to keep us hooked.

For many, the focus has shifted toward not just emotion, but mental health. Tapping into people’s emotions can either help or hurt their mental health.

[Related read: 7 innovative ways technology is transforming the patient experience]

“The conversation is more and more around mental health,” Pavliscak said. “There are groups that are trying to think about ways to hack mental health, reduce anxiety… there is lots of interesting work, both with and without emotional AI, looking at mental health and therapy from all kinds of different angles. That’s on the radar for me right now. One of the first steps toward making some positive changes involves conversations around ethics. It touches on everything.” To that end, Pavliscak is a member of an international committee focused on creating standards for ethical AI.

Forming emotional bonds with technology

If technology is affecting the bonds we form with each other, ethics will become even more important as humanoid bots are developed. “We develop emotional bonds with technology, and when it’s embodied, those bonds become even stronger,” she said. “Then it becomes even more crucial to respect humanity.”

Ultimately, she thinks regulation will be necessary, because it usually is. Would auto manufacturers have universally installed seatbelts if there hadn’t been regulation? Probably not. And regulation is usually the product of consumers pushing back against practices that they find dangerous or exploitative.

“We develop emotional bonds with technology, and when it’s embodied, those bonds become even stronger. Then it becomes even more crucial to respect humanity.” – Pamela Pavliscak

There are some people who believe emotion has no place in technology. But, as Pavliscak pointed out, we have a very emotional relationship with tech whether or not it’s intended. The need to understand and cope with our emotions, rather than just distancing ourselves from them, is a lifelong pursuit. People often fear they will be overwhelmed by their emotions, or manipulated through them, she said.

[Related read: Nadine Champion on truth, fear, and learning to take a punch]

“We spend our entire lives learning to understand, recognize, enable, discourage, or regulate our emotions. That’s the concept behind emotional intelligence; we’re constantly learning it,” she said. “It’s not because emotions are bad. I’ve come to believe there aren’t good or bad emotions. Sometimes there are negative emotions, and sometimes that’s something you want to embrace for a while. In American culture we have ideas about bad or good emotions that aren’t helpful. For example, men aren’t allowed to show or have certain emotions…that’s what the culture accepts and teaches. The danger of all of this is the over-simplifying and not paying attention to it.”

But if we do pay attention, and employ ethics conscientiously, there is a lot of promise to go along with the potential risks, Pavliscak said. “In looking at mental health, maybe there’s something missing there. Perhaps there is a way of combining human and machine intelligence that can help.”
