Siri and Alexa get on their owners’ nerves


Compton tried again: “Hey Alexa, play some classical music.”

“Here’s a station you might like,” Alexa said hesitantly, adding that the songs were hosted on Amazon Music.

Americans have welcomed voice assistants into their homes on the promise that Siri, Alexa and Google Assistant would act like quasi-human assistants, seamlessly managing our appointments, shopping lists and music libraries. From 2019 to 2021, the use of voice assistants among online adults in the United States increased from 21% to 30%, according to data from market research firm Forrester. Of the options, Siri is the most popular – 34% of us have interacted with Apple’s voice assistant in the past year. Amazon’s Alexa is next with 32%; 25% used Google Assistant; and Microsoft’s Cortana and Samsung’s Bixby trail behind with 5% each.

(Amazon founder Jeff Bezos owns The Washington Post.)

While usage is on the rise, social media jokes and dinner-table complaints paint voice assistants as automated family members who can’t get much done. The same human qualities that once made voice assistants feel novel make us cringe all the more when they fail to read the room. Overconfident, useless and a bit desperate, our voice assistants remind us of the people and conversations we least appreciate, experts and users say.

As Brian Glick, founder of Philadelphia-based software company Chain.io, puts it: “I’m not comfortable using voice assistants for things that have consequences.”

Users report that voice assistants are temperamental and often misinterpret instructions.

Talking to them takes “emotional labor” and “cognitive effort,” says Erika Hall, co-founder of consultancy Mule Design Studio, which advises companies on best practices for conversational interfaces. “It creates this kind of work that we don’t even know what to name.”

Take voice shopping, a feature Google and Amazon say will help busy families save time. Glick tried it, and he is still haunted by the memory.

Whenever he asked Alexa to add a product to his list – toilet paper, say – she would read out a long description of the item: “Based on your order history, I found Charmin Ultra Soft Toilet Paper Family Mega Roll, 18 Count.” In the time he spent waiting for her to stop talking, he could have finished shopping, Glick said.

“I get angry just thinking about it,” he added.

Then there are voice assistant personalities. Why does Google Assistant confidently say “of course!” before providing an “incredibly incorrect” response to a request, Compton asked. Why is Alexa always bragging about her abilities and asking if you’d like her to do extra tasks, TikTok creator @OfficiallyDivinity wonders in a video. She accuses Alexa of being a “pick me,” a term for women who are willing to walk over others for approval. The clip has over 750,000 views.

Voice assistants have become punchlines, and their creators are to blame, Hall says. In the rush to install voice assistants in every home, she says companies haven’t stopped to question whether they’re setting expectations too high by portraying such software as human helpers. They also didn’t think about what tasks were smart to do out loud, she said.

For example, shopping online freed us from unnecessary interactions with human employees, Hall said. With voice shopping, companies have reintroduced that friction. No one likes to argue about how many rolls of toilet paper come in a pack.

An Amazon spokeswoman said Alexa may “occasionally highlight experiences or information” that customers might find useful, and that certain notifications can be turned off in the Alexa app under Settings.

She said shopping features on Echo devices are “incredibly popular” and usage is growing “significantly” year over year, without providing specific numbers.

She added that Alexa’s language understanding has improved dramatically, even as user requests have grown more complex.

For its part, Google says it’s investing in language understanding and voice assistant technology to help it better handle nuance and respond in a natural way. An Apple spokeswoman said Siri’s basic functions have improved dramatically over the past few years thanks to advances in machine learning.

But there’s a deeper emotional issue at play, says Compton, who built the AI that powers Twitter bots like infinite cry and gender of the day. In developing voice assistants, she says, companies have ignored the often unspoken rules of human small talk. We use small talk to show others we’re on the same page – it’s a quick way to signal, “I see you, and I’m safe,” Compton said. According to philosopher Paul Grice, effective conversation should be four things: informative, true, relevant and clear.

Alexa’s comments are often none of those things, Compton said.

“Our vision is to make interacting with Alexa as natural as talking to another person, and we’re making several advancements in AI to make that a reality,” the Amazon spokesperson said.

Still, users say the awkward exchanges make us feel bad for Alexa – soulless, trapped in a smart speaker and desperate to be helpful. According to Compton, we also end up feeling bad about ourselves, because conversation requires two parties and a failure by one feels like a failure of both. Even if the sheer difficulty of building conversational AI is to blame, poor interactions with voice assistants leave us feeling broken, too.

“Whenever we talk about any of these things, we feel like we’re being bad,” Compton said.
