AI Replacing Humans? Not in these Cases

It’s not surprising that two of the first high-profile battles over AI in labor relations come from fields in which the human element is an absolutely critical part of the work.

Eric Krapf

June 1, 2023

3 Min Read

It didn’t take long for the human-vs.-chatbot/AI battle to hit the mainstream, most recently with the story of the National Eating Disorders Association (NEDA), an organization that made news when it announced it was replacing its paid helpline staff with a chatbot named Tessa, only to take the chatbot offline amid concerns about the responses it was giving.

The helpline staff pointed out that they were told of their firing shortly after they decided to unionize; NEDA countered that its move came as a result of concerns over legal liability. Then, shortly after the issue came to light in national media coverage, some users began publicizing messages they’d received in chats with Tessa that seemed to encourage harmful behaviors rather than trying to mitigate them. NEDA then took the chatbot down.

NEDA's story brings to light many of the issues that contact centers will face as the pressure grows for wider use of AI. It’s worth noting that the NEDA incident is happening at the same time as the Writers Guild of America strike, where concerns over AI’s potential to replace creative workers have been a central issue. It’s not surprising that two of the first high-profile battles over AI in labor relations come from fields in which the human element is an absolutely critical part of the work.

The experience with Tessa reinforces the concern that AI can’t provide the kind of empathy and insight that someone in crisis would hope to receive when they reach out to a hotline.

In the case of the writers’ strike, since generative AI is derivative by its nature, it would seem the battles over such systems’ use of intellectual property are just beginning. AI may not be able to produce creative or insightful or profound original works of art, but there are those who’d say that’s not necessarily a requirement for much of our entertainment today anyway—which is why the writers are worried.

One minor point, which leads to a bigger one about how end users are being encouraged to look at AI: It’s become common parlance to say that generative AI has a tendency to “hallucinate.” When I asked the speakers on a recent webinar what exactly it means for AI to hallucinate, the answer was, basically, it gives the wrong answer.

Saying that a computer “hallucinates” anthropomorphizes it, which is both inaccurate and dangerous. Machines can’t hallucinate because they have no consciousness, no matter what anyone wants you to believe. You’re not communicating with a human equivalent. Saying that an AI chatbot hallucinates is like saying your calculator hallucinated if it told you 2 plus 2 equals 5.

Companies already routinely give their chatbots human names in hopes that users will believe they’re getting human-quality service. There’s no hope of reversing this convention, but the experience with NEDA and its Tessa chatbot demonstrates that the desire to humanize technology can backfire when people expect the tech to offer human-quality, contextually appropriate interactions, and it can’t.

How will all of this impact IT? Influence over the customer experience (CX) strategy was already moving into the domains of marketing and customer service, and as technology becomes more tightly interwoven into CX strategy, IT organizations will be pulled more strongly than ever into new areas such as compliance and HR. Technology may prove to be the easy part.

About the Author

Eric Krapf

Eric Krapf is General Manager and Program Co-Chair for Enterprise Connect, the leading conference/exhibition and online events brand in the enterprise communications industry. He has been Enterprise Connect's Program Co-Chair for over a decade. He is also publisher of No Jitter, the Enterprise Connect community's daily news and analysis website.

Eric served as editor of No Jitter from its founding in 2007 until taking over as publisher in 2015. From 1996 to 2004, Eric was managing editor of Business Communications Review (BCR) magazine, and from 2004 to 2007, he was the magazine's editor. BCR was a highly respected journal of the business technology and communications industry.

Before coming to BCR, he was managing editor and senior editor of America's Network magazine, covering the public telecommunications industry. Prior to working in high-tech journalism, he was a reporter and editor at newspapers in Connecticut and Texas.