The Person You Want to Talk to Does Not Exist 

We’ve had automation for decades, but this is different 

By Jennifer Dziura

For at least forty years now, standup comedians have made jokes about automated customer service phone lines. “I tried to change my car reservation, and got asked to press 1 for English!” the comedian begins, and you know what’s about to follow – a litany of complaints about getting caught in an endless, unfeeling, irrational loop of questions you can only answer by entering numbers on a phone keypad.

Even if it isn’t funny, you’ll still follow along because, of course, we’ve all been caught in the machine – for example, listening to a menu of options spoken painfully slowly by a robot voice, which then ends without mentioning your issue. “Press 8 to hear your account balance. Press 9 for information about our green initiatives. Press star to repeat these options.” 

The reason we have all been frustrated for decades is that we want to talk to a person. (What person, exactly?)

There has traditionally been some hope of relief at the end of this process: once we escape from the machine, make our way out of the labyrinth, we will be connected to a human being who will listen to our specific issue, which does not fit into the options presented in the phone menu. We also hold the hope that this person will empathize with our time caught in the machine; we might weakly ask that they pass on our displeasure to their superiors, knowing of course that this will not work but still wishing to have our sad little say.

The System Is Perhaps No Less Escapable, But We Are Losing the People Who Feel the Need to Escape It

The It’s Always Sunny in Philadelphia season 16 finale, “Dennis Takes a Mental Health Day,” features a character in his late forties who struggles with rage issues attempting to navigate customer service problems on three separate occasions. (Spoilers!) In every case, the customer service interface is
1) an automated system
2) staffed by a cheerful young employee
3) who has no interest in circumventing the machine

First, a bubble tea shop wants customers to download an app to pay, launching Dennis on a classic ’90s-comedian-style rant1: “Got apps for everything, you know? Even though I’m already talking to the person that, uh, could just take my order right here, but I got to download an app, right? How about this? I have cash, you have tea. Why don’t we just streamline things?”

It turns out the shop does take cards, but with a $10 minimum that is more than the cost of one tea. As he will several more times this day, Dennis – who can be sociopathically charming when he suppresses his rage – calms himself, plasters on a smile, and attempts to connect with the customer service agent. In the tea shop, he says to the young employee, “It’s not your fault. You know, you didn’t create the system. The system is just… It is what it is, and, yeah, we’re both victims here.” 

She, however, does not view herself as the victim of a system; she has no sympathy for or understanding of his complaint. He gives up and, rather than download the app, orders a second tea he doesn’t want to get above the $10 credit card minimum.

Later, at a mobile phone store that would only sell the phones in the store to new customers while making existing customers wait to receive replacements in the mail, he says to his second Gen Z employee of the day, “Well, I’ll tell you what, pal. I am not mad at you, okay? I am mad at the system. Okay, but unfortunately the system isn’t here for me to direct my frustrations at it—“ 

He collects himself and purchases a second phone line in order to get a device on the spot.

Dennis’s third encounter is for a self-driving rental car that has autolocked him out and won’t connect to his new replacement phone. He spends a long time trapped in a phone maze in which he must enter information via the keypad in order to be connected to a person who then – you know where I’m going with this:

Daisy: My name is Daisy. May I please have your name and -digit VIN?

Dennis: No. No, no, no, no. No. I just ent… I just entered all that information into the system.

Daisy: I’m sorry, sir. I get it, but I do…

Dennis: That-That’s okay. No, no. That’s okay. Daisy, was it? Listen, don’t be sorry, okay? I’m not mad at you. I’m a little upset, but I’m upset at the situation, okay? And-and the situation is this, Daisy. If you’ll just give me a second and indulge me, okay? [SIGHS] My new phone won’t talk to my car. Okay, so I called the car rental company, and they put me on with a supervisor who told me to call Tsuma Roadside […] And you ask me to enter in my VIN and my date of birth again, information which I already entered into your system. Okay, now, now, listen, Daisy, Daisy, bear with me here, okay? I know you didn’t create the system, okay? So you are not to blame. But somebody did create the system, and I would very much like to blame that person. So… who is that?

Daisy: Sir, I’m sorry. Who is…? 

Dennis: Who is the person who created your customer service system in such a way that the information entered into the computer isn’t passed on to the human representative?!

Daisy: Sir, I don’t know. I can send you to one of my supervisors.

Dennis: No, no, no, no, no. No, Dai… no. No, don’t send me to a supervisor, Daisy. Daisy, Daisy…it’s just you and me. Just you and me, Daisy, fighting the good fight against a broken system engineered to drive us both so crazy that we have to take days off for our mental health!

Daisy, of course, does not view herself as fighting the good fight against a broken system. 

All three young customer service agents are in fact too young to know that there was once a world in which machines did not tell a person physically standing in front of another person how to treat that other person.

The machine used to be a tool used by the employee; the employee is now a tool used by the machine.

Speaking to a Person Isn’t What It Used to Be

A lot of older folks – defined as older than me, I suppose – not only talk wistfully about wanting to talk to a person, but will sometimes go very far out of their way to do so – for example, waiting in a long line when there are kiosks with no line, or waiting on hold on the phone to do a task that could easily be completed in an app.

I may also be wistful about talking to a person, but I do not wait in the line or on the phone. 

What old-school people do not realize is that not only can you not talk to a person, if you did, it would not be the person you want, because that person does not exist.


In the vast majority of situations, the person you are permitted to speak to only reads from the machine. They may have a few options available; for example, if you call to cancel your account, a menu may come up on their device instructing them to offer you a particular incentive to stay. You can manipulate this system sometimes – you could spend a day calling all your phone and internet and paid-off credit card accounts and saying you want to cancel and seeing if they offer you a lower rate. If they don’t, just say you changed your mind and hang up. Cute, but this does not solve any actual problems you might have.

The widespread adoption of AI in the last couple of years has made this situation — the dehumanization of both the human worker and the human customer — dramatically worse.

Many websites now offer chat-based support that begins with some questions from a bot or a more sophisticated AI. This in itself is not offensive, considering that the alternative is a human customer service agent with little to no decision-making power, speaking to many people at once using canned responses they can send you with keyboard shortcuts. That is just a less efficient way of doing what an LLM can do.2 This person is not the “person in charge” you would ideally like to speak to.

A quote from a Yellowstone park ranger, on why it is hard to design a bear-proof garbage can, went viral on Twitter in 2020: “There is considerable overlap between the intelligence of the smartest bears and the dumbest tourists.”3 There is, likewise, considerable overlap between the customer service you will receive from an AI and from a $4-an-hour customer service agent halfway across the world speaking to five people at once with boilerplate replies.4

The part that’s new is us. Some of us, anyway. The rapid introduction of AI into Google and every other system has made people quite rapidly dumber and lazier. No one is reading the FAQ. No one contacting support has located the help document and tried the steps before opening the chat. This new type of person is incapable of locating the answer to their question on a webpage dedicated to their exact issue, even if that webpage has a table of contents that helpfully links to various clearly-labeled sections. These new people demand to be read to like babies. Thus, customer service lowers itself to meet us.

If you doubt that a significant number of people were ever this intelligent, consider a B2B SaaS company – for example, a company that provides SMS marketing or advertising tracking or widgets to other businesses. The customers are business owners, presumably able to tell their ass from a hole in the ground. Five years ago, a customer service agent at that company could moan about some bottom percentile of customers who didn’t bother to check the FAQ before clogging up the chat system with questions they could easily have answered themselves. And that customer may even have had the self-awareness to admit that they – well, they could do better.

But just a few years later, nearly everyone begins by opening a chat window and asking to be spoon-fed publicly available information, and no one feels bad for wasting anyone’s time (or the water at an AI facility), nor does anyone feel a little stupid. The Overton window of foolishness and helplessness has shifted.

The Overton Window, which in this usage would have “non-embarrassing behavior” at the center, slightly embarrassing behavior at +1 and -1, and, indeed, “unthinkable” behavior at the ends

Thus, the job of the human customer service agent — and likely of anyone on that end of the machine — has changed. 

Imagine a brain surgery facility facing a surge of patients who just need advice about not sticking things so deeply into their ears. If this goes on long enough, and no one needs brain surgery, the facility will stop hiring surgeons and begin staffing the place with the lowest level of medical practitioner you can get. The people who are best at explaining very kindly and clearly why you shouldn’t stick things so far into your ear — well, there’s often a tradeoff with other skills, one of them being brain surgery.

The intake process at St. God’s Memorial Hospital from the 2006 film Idiocracy, which, while problematic in some respects, is right on the money in many others

The customer service agent no longer moans behind the scenes about people who can’t be bothered to read the FAQ because their job is now actually to read people the FAQ. 

The FAQ is no longer a customer-facing document; it’s more of a database for AI – and for customer service agents using AI – to query and send you snippets from. If you’re the customer who has read the FAQ, located this snippet, and still has a question – well, now you’ve gone from being a good customer, one who does a little legwork before bothering anyone, to possibly being perceived as a jackass. (OK, know-it-all, if you already read the FAQ, why are you bothering me? My job is to find you the relevant part of the FAQ.)

The Machine Operators No Longer Operate Machines, But Are Happily Operated By Them

An increasing number of people on social media have reported that doctors and veterinarians are now simply plugging your questions and patient history into ChatGPT.5 6 7

This is expedient, and it’s possible that ChatGPT is more accurate than the average, hurried doctor who doesn’t care very much (again, the quote about the overlap between tourists and bears).

But this practice really shifts the expectations we have of doctors. Will medical schools in ten or twenty years just no longer teach the functions now performed by machines? This doesn’t happen all at once. It’s probably happening now in the sense that medical students feel less obligation to retain information they know ChatGPT will retain for them. Perhaps some practicing physicians feel less responsibility to engage in continuing education (or to pay attention when they’re required to attend it) when the information is already in the machine.8 

The price we pay for expedience is that we become worse. It’s difficult to imagine some future, fully enervated ChatGPT doctor springing to action, regaining their full intelligence when you come in with a problem ChatGPT can’t solve. A doctor selected to be the face of an artificial intelligence isn’t going to turn off the machine and transform into Dr. House. If the machine doesn’t have the answer, there will be no humans to appeal to.

It was once common that you would go to a bank or a store, and a person who worked there would derisively wave away the output of some machine. “Oh, the computer always does that,” or “The register doesn’t want me to ring up two coupons, but….” The expectation was not only that most employees knew more than the machine, but that they had the power to ignore the machine.

A customer is helped in a New Jersey pharmacy, 1994 (LOC)

The last time I can recall speaking to a person who knew more than the machines was, again, in a business-to-business environment (regular consumers usually don’t stand a chance). My business is tied together with a legacy software product, and I called on the phone and spoke to someone who did not primarily work in customer service – a person who makes and maintains the product and takes a call every now and then. A person who knows more than the documentation being queried by the AI. I had an old-world experience.

There are fewer and fewer places where there is someone in charge who knows more than the machines they use, and where you can interact with this person.

There was a time that information was canonically located in people, and machines were a way for those people to connect from afar. 

Even in the case of a database where the sheer quantity of information could not be located in a person, the meaning behind that information and the authority to make decisions from it were located in people. 

The locus has changed, and the young people do not seem concerned. 

In the It’s Always Sunny in Philadelphia episode, Dennis is at one point made to spell his name to an automated phone system:

“D” as in “Deliver me from this!”

“E” as in “Engage with human.”

“N” as in “nightmare”!

“N” as in “nightmare”!

“I” as in “Is this real?!”

“S” as in “Somebody help me!”

The episode – most of which turns out to simply be a middle-aged man’s fantasy – ends with him eating the heart of the CEO of the company that created the car, and by extension the phone system. 

Even this is fantasy, because when no humans are in charge, there are no hearts to eat. 

Endnotes

  1. “I bought a doughnut and they gave me a receipt for the doughnut; I don’t need a receipt for the doughnut. I’ll just give you the money, and you give me the doughnut, end of transaction. We don’t need to bring ink and paper into this. I just can’t imagine a scenario where I would have to prove that I bought a doughnut.” -Mitch Hedberg ↩︎
  2. In the 2013 film Her, in which Scarlett Johansson voices an AI that the protagonist falls in love with, this protagonist asks “Do you talk to someone else while we’re talking?” and “Are you talking with someone else right now?” He is horrified to discover that she is, at that moment, talking to 8,316 other people, and in love with 641 of them. ↩︎
  3. https://x.com/mathematicsprof/status/1270119912418181120 ↩︎
  4. Over the last few months, when opening up a support chat, I have started openly asking “Are you a human or an AI?” Sometimes I am curious. Sometimes it’s genuinely an important question because I need to know if the entity I’m speaking to can visually see something with their eyes such that they might perceive the problem in my screenshot. Sometimes it is meant as a complaint, although I don’t know if it’s being received as such. A human agent being asked “Are you an AI?” could possibly consider it a compliment based on their speed or accuracy. A human asked to do a job better done by an AI is likely to be found wanting. ↩︎
  5. https://x.com/mayankja1n/status/1918850570711962045 ↩︎
  6. https://x.com/Jonas_Vollmer/status/1925494927435219293 ↩︎
  7. https://x.com/owenbroadcast/status/1932086328294047795 ↩︎
  8. Some of you may be old enough to remember math teachers insisting you learn mental math because “When you’re an adult, you won’t always have a calculator in your pocket!” Well, aren’t we the finest flower of humanity. ↩︎
Author Bio: Jennifer Dziura

Jennifer Dziura founded GetBullish in the 2010s to provide a feminist take on careers, entrepreneurship, and boldly managing personal and professional relationships. Since then, the blog has spawned a gift store offering thousands of items, including hundreds designed by Jen herself. Jen studied philosophy at Dartmouth College, later received a master’s degree in education, and is a former standup comedian and adult spelling bee host. She has spoken at Yale, Harvard, and many other universities and conferences and organized six Bullish Conferences. She lives in Brooklyn and writes from a planty corner of the GetBullish warehouse.

Want to support GetBullish? Send a gift card to a friend or buy one for yourself that you can spend later on your holiday gift shopping.