this post was submitted on 07 Oct 2023
342 points (96.5% liked)

top 50 comments
[–] [email protected] 119 points 1 year ago* (last edited 1 year ago) (1 children)

For really useless call centers this makes sense.

I have no doubt that an ML chatbot is perfectly capable of being as useless as an untrained first-level human support agent with a language barrier.

And the dude in the article basically admits that's what his call center was like:

Suumit Shah never liked his company’s customer service team. His agents gave generic responses to clients’ issues. Faced with difficult problems, they often sounded stumped, he said.

So evidently good support outcomes were never the goal.

[–] [email protected] 82 points 1 year ago* (last edited 1 year ago) (5 children)
  • works 24/7
  • no emotional damage
  • easy to train
  • cheap as hell
  • concurrent, fast service possible

This was pretty much the very first thing to be replaced by AI. I'm pretty sure it'd be a way nicer experience for the customers.

[–] [email protected] 68 points 1 year ago (7 children)

Doubt. These large language models can't produce anything outside their dataset. Everything they do is derivative, pretty much by definition. Maybe they can mix and match things they were trained on, but at the end of the day they are stupid text predictors, like an advanced version of the autocomplete on your phone.

If the information they need to solve your problem isn't in their dataset, they can't help, just like all those cheap Indian call centers operating off a script. It's just a bigger script. They'll still need people to help with outlier problems.

All this does is add another layer of annoying, unhelpful bullshit between a person with a problem and the person who can actually help them. Which just makes people more pissed and abusive. At best it's an upgrade for their shit automated call systems.
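The "advanced autocomplete" comparison can be made concrete with a toy next-word predictor. This is a bigram Markov chain, which is emphatically not how real LLMs work internally, but it illustrates the structural point the commenter is making: the model can only ever emit words it saw in training.

```python
import random
from collections import defaultdict

def train_bigrams(text):
    """Map each word to the list of words observed right after it."""
    words = text.split()
    table = defaultdict(list)
    for prev, nxt in zip(words, words[1:]):
        table[prev].append(nxt)
    return table

def predict(table, seed, length=5, rng_seed=0):
    """Repeatedly sample a next word from the training data.
    A word never seen in training can never be emitted -- the model
    only remixes its dataset."""
    rng = random.Random(rng_seed)
    out = [seed]
    for _ in range(length):
        options = table.get(out[-1])
        if not options:
            break  # dead end: nothing ever followed this word in training
        out.append(rng.choice(options))
    return " ".join(out)

table = train_bigrams("the cat sat on the mat and the cat ran away")
print(predict(table, "the"))
```

Real LLMs predict tokens with a neural network over a huge context rather than a lookup table, so they generalize far better than this, but whether that amounts to more than sophisticated remixing is exactly what the thread is arguing about.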

[–] [email protected] 26 points 1 year ago

Most call centers have multiple tiers of teams, where the lower tiers, who make up the majority, are just reading off a script. You don't have to replace every single one to implement AI. It's gonna be the same for a lot of other jobs as well, and many will lose jobs.

[–] [email protected] 9 points 1 year ago (2 children)

I know how AI works under the hood. No, AI isn't going to completely replace such things, but it will be the end of said cheap Indian call centers.

[–] [email protected] 7 points 1 year ago

Who also don’t have the information or data that I need.

[–] [email protected] 8 points 1 year ago

I'd say at best it's an upgrade to scripted customer service. A lot of the scripted ones are slower than AI and often have agents with strong accents, making it more difficult for the customer to understand the script entry being read back to them, leading to more frustration.

If your problem falls outside the realm of the script, I just hope it recognises the script isn't solving the issue and redirects you to a human. Oftentimes I've noticed ChatGPT not learning from the current conversation (if you ask it about this, it will say that it does not do this). In this scenario it just regurgitates the same 3 scripts back to me when I tell it it's wrong. In my case this isn't so bad, since I can just turn to a search engine, but in a customer service scenario this would be extremely frustrating.
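The "recognise the script isn't solving the issue" behaviour hoped for here is straightforward to sketch as a guard around the bot: if the bot keeps producing the same canned answer, hand off to a human. All names here (`reply_or_escalate`, `HUMAN_HANDOFF`, the repeat threshold) are invented for illustration, not any vendor's actual API.

```python
HUMAN_HANDOFF = "Let me transfer you to a human agent."

def reply_or_escalate(bot_reply, history, max_repeats=2):
    """Return the bot's reply, unless it has already given this exact
    canned answer max_repeats times -- then hand off to a human
    instead of looping through the same scripts."""
    history.append(bot_reply)
    if history.count(bot_reply) > max_repeats:
        return HUMAN_HANDOFF
    return bot_reply

# Simulated session: the bot keeps picking the same scripted answer.
history = []
for _ in range(3):
    answer = reply_or_escalate("Have you tried restarting?", history)
print(answer)  # the third repeat triggers the handoff
```

A real deployment would need fuzzier matching than exact string equality (LLMs rephrase), but the principle of a dumb, deterministic escape hatch around the model stands.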

[–] [email protected] 7 points 1 year ago

Check out this recent paper that finds some evidence that LLMs aren't just stochastic parrots. They actually develop internal models of things.

[–] [email protected] 60 points 1 year ago (1 children)

And the way customer support staff can be/is abused in the US is so dehumanizing. Nobody should have to go through that wrestling ring.

[–] [email protected] 53 points 1 year ago (1 children)

A lot of that abuse is because customer service has been gutted to the point that it is infuriating to a vast number of customers calling about what should be basic matters. Not that it's justified; it's just that it doesn't necessarily have to be such a draining job if not for the greed that puts them in that situation.

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago)

There was a recent episode of Ai no Idenshi, an anime about such topics. The customer service episode was nuts and hits on these points so well.

It's a great show for anyone interested in fleshing out some of the more mundane topics of AI. I've read and watched a lot of sci-fi, and it still hit some novel stuff for me.

https://reddit.com/r/anime/s/0uSwOo9jBd

[–] [email protected] 19 points 1 year ago (3 children)

I’m pretty sure it’d be way nicer experience for the customers.

Lmfao, in what universe? As if trained humans reading off a script they're not allowed to deviate from aren't frustrating enough, imagine doing that with a bot that doesn't even understand what frustration is.

[–] [email protected] 4 points 1 year ago

Yeah but are you ready for “my grandma used to tell me $10 off coupon codes as I fell asleep…”

[–] [email protected] 47 points 1 year ago (3 children)

I've worked in this field for 25 years and don't think that ChatGPT by itself can handle most workloads, even if it's trained on them.

There are usually transactions which must be done, and the ad hoc tasks often end up being the most important things, because when things break, you aren't trained for them.

If you don't have a feedback loop to solve those issues, your whole business may just break without you knowing.
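That feedback loop could be as simple as queueing every conversation the bot failed to close for human review, so outlier problems surface instead of silently breaking the business. A minimal sketch, with all names (`SupportDesk`, `close`, `review_queue`) invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class SupportDesk:
    """Tiny feedback loop: anything the bot can't resolve is queued
    for a human, so outlier problems get seen rather than lost."""
    review_queue: list = field(default_factory=list)

    def close(self, conversation, resolved):
        """Record the outcome of a bot conversation; unresolved ones
        go to the human review queue."""
        if not resolved:
            self.review_queue.append(conversation)
        return resolved

desk = SupportDesk()
desk.close("password reset", resolved=True)
desk.close("billing system returned error 500", resolved=False)
print(desk.review_queue)  # the unresolved outlier awaits a human
```

The reviewed fixes can then feed back into the bot's training or scripts, which is exactly the loop a fully automated, unmonitored deployment lacks.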

[–] [email protected] 24 points 1 year ago

I think you're talking about actual support, that knows their tools and can do things.

This article sounds more like it's about the generic outsourced call center that will never, ever get anything useful done in any case.

[–] [email protected] 11 points 1 year ago (1 children)

I ordered Chipotle for delivery and I got the wrong order. I don't eat meat so it's not like I could just say whelp, I'm eating this chicken today I guess.

The only way to report an issue is to chat with their bot. And it is hell. I finally got a voucher for a free entree but what about the delivery fee and the tip back? Impossible.

I felt like Sisyphus.

I waited for the transaction to post and disputed the charge on my card and it credited me back.

There are so many if-and-or-else scenarios that no amount of scraping the world's libraries lets today's AI sort them all out.

[–] [email protected] 4 points 1 year ago

Yes, these kinds of transactions really need to be hand-coded to be handled well. LLMs are very poorly suited to this kind of thing (though I doubt you were dealing with an LLM at Chipotle just yet).

[–] [email protected] 39 points 1 year ago (1 children)

Cheaper than outsourcing to poor countries with middling English speaking capability.

Coming to call center lines near you: voiced chatbots to replace the ineffective, useless customer support lines that exist today, with the same useless outcomes for consumers but endless juggling back and forth without any real resolution. Let's make customer service even shittier, again!

[–] [email protected] 18 points 1 year ago

If you bought the product we don't need to worry about losing money anymore bro

[–] [email protected] 38 points 1 year ago (3 children)

On one hand, they're crap jobs. On the other hand, in most economies we have crap jobs not because they're necessary for productivity, but to give us an excuse to pay people to live.

Maybe if enough jobs are lost to automation, we'll start to rethink the structure of a society that only allows people to live if they're useful to a rich person.

Essentially, we're just still doing feudalism with extra steps, and it's high time we cut that nonsense out.

[–] [email protected] 3 points 1 year ago (2 children)

I think once workers can be replaced, there will be some virus that wipes out most of humanity. No point keeping billions of people around if they aren't needed.

[–] [email protected] 34 points 1 year ago (1 children)

What if we get it to agree to give us stuff for free? Is it a representative of the company or not?

[–] [email protected] 12 points 1 year ago (1 children)

You also have to have a reasonable belief the company representative is authorized to do whatever they're doing to be entitled to it.

[–] [email protected] 29 points 1 year ago (2 children)

I see two inevitable problems:

  1. we outsourced this to you because it was cheaper, if you're using ChatGPT what do we need you for?

  2. companies want people to buy stuff, but if you significantly reduce the workforce you also reduce the availability of funds to buy stuff

[–] [email protected] 12 points 1 year ago

For 1, I assume you mean a business that does outsourced customer service, not an internal department.

For 2, it's universal basic income time, or let's put people to work on creative, innovative applications, not mind-numbing shit.

[–] [email protected] 28 points 1 year ago (1 children)

Seems like a good way to get the "agent" to agree it's in the wrong and get a 100% refund.

[–] [email protected] 17 points 1 year ago (1 children)

I'm interested in whether the AI agent has the power to issue refunds, or at least return authorizations.

One of the things fascinating to me is that some of the problems humans are bad at handling (such as social engineering) AI tends to be even worse at.

[–] [email protected] 27 points 1 year ago (3 children)

Remember when AI was going to make life better for everyone?

Yeah. That shit’ll be the end of us.

[–] [email protected] 20 points 1 year ago

AI will make life better for the shareholders

[–] [email protected] 14 points 1 year ago (1 children)

Hopefully it'll be the end of capitalism. How is the economic model supposed to function when nobody is working? Where are people supposed to get money from? How is anything going to be taxed?

Realistically though it'll somehow push capitalism into hyperdrive and enslave the global population under the control of the AI owners.

[–] [email protected] 22 points 1 year ago (1 children)

A lot of jobs are just busy work that does nothing and makes nothing. Talking about automating them misses the point of why the jobs exist in the first place.

"I see that you are throwing a ball at a target that is connected to a platform with a human sitting above a tank of water. Here is an AI-generated picture of a random human underwater to sate your needs. Yay! I have made this process 200% more efficient!"

[–] [email protected] 13 points 1 year ago* (last edited 1 year ago) (2 children)

It's crazy how people seem fundamentally incapable of looking at the big picture and ask themselves things like, "what even is the purpose of society? Is this the best society humanity is able to come up with? What if I am not ready to accept society as it is presented to me, what are my alternatives, do I even have any? What are my obligations towards a society that marginalizes me and treats me like a second or third tier human, without any hope of ever improving my lot?"

Ask people if they would rather be free and get everything they want without having to work for it. The answers you'll get will boggle your mind.

[–] [email protected] 6 points 1 year ago

We've been permeated by the idea that "you have to be financially productive to be a decent human" for so long that even people against excessive/useless work still sometimes get caught up in this crazy race toward making more profit regardless of anything else.

Sometimes, reaching the "it works" point is enough, but higher-ups never stop there. It always has to be "better/more".

[–] [email protected] 4 points 1 year ago

I'm surprised by the number of workaholics that exist, like why do you want to work so much? Go explore the world, learn things, make things, but people want to work instead?

[–] [email protected] 20 points 1 year ago (1 children)

You still need to employ some humans as a backup when the AI catastrophically fucks up, but for the most part it makes sense. Not all jobs need to continue to exist.

[–] [email protected] 10 points 1 year ago

Exactly. As the article ends:

Not every customer service employee should worry about being replaced, but those who simply copy and paste responses are no longer safe, according to Shah.

“That job is gone,” he said. “100 per cent.”

[–] [email protected] 16 points 1 year ago
[–] [email protected] 5 points 1 year ago

Working conditions in this industry are not great. The turnover rate can reach 80% at times. It can be a difficult, stressful and low-paid job that few people enjoy. At the same time, the demand for this work keeps increasing as more and more consumer activity shifts online and remote. It seems to me that the technology may be a net benefit in this case. The public and its regulatory authorities should, however, keep a close eye on developments to make sure humans are not left behind.
