this post was submitted on 27 Jan 2025
654 points (97.7% liked)

Technology

[–] [email protected] 18 points 2 days ago (2 children)

Idiotic market reaction. Buy the dip, if that's your thing? But this is all disgusting, day trading and chasing news like fucking vultures

[–] [email protected] 10 points 2 days ago* (last edited 2 days ago) (3 children)

Yep. It's obviously a bubble, but not one that will pop from just this. The motive is replacing millions of employees with automation, and the bubble will pop when it's clear that won't happen, or when the technology is mature enough that we stop expecting rapid improvements in capabilities.

[–] [email protected] 3 points 2 days ago

I love the fact that the same executives who obsess over return to office because WFH ruins their socialization and sexual harassment opportunities think they're going to be able to replace all their employees with AI. My brother in Christ. You have already made it clear that you care more about work being your own social club than you do about actual output or profitability. You are NOT going to embrace AI. You can't force an AI to have sex with you in exchange for keeping its job, and that's the only trick you know!

[–] [email protected] 2 points 2 days ago

> It's obviously a bubble, but not one that will pop from just this. The motive is replacing millions of employees with automation, and the bubble will pop when it's clear that won't happen, or when the technology is mature enough that we stop expecting rapid improvements in capabilities.

[–] [email protected] 1 points 2 days ago (2 children)

Well, both of those things have been true for months if not years, so if those are the conditions for a pop, then they're already met.

[–] [email protected] 2 points 2 days ago (1 children)

How are both conditions met when all this started just two(?) years ago? And progress is still going very fast.

[–] [email protected] 1 points 2 days ago (1 children)

all this started in 2023? alas no, time marches on: llms have been a thing for decades, and the main boom happened more around 2021. progress is not fast, no; these are just companies throwing as much compute at their problems as they can. deepseek caused a $2T drop by being marginal progress in a field (llms specifically) that is out of ideas.

[–] [email protected] 1 points 2 days ago (1 children)

The huge AI/LLM boom/bubble started after ChatGPT came out.

But of fucking course it existed before.

[–] [email protected] 1 points 2 days ago

regardless of where you want to define the starting point of the boom, it's been clear for months to years, depending on who you ask, that they are plateauing. and harshly. stop listening to hypesters and people with a financial interest in llms being magic.

[–] [email protected] 2 points 2 days ago* (last edited 2 days ago) (1 children)

It's gambling. The potential payoff is still huge for whoever gets there first. Short term anyway. They won't be laughing so hard when they fire everyone and learn there's nobody left to buy anything.

[–] [email protected] 1 points 2 days ago (1 children)

Get to the point of replacing a category of employee with automation.

[–] [email protected] 1 points 2 days ago (1 children)

Oh! Hahahaha. No.

the vc techfeudalist wet dreams of llms replacing humans are dead; they just want to milk the illusion as long as they can.

[–] [email protected] 1 points 1 day ago* (last edited 1 day ago) (1 children)

The tech is already good enough that any call center employees should be looking for other work. That one is just waiting on the company-specific implementations. In twenty years, calling a major company's customer service and having any escalation path that involves a human will be as rare as finding a human elevator operator today.

[–] [email protected] 1 points 1 day ago (1 children)

the tech is barely good enough that it is vaguely, maybe, feasibly cheaper to waste someone's time using a robot rather than a human... oh wait, we already do that with other tech.

"in 20 years imagine how good it'll be!" alas, no: it scales logarithmically at best, and all discussion is poisoned by "what it might be!" in the future rather than what it is.

[–] [email protected] 1 points 1 day ago* (last edited 1 day ago) (1 children)

It's not necessary to improve the quality to make this happen, only to train it on that company's products and issues and integrate it with whatever other systems are needed. You just need enough call logs for training data, and that's something that's already being collected.
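
For illustration, a minimal, hypothetical sketch of what "enough call logs" might look like as training data: it converts archived support calls into chat-style JSONL fine-tuning examples. The record shape, field names, and the resolved/escalated filter are all made-up assumptions, not anything from a real company's systems.

```python
# Hypothetical sketch: turn archived call-center transcripts into chat-style
# fine-tuning examples (JSONL). All field names and the output path are assumptions.
import json

SYSTEM_PROMPT = "You are a support agent for ACME. Answer using ACME's product catalog and policies."

def call_to_example(call: dict) -> dict:
    """Map one logged call (alternating customer/agent turns) to a training example."""
    messages = [{"role": "system", "content": SYSTEM_PROMPT}]
    for turn in call["turns"]:
        role = "user" if turn["speaker"] == "customer" else "assistant"
        messages.append({"role": role, "content": turn["text"]})
    return {"messages": messages}

def build_dataset(calls: list[dict], out_path: str = "finetune_calls.jsonl") -> None:
    # Keep only calls that were resolved without escalation, so the model
    # only learns from interactions a human agent actually closed successfully.
    with open(out_path, "w", encoding="utf-8") as f:
        for call in calls:
            if call.get("resolved") and not call.get("escalated"):
                f.write(json.dumps(call_to_example(call)) + "\n")
```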

[–] [email protected] 1 points 1 day ago (1 children)

except current robot systems and people are likely cheaper, especially when you consider that companies are liable for what llms say. which leaves, essentially, scams and other slop as the last remaining use cases. a multi-trillion-dollar business without a use case.

[–] [email protected] 1 points 1 day ago (1 children)

The money saved on wages would cover a LOT of liability. And most people that have a case don't pursue it anyway.

[–] [email protected] 1 points 1 day ago

what money saved on wages?? it's competing with dollar-a-day laborers. $10 per 1 million tokens for the "bad" (they all suck) models (something that can't even do this job!). if you pretend the hallucinations don't matter, you are getting a phone call for (4 letters per token, 6-minute average support call, 135 wpm talking rate, let's say 120 to be nice -> ~720 tokens per call) = $0.0072 per call. the average call center employee handles around 40 calls a day, so hey, the bad can't-actually-do-it ChatGPT-4 is about 70 cents per day cheaper than your typical Indian call-center worker!

Except: that is the massively subsidized, money-hemorrhaging rate. We know OpenAI should probably be charging an order of magnitude or two more, and the newer models are vastly more expensive; o1 takes around 100x the compute and still couldn't be a call center employee. So that price is really at least $30 per day. Cheaper than a US employee, but it still can't actually do the job anyway.
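
Working that arithmetic through as a quick sketch (every number below is an assumption taken from this comment, not measured data):

```python
# Back-of-the-envelope cost comparison using the figures quoted above.
PRICE_PER_MILLION_TOKENS = 10.00   # $ per 1M tokens ("bad" model pricing)
WORDS_PER_MINUTE = 120             # generous talking rate (135 rounded down)
CALL_MINUTES = 6                   # average support call length
CALLS_PER_DAY = 40                 # average calls handled per agent per day
LABORER_DAILY_WAGE = 1.00          # $ per day, the low-wage comparison point

tokens_per_call = WORDS_PER_MINUTE * CALL_MINUTES   # ~720, treating 1 word ≈ 1 token
cost_per_call = tokens_per_call / 1_000_000 * PRICE_PER_MILLION_TOKENS
cost_per_day = cost_per_call * CALLS_PER_DAY

print(f"cost per call: ${cost_per_call:.4f}")        # ≈ $0.0072
print(f"cost per day:  ${cost_per_day:.2f}")         # ≈ $0.29
print(f"vs $1/day wage: saves ${LABORER_DAILY_WAGE - cost_per_day:.2f}/day")  # ≈ $0.71

# The "unsubsidized / newer model" case claimed above: roughly 100x the compute.
unsubsidized_per_day = cost_per_day * 100
print(f"at ~100x compute: ${unsubsidized_per_day:.2f}/day")  # ≈ $28.80, i.e. "at least $30"
```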

[–] [email protected] 2 points 2 days ago

Yeah, after what happened, I now understand how irrational the stock market is.