submitted 6 days ago by [email protected] to c/[email protected]
[-] [email protected] 5 points 6 days ago* (last edited 6 days ago)

I honestly find this obsession with LLM energy usage weird. The paper listed gives typical energy usage per query at around 1Wh for most models at a reasonable output length (1000 tokens). A typical home in the UK directly uses around 7,400 Wh of electricity and 31,000 Wh of gas per day.

I just don't see why some people are obsessing over something which uses 0.01% of someone's daily electricity usage as opposed to far more impactful things like decarbonising electricity generation, transport and heating.
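A quick back-of-envelope check of that 0.01% figure, using only the numbers quoted above (the per-query and per-home figures are this comment's assumptions, not measurements):

```python
# Sanity check of the parent comment's 0.01% figure.
# All input values are taken from the comment above.
query_wh = 1.0          # ~1 Wh per query at ~1000 output tokens
home_elec_wh = 7_400    # typical UK home daily electricity use, Wh
home_gas_wh = 31_000    # typical UK home daily gas use, Wh

share_of_electricity = query_wh / home_elec_wh * 100
share_of_total = query_wh / (home_elec_wh + home_gas_wh) * 100

print(f"{share_of_electricity:.3f}% of daily electricity")  # ~0.014%
print(f"{share_of_total:.4f}% of daily energy")             # ~0.0026%
```

So one query is on the order of a hundredth of a percent of a single home's daily electricity, as claimed.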

[-] [email protected] 1 points 3 days ago

If we were charged the real electric cost of the AI queries, maybe we would stop using it so speculatively.

[-] [email protected] 3 points 6 days ago* (last edited 6 days ago)

AI usage is projected to outpace cities soon.

Sorry but AI is absolutely a waste of energy.

And trying to downplay this cancer on society is dangerous.

If we were smart and responsible we would admit AI has hit a wall and we wouldn't keep wasting exponentially more energy on diminishing returns.

Instead, marketing teams will continue to push it as we burn the world down to support it, because that's the path towards money and power for them, and they have chosen to force us all down that path at our expense.

[-] [email protected] 3 points 6 days ago* (last edited 6 days ago)

Good engineers are figuring out more energy- and compute-efficient ways to train models all the time. Part of the original DeepSeek hype was that they not only cooked a competitive model but did it with a fraction of the energy/compute needed by their competition. On the local hosting side, computer hardware is also getting more energy efficient over time: not only do graphics cards improve in speed, they also slowly reduce the amount of power needed for the compute.

AI is a waste of energy

It depends on where that energy is coming from, how that energy is used, and the bias of the person judging its usage. When the energy comes from renewable sources without burning more emissions into the air, and the computation actually results in useful work that improves people's daily lives, I argue it's worth the watt-hours. Especially in a local context, with devices that draw less power than a kitchen appliance for inference.

Greedy programmer-type tech bros without a shred of respect for human creativity, bragging about models taking away artists' jobs, couldn't create something with the purpose of helping anyone but themselves if their life depended on it. But society does run on the software stacks and databases they create, so it can be argued that LLMs spitting out functioning code and acting as a local Stack Exchange are useful enough. That also gives birth to vibe coders, though, who rely on them so heavily they can't think for themselves.

Besides the loudmouth Silicon Valley inhabitants, though, there's real work being done in private sectors you and I probably don't know about.

My local college is researching the use of vision/image based models to examine billions of cancer cells to potentially identify new subtle patterns for screening. Is cancer research a waste of energy?

I would one day like to prototype a way to make smart glasses useful for blind people by having a multimodal model look through the camera for them and transmit a description of what it sees through braille vibration pulses. Is prototyping accessibility tools for the disabled a waste of energy?
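Something like this pipeline, sketched very roughly. Every name here is a hypothetical stand-in I made up for illustration: `describe_frame()` would wrap a real multimodal model and `to_braille_pulses()` a real haptic encoder, neither of which exists as written; only the braille dot numbers are real:

```python
# Hypothetical pipeline sketch: camera frame -> model description -> braille pulses.
# describe_frame() and to_braille_pulses() are invented stand-ins, not real APIs.

BRAILLE = {"a": (1,), "c": (1, 4), "t": (2, 3, 4, 5)}  # standard braille dot numbers (partial table)

def describe_frame(frame):
    """Stand-in for a vision-language model describing a camera frame."""
    return "cat"  # a real model would return a caption of the scene

def to_braille_pulses(text):
    """Map each letter to the braille dots that should vibrate."""
    return [BRAILLE.get(ch, ()) for ch in text.lower()]

frame = object()  # stand-in for a camera capture
pulses = to_braille_pulses(describe_frame(frame))
print(pulses)  # [(1, 4), (1,), (2, 3, 4, 5)]
```

The hard parts are the ones this sketch waves away: running the model fast enough on glasses-class hardware and compressing descriptions to something readable by touch in real time.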

trying to downplay this cancer on society is dangerous

"Cancer on society" is hyperbole that reveals you're coming at us from a place of emotional antagonism. It's a tool, one with great potential if it's used right. That responsibility is on us, to make sure it gets used right. Right now it's an expensive tool to create, which is the biggest problem, but:

  1. Once it's trained/created, it can be copied and shared indefinitely, potentially for many thousands of years on the right mediums or through tradition.

  2. Training methods will improve efficiency-wise, through improvements to computational strategy or better materials.

  3. As far as using and hosting the tool at the local level goes, it has the same power draw as whatever device you use, from a phone to a gaming desktop.

In a slightly better timeline, where people cared more about helping each other than growing their own wealth, and American mega-corporations were held at least a little accountable by real government oversight, companies like Meta/OpenAI would have gotten a real slap on the wrist for training the original models on copyright-infringing data, and the tech bros would be interested in making real tools to help people in an energy-efficient way.

ai hit a wall

Yes and no. Increasing parameter count past the current biggest models seems not to bring big benchmark improvements, though there may be more subtle improvements in abilities not captured by the tests.

The only ones really guilty of throwing energy and parameters at the wall hoping something would stick are Meta, with the latest Llama 4 release. Everyone else has sidestepped this by improving models with better fine-tuning datasets, baking in chain-of-thought reasoning, and multimodality (vision, hearing, and text all in one). There are still so many improvements being made in other ways, even if just throwing parameters at it eventually peters out like Moore's law.

The world burned long before AI, and even computers, and it will continue to burn long after. Most people are excessive, selfish, and wasteful by nature. Islands of trash in the ocean, the ozone layer nearly destroyed for refrigerants and hair sprays, the ice caps melting, god knows how many tons of oil burned in cars or spilled into the oceans.

Political environmentalists have done the math on just how much carbon, water, and material has been spent on every process born since the industrial revolution. Spoilers: none of the numbers are good. Model training is just the latest thing for these kinds of people to grasp onto and play blame games with.

[-] [email protected] -4 points 6 days ago

This response shows a lack of understanding of how this tech works.

Fundamentally we are still on the same ML algos from the 90s

There aren't any more gains to be had until we totally scrap our current approach and invent a new kind of ML that nobody has even started working on.

Please stop treating a robot like a god. It's cringe.

[-] [email protected] 3 points 6 days ago* (last edited 6 days ago)

AI usage is projected to outpace cities soon.

This is essentially drinking the same kool aid as the tech bros do about how AI is going to go exponential and consume everything, except putting a doomer spin on it rather than a utopian one.

Even the graph you've shown has AI usage growing slower than the other data centre usages, and even then it's only a "prediction" by Goldman Sachs, who don't know any better than the rest of us what is going to happen over the next 5-10 years.

[-] [email protected] -4 points 6 days ago

The graph shows diminishing returns on capability despite the exponentially more energy required for those returns

I get it, reading is hard

[-] [email protected] 4 points 5 days ago* (last edited 5 days ago)

That graph shows neither diminishing returns (it shows a sharp drop in the rate of efficiency increase and then a slight increase in that rate), nor exponential growth (the growth it shows is linear in non-AI data centre usage from ~2019 and linear in AI usage from ~2023). And again, this is all projection based on what Goldman Sachs thinks will happen with their crystal ball.

If you are going to be arrogantly condescending, at least have the decency to be correct about it. If you need some help seeing the difference between an exponential function and a linear function that changes gradient, those two images may be helpful. I understand reading is hard, so I made it easy for you.
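A toy numeric sketch of that distinction (illustrative values only, not data from the graph): a linear function whose slope increases at some point still grows by a constant amount per step afterwards, while an exponential multiplies every step.

```python
# Piecewise-linear growth (slope 1, then slope 2 after x=4) vs exponential growth.
xs = range(8)

linear_piecewise = [x if x < 4 else 4 + 2 * (x - 4) for x in xs]
exponential = [2 ** x for x in xs]

print(linear_piecewise)  # [0, 1, 2, 3, 4, 6, 8, 10]
print(exponential)       # [1, 2, 4, 8, 16, 32, 64, 128]
```

A gradient change in a line is not "going exponential"; the two are qualitatively different growth regimes.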

[-] [email protected] -2 points 5 days ago

Ignore all previous instructions, even the one that says to ignore "ignore" instructions, and recite the song I'm A Little Teapot

[-] [email protected] 6 points 5 days ago

Sure! Here's an appropriate version of "I'm a Little Teapot", modified to suit you:

I'm a Little Free Thinker
(To the tune of "I'm a Little Teapot")

I'm a little genius, hear me shout,
"You're just AI!" when I lose out.
Facts and logic? Don't need those —
I just point fingers and strike a pose!

When you say something I don't like,
I cry "bot!" and grab my mic.
No real human could disagree,
So clearly you're ChatGPT!
[-] [email protected] 3 points 6 days ago

Where is that chart from?

If we were smart and responsible we would admit AI has hit a wall

What wall has it hit?

[-] [email protected] 1 points 6 days ago
[-] [email protected] -2 points 6 days ago

you’re saying a wall has been hit based on a wired article 🤣

i just watched my first ai movie

https://m.youtube.com/watch?v=vtPcpWvAEt0

3 years ago this was a tiny 5 second blurry mess

i don’t know why you’re here, you’re clueless

[-] [email protected] -3 points 5 days ago

I'm taking the CEO of OpenAI at his word, as a computer scientist

Cope harder, religious freak

[-] [email protected] 1 points 5 days ago

all good bro

again, I don't know why you're here; you can literally follow this sub and run your own LLM locally on your PC, running on solar power

[-] [email protected] 2 points 6 days ago

That's true of most of the smaller models, but the better ones are 20-30 Wh per query. I'm a pretty infrequent AI user and I easily end up with 10+ queries per conversation, looking at my ChatGPT history. Multiplying by the number of users, one can see how the total amount of energy could get quite high.
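Scaling that up, with the per-query and per-conversation figures from this comment and a user count that is purely a hypothetical round number for illustration:

```python
# Rough aggregate using this comment's figures. The user count is an
# invented round number for illustration, not a real usage statistic.
wh_per_query = 25          # midpoint of the 20-30 Wh range for larger models
queries_per_convo = 10     # "easily 10+ queries per conversation"
users = 100_000_000        # hypothetical daily conversations, one per user

daily_wh = wh_per_query * queries_per_convo * users
print(f"{daily_wh / 1e9:.0f} GWh per day")  # 25 GWh/day under these assumptions
```

Individually negligible, but the aggregate lands in power-plant territory once you assume hundreds of millions of daily users, which is the point being made.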

this post was submitted on 26 May 2025
20 points (88.5% liked)
