submitted 6 days ago* (last edited 6 days ago) by ExtremeDullard@piefed.social to c/fuck_ai@lemmy.world

And it's not what you think.

Beyond everything that's wrong with AI as it's currently deployed - i.e. the fact that it's basically owned and run by nefarious surveillance capitalists in bed with fascists, and that it's mostly used to replace honest people and things with cheaper mediocrity - what's wrong with the technology for me finally clicked in my head today.

My first - indirect - encounter with AI was a few years ago - quite a few years in fact, come to think of it. I went to a forum for mechanical keyboard enthusiasts and asked whether someone remembered some obscure folding full-size keyboard I used to own back in the 90s. I lost that keyboard years ago and I'd been trying to identify which brand / model it was for years. I figured if anybody knew what it was, it would be one of that forum's dwellers.

So I described the keyboard as best I could, and someone immediately came up with the correct device. Wow! I had been looking for so long, and just like that, this man gave me the answer! I thanked him profusely and told him he had a good memory. "Nah" he said, "I just asked AI."

That instantly deflated me. This guy really hadn't done anything other than forward my question, and that elusive keyboard of mine was one question away in some search engine. I should have been excited to finally find out what it was, but somehow instead it felt hollow and totally pointless all of a sudden.

Fast forward a few years, the same thing happened with my builder the other day: I asked him if he could build a boardwalk in the backyard, so my disabled wife could go get some fresh air safely. I started explaining what I wanted and sketching things on a piece of paper. At some point, he simply got to his feet, whipped out his cellphone, shot a picture of the backyard and asked ChatGPT to draw a boardwalk to my specifications. And in 2 seconds flat, it came up with a photo of the finished thing. And for the second time, it felt totally hollow, and the whole project felt meaningless. It was what I wanted for sure, but it's not how I wanted it - if that makes any sense.

And today at work, I was playing with a model on a powerful server we just bought, to evaluate how to use AI locally for coding purposes, and at least avoid running cloud services from fascist America. And yeah, it works: it produces code that, if not very good, is plenty good enough if I'm very careful, particularly for the amount of time it takes to spew out the code.

And it felt completely, utterly hollow.

And then it hit me: my job as a traditional software engineer is coming to an end. I'm very senior, so I'll be among the last ones on the chopping block, but I'll get the chop alright. And it's okay: I've been obsoleted. It's fine. It's progress. It came quicker than I anticipated, but fundamentally I have no issue with that.

But here's what's bothering me: there's no effort needed for anything anymore. You get what you want with zero effort and zero sense of accomplishment. So what's the point of anything really?

I'm a lazy SOB, so I love the idea of getting shit done without lifting a finger. But somehow everything has become so effortless that it leaves me empty. Kind of like asking a computer to solve a crossword: it'll be done in less than a second and it'll be super efficient, but it makes the crossword pointless.

For instance, I'm a fan of 360 photography. I take pride in reworking the nadir in all my shots (don't ask...). AI could clean it up in seconds and probably do a better job than I do with Gimp. But what's the point? I don't want to do that! It's tempting, but then I'd be totally disinterested in the photo after AI is done cleaning it up. So I do it myself.

Likewise, if I have a technical problem, I'll look for the answer the traditional way, with a search engine. I know I could probably ask Google's AI thing and it would probably give me the right answer right away, but then what would I have learnt? And more importantly, what's the point of learning anything if the answer is always there? So I refrain. And yet it's tempting...

AI makes everything pointless and bland, and it leaves me empty and not wanting more.

The other thing is this: I know people say our jobs won't disappear, they'll change: we won't code anymore, we'll direct a machine that codes for us. We'll apply our experience to guide a machine that will do the grunt work for us. It's like we'll all have the equivalent of a very dumb university student with the entire world's knowledge at their disposal, pissing code for us and talking too much, but never complaining.

But you know what? I'm not interested in doing that. It's fucking boring to me - the same way I would find my professional life completely boring if I had an actual university student code monkey I could legally give orders to. I simply have no interest in doing that.

I do marvel at the technology that underpins AI, and what it can do is fascinating - even in its current, completely fucked up and dumb-as-a-brick state. I did AI work in the 80s, so I appreciate how far we have come, and I never thought I'd see anything like that in my lifetime. And I totally get that people are excited by the technology. But me? I find the whole thing intensely uninspiring. Impressive, yes, but boring to tears.

Just like the proverbial horse carriage driver, I've been obsoleted by cars. And again, it's fine: it's progress. I get it. And yes, as a good, experienced horse carriage driver, I could probably recycle my career and become a decent car driver for the few years I have left on the job. But I'm simply not interested. If the horses are gone, I'm plain and simply bored out of my mind, however good the cars are.

That's my beef with AI. That's what I realized today: it's a bright future I'm utterly indifferent to, that holds zero excitement for me.

And quite frankly, considering the tidal wave of it that's hitting the entire world right now, it left me quite depressed for the rest of the day. So I hit the bottle tonight. It's one of those days I guess...

[-] rosco385@lemmy.wtf 27 points 6 days ago

The problem with AI is that it allows the wealthy to access skill, while not allowing the skilled to access wealth.

It's just another way for the rich and powerful to control access to the entire world. If we continue down this path the end result will be a world like that depicted in Elysium starring Matt Damon.

[-] ExtremeDullard@piefed.social 17 points 6 days ago

The problem with AI is that it allows the wealthy to access skill, while not allowing the skilled to access wealth.

I'm stealing that quote.

[-] rosco385@lemmy.wtf 10 points 6 days ago

Go for it, I stole it from someone else. 😂

[-] snoons@lemmy.ca 24 points 6 days ago

I recommend reading Look to Windward by Iain M. Banks. One of the themes overarching the whole story (or at least part of it?) is basically your post:

A famous composer is creating a concert for an important event. The "AI" (AKA a Mind that would be offended to be called an AI) that controls the habitat he lives in could effortlessly compose a similar piece of music in seconds that would be technically more impressive; however, the audience doesn't just want to listen to music, they want connection too. When they listen to the famous composer's music, they are also connecting with him and the hours he spent every day for months and years creating it. The Mind, with all its capacity and unknowable technology, cannot hope to replicate that.

That hollow feeling is the lost connection of solving a problem with another human. It's the same as asking someone for directions and they pull out their phone without bothering to actually think about it. When you ask someone if they know something and they "search it up for you". It's the death of community.

I started reading the Culture novels recently and have been quite enjoying them. Even though technosocialism and intelligent-machine utopianism haven't really appealed to me IRL, I think the way he uses them as literary devices to highlight the human (or pan-human/post-human as the case may be) stories is actually really brilliant. His prose is also very highly regarded for its beauty, but that takes a back seat to the stories for me.

[-] snoons@lemmy.ca 2 points 6 days ago

Somewhat the same. Humans in the Culture are akin to pets for minds, a bit like how we take care of dogs and cats. Doesn't entirely sit right with me, but the stories and message are sublime.

I find it funny, in that context, how the human characters in SC almost universally deride the Minds and drones as machines, scrap heaps, buckets of bolts, etc. at some point. Like, they recognize fully that their way of life is 100% dependent on the Minds, but in no way does that mean they deserve respect the way a human would. In the same chapter you'll have a description of a drone's personality or how it does something very human, like keeping secrets for fun or whatever, then you'll get a human character outright calling the drone an annoying pile of gears or describing how it's social suicide to have a drone follow you around all the time.

It's a very different context from what we know of as AI, while at the same time mirroring how many people feel about AI today. I've only read the first few Culture books, but it hits that wonderful mix of fantastical sci-fi that's still very relatable on the human element.

[-] Today@lemmy.world 22 points 6 days ago

Totally get it!

We are remodeling a house and my husband keeps showing me AI'd options - different countertops, etc. I hate it. I gave in and had Gemini draw a layout for a shed that we want to build, and I feel sickeningly guilty for it.

There's too much emphasis on the intelligence; not enough on the artificial.

[-] cecilkorik@piefed.ca 17 points 6 days ago

I've been thinking a lot about this too, and I've started thinking of it in terms of quality vs quantity, and how that relates to industrialization, commercialization and productivity vs handcrafting, art and joy, and how all these things are tied together into the things we value and the things we don't value.

AI, if it lives up to its promises (which I think it won't in the long run, but for the sake of argument let's assume it does), may produce many things in extraordinary quantity and sufficient quality to be commercially successful. This will likely become the standard way of acquiring most things we don't care deeply about, just like cheap plastic crap replaced wood and metal toys. However, not all cheap plastic crap is actually crap; some of it is genuinely good. And plenty of wood and metal crap was made too: some of it was very valuable, a lot of it was not. Value was dictated by rarity and complexity, but it also reflected how much we appreciated a thing, despite or because of its quality.

There will always be things we do care deeply about even when it doesn't make sense to, and it frees us to make more conscious decisions about the things we decide to care deeply about. It will not stop people from caring deeply about certain things, it will not stop enthusiasts from demanding better quality. The specific things people care about will almost inevitably change, becoming narrower in some cases and broader in others, or shifting directions entirely. Some things we used to have to care about the quality of may no longer be so important to us. But with every closing door another one opens; we will find something new we very much do care about the quality of, and it may even be something better and more meaningful.

If you're not following my pretty abstract ideas, let me make it more specific. Timekeeping used to be an incredibly complex and labor-intensive task: an institution like a church would have an entire staff with various tools to keep track constantly, then ring bells to let everyone else know what time it was. That work eventually became a machine, though still one requiring significant effort to construct and maintain. Towns would build one clock tower to serve the entire population at once; only very rich people could afford their own. Each one, large or small, was by modern standards a hand-crafted work of art. Gradually, that changed. Clocks became more common, cheaper in cost, and cheaper in quality too, but they were still significant pieces in a household, designed to look nice and placed in prime locations. Then they got replaced by more portable pocketwatches, then wristwatches, then digital alarm clocks and digital sports watches, each iteration cheaper and junkier than the last, and now we're at the point where most people just glance at the screen of an omnipresent smartphone that does a million different things at once, few of which anyone actually cares deeply about.

Lest you think this implies a depressing inevitability of decay and obsolescence that AI poses, yes I think there is potentially some of that ahead, and people often generalize that as "dying" and "loss". But if you continue the clocks analogy, there are still people who value clocks, and even make clocks, in all their sizes and forms. As antiques or replicas, as works of art, as pieces of nostalgia, as history, as things of beauty and practicality, as things of style and culture. From the outside it's easy to look at what's left of the clock industry as something small and sad and dying, but from inside it doesn't feel like that at all. It feels like joy and respect and appreciation and genuine enthusiasm. That is what we should be pursuing. The experts in their craft still have jobs, they might even enjoy their jobs more, crafting beautiful and skilled works of art to be sold to genuine enthusiasts that appreciate and value them instead of pumping out quantity and more quantity. There are a lot fewer of them, certainly - the experts and the enthusiasts and the clocks themselves - compared to the ones who used to be on every main street or high street and the market that used to include everyone and the clocks that used to be in every home and every pocket and every wrist. To capitalism, that means it's "dying", not worth investing in, belongs in the dustbin of history. To the people who choose to value it though, they don't (and shouldn't) give a toss what capitalism thinks. We value things as art because they are meaningful to us and they bring us joy.

A more modern and directly relevant example is video games. Gamers have, by and large, rejected all AI slop they can identify. This is partly framed as a moral choice yes, but it's also a practical one. AI games aren't likely to be as fun or valuable, and people are generally quite particular about the quality of the games they play, at least the ones without manipulative marketing, monetization and social aspects which is a whole separate can of worms. But no AI is ever going to create Baldur's Gate 3 (or 1 or 2 for that matter). People enjoy it for its quality, for the thought and effort that went into it. Even if AI did create the same thing, simply knowing that it was AI or that AI was used would devalue it. Because the value of such things is not purely utilitarian. Morals and quality do apply very significant premiums to value, especially in certain areas like entertainment.

When it comes to things like entertainment - and really everything we choose to enjoy - we actually appreciate them for the effort involved, not the end result. The end result is merely a symbol of the effort. We appreciate our child's dance recital not because it rivals a Broadway performance but because it is meaningful to us and we recognize our child's effort. And we likewise appreciate a Broadway performance because of our ability to acknowledge the effort the performers and stage managers and lighting technicians and everyone else took to get it to that level. AI doesn't spark the same recognition of effort, because there is no effort. It is productive, but without effort it is not meaningful. It will not replace meaningful things, it will only replace things that are already pretty meaningless, that we are only still appreciating because we still recognize and appreciate the effort involved. When there is no more effort involved, we will not need to continue appreciating them, and many people will focus on appreciating more important things instead, and I want to believe that will include the education and the arts and the culture that actually matter to develop ourselves and our society and bring value to our future, instead of the technology and tools that simply exist to make those things happen.

If you think high-quality software is something people won't continue to value far into the future, AI or not, then I think you'll be pleasantly surprised. You might have to pivot a little. You might have to learn how to take advantage of some AI to help with more of the scaffolding and templating and refactoring than you are used to doing, or you might have to move to a more discerning product category like entertainment software where AI is explicitly rejected, but I am confident that your expertise will probably remain quite valuable somewhere as long as you can remain flexible enough to find a new niche that fits your goals.

Change is scary, but it's not all bad. It's just change. The change is almost certainly going to happen, but it will be as good or as bad as we decide to make it. We're not doing a good job so far, but not all hope is lost. We all need to start to think about what we really value in life, and why, and the answer to what AI actually replaces will come from those collective decisions.

[-] kek@discuss.tchncs.de 5 points 6 days ago

Lovely take, thanks for sharing.

[-] logi@lemmy.world 11 points 6 days ago

But you know what? I'm not interested in doing that. It's fucking boring to me - the same way I would find my professional life completely boring if I had an actual university student code monkey I could legally give orders to. I simply have no interest in doing that.

That is my exact sentiment. That's not a job I'm interested in. I'd have become management by now if that held any interest.

Also, at least so far, the LLMs really like to write very subtly wrong, weirdly over-complicated code in large quantities. So now you're wading through all that looking for bugs and wondering whether to (make the LLM) simplify it down to something that can be maintained (by the LLM).

It sounds dreary. Is it too late to switch to geophysics? They get to do fieldwork.

[-] Tore@piefed.world 10 points 6 days ago

I'm an Art Lead for a major game publisher. I've been asked to review AI tools and investigate how we could integrate them into our pipeline.

I basically felt the same thing you described. I did my job but it felt completely empty. No sense of accomplishment. It was as if I had a mindless coworker. Finishing the project felt like eating empty calories. I was stuffed but malnourished.

[-] gdog05@lemmy.world 13 points 6 days ago

There is value in the effort and challenge. There was recently a YouTube video about this, I forget by whom, that made the point really well. I'm as big into tech as anyone I feel. I love automating my home. I love the convenience of lights just coming on when you need them or fans or AC running exactly when the temperature dictates for efficiency.

I started self hosting and I stream my own music to everything. But I have found that my older Ford Focus having just the right song on my extremely large but definitely limited USB drive is so much more gratifying than streaming. Even over my own streams. I get why people are going back to CDs and cassettes. Well, kind of.

AI is also making the wrong things easy. I'm sure everyone has seen the meme about AI doing the writing and art when what you really want is something to do the cleaning so you have time to make art. That's really it. We need some form of struggle to make us creative and skillful. Not to mention, the way the technology jumped onto the scene has prevented the more creative people from finding the best use cases for it as we go along. The purveyors of AI are just the most ambitious (greedy), not the most creative or talented. So the good implementations are making slow progress at the hands of the people creatively applying it. Meanwhile, the ambitious are using AI to improve AI so they can corner pieces of the market, which cuts off that real-world creativity.

[-] haverholm@kbin.earth 8 points 6 days ago

I completely agree with all your points. Well, I've been going back to analog music, vinyl and CDs, and I don't think the timing is coincidental.

We are hardwired to come up with solutions to things; it satisfies us immensely because we get an endorphin rush. We don't get that from prompting a bot* to do what we used to do. Some people may get it from having the "AI" make things they couldn't on their own — but I bet they aren't capable of estimating the quality of the generated work in comparison to something manually created.

There are two things missing with "AI": the sense of craft on the maker's end, and an understanding of the same craft on the user's (or prompter's) end. In the end it's a lack of a human touch.

* had to go back and edit this from "not", because the one thing we could use "AI" for, spell check, is apparently the one thing that isn't being upgraded.

[-] highrfrequenc@lemmy.world 6 points 6 days ago

There was a stand up bit I remember from many years ago, before AI was a thing. It was about how having a search engine in our pockets has ruined human experiences. If you ask a friend a question it's like, I don't know, but I will in five seconds.

It used to be if you didn't know something... You just didn't know. Like listening to music and suddenly thinking... where is Tom Petty from? And you would ask your friends and they didn't know. And you'd go through life just occasionally wondering. And years later one day you would see a stranger walking up the street in a Heartbreakers T-shirt. And you would introduce yourself and ask where is Tom Petty from? And they would say Florida. And after so many years of wondering, a wave of emotions would come over you. And you ended up marrying that person.

That's what smart phones have taken from us.

It was a very funny bit and I feel like AI is that, taken to extremes.

[-] Ansis100@lemmy.world 2 points 5 days ago

I've actually made it a point during conversations to not google these types of questions, and asked others to do so too. If we're having a conversation, I'd much rather discuss or speculate on the topic than just straight up Google it.

It sounds weird - let's just talk about it rather than finding out the exact, precise answer - but that's what makes a conversation fun.

[-] AngryRedHerring@lemmy.world 4 points 5 days ago

AI can't build that deck, though. And it can't have a beer on it when it's built.

[-] jacksilver@lemmy.world 3 points 5 days ago

I also don't think it would be able to generate a valid architectural diagram. Like a mock up is one thing, but that doesn't mean it's actually viable or a good design.

[-] Longylonglong@sh.itjust.works 2 points 5 days ago

Human optimization and progress will lead us to AI powered robots doing that.

[-] AngryRedHerring@lemmy.world 1 points 5 days ago

don't you threaten me

[-] pro3757@programming.dev 8 points 6 days ago

It's very similar to what I've been feeling.

Casey Muratori said the same thing in a video some time ago (paraphrasing) -
I don't like using AI not because I hate the concept, but because I don't want it to take over the part that I want to do.
Some people get joy from seeing their ideas come to life regardless of the process. They are the ones excited about AI.
Programming for me is not about getting the end result, but about the process of solving the problem. And people who like that part are the ones who don't like AI much. If I like playing drums, I want to play the drums, not get a drum machine that may play 10x better than me.

Tangential question - Is it possible to feel some way for a period of time and realize what it is (or close to what it is) after it has been articulated by someone?

Or better yet, can the brain, with its subconscious bias, trick you into believing like you've been thinking the same thing after someone has said something that broadens your horizons?

I never realised this until I saw the video, and then felt guilty that I may be "stealing" the reason from someone because I liked what he said. Video link if anyone's interested - https://youtu.be/suZ2Gt6i8do .

[-] GreenKnight23@lemmy.world 6 points 6 days ago

the fact that's basically owned and run by nefarious surveillance capitalists in bed with fascists,

welcome to the team!

AI is fascism.

[-] northface@lemmy.ml 8 points 6 days ago

Thank you. I've never read a better description of the feelings I've had the past few years with LLM "booming".

[-] foodandart@lemmy.zip 8 points 6 days ago* (last edited 6 days ago)

So what's the point of anything really?

Once the challenge of life is gone, so will be the drive to push oneself. Ennui is poisonous.

This is why I refuse to drive an automatic transmission car: There are things that should be a bit difficult in one's life. They keep one invested in growing and meeting the challenges of the day..

Also, having seen videos of 11 and 12 year old kids driving stolen cars, uhhh.. yeah. No. Fuck that shit. I want NO part of anything that infantilizes one's skills and diminishes one's experience..

When you can learn to use a clutch you are in a vehicle that most children can't operate. Perfect.

I view AI as a similar infantilizing device. Thanks, no.. have a good day, I'll work this out on my own and have a legit sense of accomplishment when I'm done.

[-] NocturnalMorning@lemmy.world 1 points 6 days ago

The comparison to a manual vehicle is kind of odd. But, I get the sentiment.

[-] foodandart@lemmy.zip 2 points 6 days ago* (last edited 6 days ago)

You are correct, it is a bit odd, but it comes down to the same thing: being in control of one's device and using it to the best of one's ability.

Plus, there's too much erroneous data in AI. I dabble with classic Mac systems and have tons of the old PPC and 68k code and at one point was looking for the last available copy of Winamp for Mac.. and the AI results kept telling me no such thing existed, AS I was staring at it running in a VM on my system.

I finally did find the last version available but had to go to several different sites to find it.

NGL, tenbluelinks.org has been a godsend to get the AI results muzzled on my searches.

[-] technocrit@lemmy.dbzer0.com 4 points 5 days ago* (last edited 5 days ago)

The actual problems with "AI" are like grifting, pseudo-science, war, surveillance/spying, prison, environmental degradation, extreme inequality, etc.

If your beef is that "AI" sucks your life for the profit of others, that's all of technology under capitalism.

[-] Trainguyrom@reddthat.com 5 points 6 days ago

Setting aside the technical discussion, one of the challenges you've expressed is how tools leveraging AI can do passable work without the one building it needing to do anything.

The fact is, humans are wired to work for that dopamine hit of "I want to do this", "I'm now working on this", "I've finally done this!" - and if you skip that middle step and don't put enough work in, you quite frankly end up feeling very deflated by the end result, no matter how much it matches what you wanted at the beginning.

From what you wrote it sounds like you're passionate about technology and AI is sucking the wind out of your sails. Honestly whether or not AI is here to stay, I'd say find a hobby involving making something in the real world because that can be what keeps you happy. Let work be work, roll with whatever the flow is at work and in the world that you can't redirect, but focus on making you happy

Like, I've been enjoying working on my model railroad. Seeing the little trains of yore trundle on their little tracks makes my brain happy. But what really makes me enjoy it is all of the work that goes into it: carefully curving the flex track, trimming it to length, then re-working and refining it so the trains run better. Superelevating the curves so the trains do that wonderful little tilty thing as they go around the sharp curves and theoretically derail less. Deep diving into overly-complicated wiring purely because I can! Avoiding pre-manufactured solutions for ones I can manually customize to my specific needs. Hand cutting, gluing and painting plastic card to make buildings. It's putting hours of time into building something and making it truly yours, which is a joy that no AI can ever take from you!

[-] ZDL@lazysoci.al 5 points 6 days ago

You know, as someone who despises LLMbeciles in almost every field they get applied to (I think once the hype dies down and the companies pushing them collapse, useful things will come from them, just not in the near future), I find myself in the uncomfortable position of disagreeing with someone's reasons for hating AI.

And note: I'm not saying you shouldn't hate AI, nor am I saying that your reasons aren't good reasons for you. I'm saying that your reasons may be symptomatic of something else; that your reasons have been said before about other technologies we all now take for granted, and that in a couple of places you're being extraordinarily self-absorbed.

Let's go over a few of these.

That instantly deflated me. This guy really hadn't done anything other than forward my question, and that elusive keyboard of mine was one question away in some search engine. I should have been excited to finally find out what it was, but somehow instead it felt hollow and totally pointless all of a sudden.

To me this is utterly bizarre. Instead of feeling excited that something you'd looked for for ages was found, you got upset at how it was found. I'm sorry, I can't relate to this at all. I'm not saying you didn't feel this way nor even that you should feel this way, but to me it looks like you value years of frustration and failure over getting the thing you were looking for. This is hair shirt territory for me and I suspect for most people. Some of the other things you mention (cf. below) I have some comprehension of and even sympathy for. But this one just strikes me as a low-grade form of masochism.

To see why, dial this conversation back to pre-1998 when search engines REALLY sucked (just like now!) and before Google came in and completely, radically changed the landscape (your searches actually FOUND things before Google enshittified!).

You've been searching for your keyboard on Ask Jeeves and Yahoo and couldn't find a thing. Then Google pops up and someone Googles it for you and up pops your keyboard. Are you similarly upset? If not, why not?

Fast forward a few years, the same thing happened with my builder the other day: I asked him if he could build a boardwalk in the backyard, so my disabled wife could go get some fresh air safely. I started explaining what I wanted and sketching things on a piece of paper. At some point, he simply got to his feet, whipped out his cellphone, shot a picture of the backyard and asked ChatGPT to draw a boardwalk to my specifications. And in 2 seconds flat, it came up with a photo of the finished thing. And for the second time, it felt totally hollow, and the whole project felt meaningless. It was what I wanted for sure, but it's not how I wanted it - if that makes any sense.

Here's where I think you're being extraordinarily self-absorbed. Sure, you may enjoy tinkering around with paper sketches, talking back and forth, and generally puttering around with manual processes. (I know I do. I mean I'm learning to hand-carve signature chops in a world where I can give a manufacturer an image file, pay under five bucks, and get back a custom signature chop made of any material suitable from wood to stone to brass that is done to perfection on a CNC. But I still prefer to learn carving them by hand with manual chisels.)

But here there's more than you in the loop. There's the builder. Whose livelihood this is. They're not going to want to sit there and futz around with your fetish for hand-drawing sketchy diagrams (pun intended) when there's a tool that gets exactly the information he needs to actually perform his livelihood in seconds. To the builder time is money, and you're in effect bemoaning that he doesn't want to waste time/money to suit your preferences.

Genuine, if pointed, question here: where do you draw this line? If you insisted on, say, measuring everything with a 30cm ruler and he pulled out a tape measure would you be just as upset? Or does that sound ridiculous to you? If the latter, do some simple substitution and...

You are basically demanding here that someone who doesn't share your fetish for hand-made stuff take time out from his livelihood to suit you despite them being entirely uninterested in it. This goes beyond having a preference for yourself and into the realm of inflicting that preference on others; and here I sharply disagree.

This then leads to these ruminations:

For instance, I’m a fan of 360 photography. I take pride in reworking the nadir in all my shots (don’t ask…) AI could clean it up in seconds and probably do a better job than me with Gimp. But what’s the point? I don’t want to do that! It’s tempting, but then I’d be totally disinterested in the photo after AI is done cleaning it up. So I do it myself.

Likewise, if I have a technical problem, I'll look for the answer the traditional way, with a search engine. I know I could probably ask Google's AI thing and it would probably give me the right answer right away, but then what would I have learnt? And more importantly, what's the point of learning anything if the answer is always there? So I refrain. And yet it's tempting…

AI makes everything pointless and bland, and it leaves me empty and not wanting for more.

Absolutely nobody is telling you to use AI for your own stuff. (Or, rather, nobody whose opinion is worth hearing is. LLMbecile pushers are not people whose opinions are worth hearing.) You can continue doing things the way you enjoy doing them. Like I enjoy carving signature chops by hand. Like I genuinely enjoy writing with a dip pen. Like I enjoy running tabletop RPGs in person instead of playing computer games or using online tabletops. And how I like designing RPG scenarios myself instead of asking an LLMbecile to make a half-assed one, but with lots and lots of fancy verbiage, for me.

You're coming across here like, say, a painter saying "I don't want to use a Kodak camera!" in 1888. Or like a photographer saying "I don't want to use a Land camera!" in 1948. Nobody says you have to, just like nobody told the painter they had to become a photographer or told a photographer they had to put up with crappy, low-quality snapshots. LLMbeciles don't change anything for you unless you want them to. (Or unless they're forced upon you by your employer, which is also relatable and something I'd agree with opposing.)

(Also I find it really funny that you're opposed to using LLMbeciles for search but not opposed to using search engines, when in my lifetime there were people sneering at using search engines instead of doing proper research in libraries. "Google-U" was a real insult hurled at people who Googled stuff instead of doing "proper research".)

That’s my beef with AI. That’s what I realized today: it’s a bright future I’m utterly indifferent to, that holds zero excitement for me.

But ... it's not for you. Even if LLMbeciles weren't a hot mess of terrible ethics, fascist creators, IP theft, and hallucinations, even if they were actually as useful as the pushers claim they are, it's clear that LLMbeciles aren't for you. And I can respect that.

But ... I'm entirely unexcited by video games. Should I be writing long opinion pieces on why video games are bad because they don't suit what I want from games? I'm entirely unexcited by automobiles. Should I be writing long opinion pieces on why gearheads are missing the point of life?

I guess I'm trying to say that I find the nature of your objection really ... weird. In the "old man yells at cloud" sort of way.

Have you considered just not using LLMbeciles instead of letting them drain the joy from your life and damaging your liver?

Something that's obvious to me is that OP feels alienated by this technology because the interpersonal connections they would otherwise be making are now being supplanted by, I dunno, NPC people who just act as interfaces for the same robot they could be using themselves.

You know, I do feel for people who lament the proliferation of self-checkouts and kiosks and tablets and mobile apps that make it easier than ever to never talk to another real fucking person in their lives. And I do think it's bad for our collective mental health, too.

There's the builder. Whose livelihood this is. They're not going to want to sit there and futz around with your fetish for hand-drawing sketchy diagrams

Another thing that's obvious to me: you don't view the builder as an artist. An artist who sacrifices their artistic integrity to this hollow, AI-driven proceduralism is not really an artist I want to hire.

If OP just wanted the boardwalk and nothing else, this probably wouldn't be an issue. But there is something missing here that you're currently blind to because your antipathy for 'LLMbeciles' is not coming from a pro-humanist perspective.

[-] thedeadwalking4242@lemmy.world 3 points 5 days ago

My biggest problem is that LLMs have mediocre quality at best. It's definitely a different world. But probably less different than you think.

Traditional forums, search engines, and documentation all still have their place. Now it's just all that + LLMs. It's an additional tool, but it doesn't replace the developer in the slightest.

[-] Darkassassin07@lemmy.ca 4 points 5 days ago

It's an additional tool but it doesn't replace the developer in the slightest

That's the thing though; it should be an additional tool, but instead is being used as a replacement.

I started joking to a few friends that AI software development will turn me into a slop-jockey. I started using AI tools at work to keep my boss happy, and I could immediately feel my own indifference to the code. But then, when did I ever really care about the code? Another CRUD service, another data processing pipeline, another web app that's a glorified redis cache... it's all the same anyway.

So I decided that work can force me to use whatever tools they want on their dime, but my time is for me. I started new hobbies where the goal is to learn by creating: Learn the fundamentals, work, identify a problem, research how people solve it, work again, etc. It's been slow, but I find I have stuck with these projects for much longer than I would have in the past. It feels like, if I'm going to spend all this time learning, I want something to show for it. To be able to take what I create, share it with people who care, and talk intelligently about what decisions I made and why.

A result of this is that I'm (mostly) able to let go of my concern about where the profession is headed.

[-] E_coli42@lemmy.world 5 points 6 days ago

To play devil's advocate, no one is forcing you to use AI for the things in life you want to do for fun. AI is a tool used to reduce the need for labor. You can still do your crosswords by hand, but let the machine do the ones you don't want to do.

[-] DJKJuicy@sh.itjust.works 4 points 6 days ago

But that's the point though. Progress makes things easier. That's the point of all civilization, to cooperatively make things easier.

When you eat an apple are you angry you didn't grow the apple? When you buy a car are you angry you didn't build the car yourself?

The problem is going to be how we handle this next step in human progress. Does it free us up to reach for even higher pursuits? Cancer hasn't been cured yet. There are still homeless people. There are still humans on this planet who die of starvation.

If the work that's required to keep our civilization humming has just been reduced by AI, then we've freed up a ton of human resources to work towards making humanity even better. We now have humans whose output can be reallocated towards solving even bigger problems. Is that what's going to happen? Or will we descend into a dystopia where all the potential of human civilization is squandered for profit?

[-] thatsTheCatch@lemmy.nz 9 points 6 days ago

Technological progress alone (AI included) is not going to solve poverty, homelessness, starvation, inequality, etc.

We have the ability to solve those now. We already know how to solve those problems (no AI needed). But as long as we have capitalism and hierarchies, much of humanity's labour is going to be spent toiling away at meaningless jobs instead of improving the human condition.

We are in the dystopia you fear we'll descend into.

[-] RaoulDook@lemmy.world 4 points 6 days ago

Right, those problems are easily solved with money (other than cancer). But the rich would rather blow away fortunes of cash chasing stupid robot dreams

[-] haverholm@kbin.earth 7 points 6 days ago

If the work that's required to keep our civilization humming has just been reduced by AI then we've freed up a ton of human resources to work towards making humanity even better.

Uh-huh. Except those human resources aren't "freed up", they're now unemployed. Unless somehow the profits from "AI" flow from the super rich into a UBI (hint: the money never flows from the super rich), that's not an improvement for humanity as a whole.

[-] DJKJuicy@sh.itjust.works 4 points 6 days ago

I agree. Humans have the capacity and the skillset to solve almost every problem on this planet. All of it.

But we won't because we're too busy chasing government credits and trying to figure out who "deserves" food and healthcare and education.

[-] haverholm@kbin.earth 2 points 6 days ago

I wanted to say that this is a specifically US perspective, but European welfare states are really catching up (ie, "down") in a big way.

[-] G_M0N3Y_2503@lemmy.zip 1 points 6 days ago

I think this depends on the speed of AI. If it's a slow rollout and unemployment/poverty is a slow boil over decades, the fallout could probably be kept a manageable minority by the "super rich". But if it happens quickly, like the hype would have you believe, well, that's a lot of unhappy people with not much to do or lose, which is quite a lot of bargaining power.

[-] haverholm@kbin.earth 2 points 5 days ago

I was never worried about "AI" being destructive or taking jobs; it's not about the efficiency of the technology. If that were the case, nobody would have anything to worry about. No amount of computing power can get these overrated Clippys to do an actual person's job.

The problem is the decision makers who buy the hype, and the power they wield over other people's lives. "AI" doesn't have to be able to replace human labour for thousands to lose their jobs to it. All it takes to do that is a handful of CEOs believing that "AI" can save them money.

So no, I do not think this has anything to do with "the speed of AI". It's about pennypinching employers being suckered by the "AI" hype. But workers get sacked either way.

[-] Einskjaldi@lemmy.world 2 points 6 days ago

This is Dune, where they chose not to use computers at all and to force the human mind to develop instead.

[-] discocactus@lemmy.world 1 points 5 days ago

The solution to this is to become interested in and occupied by something that AI can't do. Like shower tile.

[-] grrgyle@slrpnk.net 2 points 5 days ago

Or just do it yourself to spite generated solutions

Like computers are great at playing games but we haven't stopped doing that

[-] Aralakh@lemmy.ca 1 points 6 days ago

This is a great write-up, thanks for the effort of sharing your thoughts - I've felt the same. I'd highly recommend reading The AI Mirror by Shannon Vallor, which does a great job of addressing the feeling from a virtue ethicist's perspective, and is just a solid intro to the whole AI machination.

[-] maegul@lemmy.ml 1 points 6 days ago

Generally, I’m completely with you.

The questions this prompts for me …

Are there limits to what technologies can be aligned with a “healthy” human life and society?

I’m inclined to think so, which, if true, means that steering technological progress toward what’s “healthy” would totally make sense.

How adaptable are people over time/generations, such that they can naively learn to tolerate poorer forms of society and technology? I'd say a lot, which makes the former question slippery. But, if true, it suggests that maximising society should involve more experimentation and exploration over shorter inter-generational timelines …?

How privileged are we in this outlook of yours, in being accustomed to controlling the solution and work from conception to materialisation? A workforce of automation supervisors is maybe both viable and natural under capitalism (however dark)?

this post was submitted on 07 May 2026
139 points (94.8% liked)