Cursor's cofounder jumped in to try to quell the outrage. He is not pulling it off.
I've repeated this prediction a bajillion times, but I suspect this bubble has discredited the idea of artificial intelligence, and I expect the concept to quickly die once the bubble bursts.
Between the terabytes upon terabytes of digital mediocrity the slop-nami's given us, LLMs' countless and relentless failures in logic and reason, the large-scale enshittification of daily life their mere existence has enabled, and their power consumption singlehandedly accelerating the climate crisis, I feel that the public's come to view computers as inherently incapable of humanlike cognition/creativity, no matter how many gigawatts they consume or oceans they boil.
Expanding on this somewhat, I suspect AI as a concept will likely also come to be seen as an inherently fascist concept.
With the current bubble's link to esoteric fascism, the far-right's open adoration of slop, basically everything about OpenAI's Studio Ghibli slopgen, and God-knows-what-else, the public's got plenty of reason to treat use or support of AI as a severe indictment of someone's character in and of itself - a "tech asshole signifier", to quote Baldur Bjarnason.
And, of course, AI as a concept will probably come to be viewed as inherently anti-art/anti-artist as well - considering how badly the AI bubble's shafted artists, and artists specifically, that kinda goes without saying.
Starting things off here with a couple solid sneers of some dipshit automating copyright infringement - one from Reid Southen, and one from Ed Newton-Rex:
I do feel like active anti-scraping measures could go somewhat further, though - the obvious route in my eyes would be to try to actively feed complete garbage to scrapers instead - whether by sticking a bunch of garbage on webpages to mislead scrapers or by trying to prompt inject the shit out of the AIs themselves.
Me, predicting how anti-scraping efforts would evolve
(I have nothing more to add, I just find this whole development pretty vindicating)
It was a pretty good comment, and pointed out one of the possible risks this AI bubble can unleash.
I've already touched on this topic, but it seems possible (if not likely) that copyright law will be tightened in response to the large-scale theft performed by OpenAI et al. to feed their LLMs, with both of us suspecting fair use will likely take a pounding. As you pointed out, the exploitation of fair use's research exception makes that exception especially vulnerable to repeal.
On a different note, I suspect open licenses (Creative Commons, the GPL, etcetera) will suffer a major decline in popularity thanks to the large-scale code theft this AI bubble brought - after two-ish years of the AI industry (if not tech in general) treating anything publicly available as theirs to steal (whether implicitly or explicitly), I'd expect people are gonna be a lot stingier about providing source code or contributing to FOSS.
Neil Turkewitz coming in with a wry comment about AI's legal issues:
And, because this is becoming so common, another sidenote from me:
With the large-scale art theft that gen-AI has become thoroughly known for, how the AI slop it generates has frequently directly competed with the original work (Exhibit A), the solid legal case for treating the AI industry's Biblical-scale theft as copyright infringement, and the bevy of lawsuits that can and will end in legal bloodbaths, I fully expect this bubble will end up strengthening copyright law a fair bit, as artists and megacorps alike endeavor to prevent something like this ever happening again.
Precisely how, I'm not sure, but to take a shot in the dark I suspect that fair use is probably gonna take a pounding.
In other news, an AI booster got publicly humiliated after prompting complete garbage and mistaking it for 8-bit animation:
And now, another sidenote, because I really like them apparently:
This is gut instinct like my previous sidenote, but I suspect that this AI bubble will cause the tech industry (if not tech as a whole) to be viewed as fundamentally lacking in art skills/creativity, if not outright hostile to artists and incapable of making (or even understanding) art.
Beyond the slop-nami flooding the Internet with soulless shit created directly because of tech companies like OpenAI, it's also given us shit like:
- Google's unholy 'Dear Sydney' ad, and the nuclear backlash it got
- Apple crushing human creativity for personal gain and being forced to apologise for it
- Mira Murati openly shitting on artists as gen-AI steals their artwork and destroys their livelihoods
- Gen-AI boosters producing complete shit and calling it gold (with Proper Prompter and Luma Labs providing excellent examples)
- And so much goddamn more, most of which I've likely forgotten
New piece from Brian Merchant: Yes, the striking dockworkers were Luddites. And they won.
Pulling out a specific paragraph here (bolding mine):
I was glad to see some in the press recognizing this, which shows something of a sea change is underfoot; outlets like the Washington Post, CNN, and even Inc. Magazine all published pieces sympathizing with the longshoremen besieged by automation—and advised workers worried about AI to pay attention. “Dockworkers are waging a battle against automation,” the CNN headline noted, “The rest of us may want to take notes.” That feeling that many more jobs might be vulnerable to automation by AI is perhaps opening up new pathways to solidarity, new alliances.
To add my thoughts, those feelings likely aren't just that many more jobs are at risk than people thought, but that AI is primarily, if not exclusively, threatening the jobs people want to do (art, poetry, that sorta shit), and leaving the dangerous/boring jobs mostly untouched - effectively the exact opposite of the future the general public wants AI to bring them.
There's gonna be laws passed as a result of this - calling it right now.
Not a sneer, but an observation on the tech industry from Baldur Bjarnason, plus some of my own thoughts:
I don’t think I’ve ever experienced before this big of a sentiment gap between tech – web tech especially – and the public sentiment I hear from the people I know and the media I experience.
Most of the time I hear “AI” mentioned on Icelandic mainstream media or from people I know outside of tech, it’s being used as to describe something as a specific kind of bad. “It’s very AI-like” (“mjög gervigreindarlegt” in Icelandic) has become the talk radio short hand for uninventive, clichéd, and formulaic.
Baldur has pointed that part out before, and noted how it's kneecapping the consumer side of the entire bubble, but I suspect the phrase "AI" will retain that meaning well past the bubble's bursting. "AI slop", or just "slop", will likely also stick around, for those who wish to differentiate gen-AI garbage from more genuine uses of machine learning.
To many, “AI” seems to have become a tech asshole signifier: the “tech asshole” is a person who works in tech, only cares about bullshit tech trends, and doesn’t care about the larger consequences of their work or their industry. Or, even worse, aspires to become a person who gets rich from working in a harmful industry.
For example, my sister helps manage a book store as a day job. They hire a lot of teenagers as summer employees and at least those teens use “he’s a big fan of AI” as a red flag. (Obviously a book store is a biased sample. The ones that seek out a book store summer job are generally going to be good kids.)
I don’t think I’ve experienced a sentiment disconnect this massive in tech before, even during the dot-com bubble.
Part of me suspects that the AI bubble has spread that "tech asshole" stench to the rest of the industry, with some help from the widely-mocked NFT craze and Elon Musk becoming a punching bag par excellence through his very public wrecking of Twitter.
(Fuck, now I'm tempted to try and cook up something for MoreWrite discussing how I expect the bubble to play out...)
BlueMonday1984
Well, it's a perfect demonstration that LLMs flat-out do not think like us. Even a goddamn five-year-old could work this shit out with flying colours.