this post was submitted on 24 Mar 2024
163 points (100.0% liked)
chapotraphouse
It's an extremely powerful piece of productive capital that runs locally inside a fairly cheap piece of capital that most people can acquire. We can say "so powerful a device should never have been created!" but it was and it now exists as a piece of productive capital no less disruptive than countless other machines.
We aren't going to be rid of it now that it exists, so the only move that remains is to take hold of it, learn to use it, and exploit it - any artist picking it up immediately has an advantage over every gormless techbro dipshit that's just churning out nonsense without looking at it. Like every bit of aesthetic taste and ability to draw and edit images massively improves what one can do with a machine that cleans up sketchy linework and handles shading in seconds, while the vast majority of people using it are just hitting "generate" as a treat button and barking and clapping at the gibberish it spews out.
Like if you look at what techbros are doing, they love the dogshit pixar-style "I made a machine to generate shitty 3d blob art because I can't even be bothered to use generic assets in blender and do the most basic and braindead work ever" shit, or the "photorealism, but with oily brushstrokes and nightmare fuel JPEG error looking shit" style, which look awful and are almost impossible to fix up, but the AI is actually fairly competent at traditional art styles which are also trivial to clean up and edit since they're (comparatively) low-detail and abstract.
Seriously, if one looks at the people interacting with AI art right now most are just babbling at magic prompt machines someone else runs, then of the people involved enough to run it locally most are using simple prompt UIs, while the most complex thing anyone uses is comfyui, a braindead basic flowchart interface that's absurdly simple and easy to use, and most of the community cries about how it's too complicated and hard to use. Techbros are all talentless dipshits and anyone with a brain and art skills could take their toys and eat their lunch.
Yeah, under a socialist system this wouldn't even be a question, this would be a uniformly wonderful tool that would enable a level and scale of arts production never before imagined, provided some selection was put into place as to what could actually get published. Like I've been poking at setting up a rotoscoping pipeline to see if it's at all possible to bridge quick and janky CGI and traditional cel animation with basic hand rotoscoping and an AI cleanup and detailing pass followed by interpolation over multiple frames with ebsynth or something, but I need a new stylus because the battery in mine died and it isn't one where that's replaceable.
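The pipeline described above can be sketched structurally. This is a minimal toy sketch, not a working implementation: every function name here is hypothetical, and the bodies are placeholders standing in for a real CGI render, hand rotoscoping, an img2img cleanup pass, and EbSynth-style keyframe propagation.

```python
# Hypothetical sketch of the rotoscoping pipeline described above.
# All names and bodies are placeholders; a real version would shell out
# to a renderer, an image model, and a tool like ebsynth.

def render_cgi_frames(n: int) -> list[str]:
    """Stand-in for a quick-and-janky CGI render of n frames."""
    return [f"cgi_frame_{i}" for i in range(n)]

def hand_rotoscope(frame: str) -> str:
    """Stand-in for hand-tracing clean linework over a rendered frame."""
    return frame + "+roto"

def ai_cleanup(frame: str) -> str:
    """Stand-in for an AI cleanup/detailing pass on the rotoscoped frame."""
    return frame + "+cleaned"

def propagate_style(keyframes: dict[int, str], frames: list[str]) -> list[str]:
    """Stand-in for ebsynth-style interpolation: each in-between frame
    borrows the look of the nearest fully-worked keyframe."""
    out = []
    for i, frame in enumerate(frames):
        nearest = min(keyframes, key=lambda k: abs(k - i))
        out.append(f"{frame}<styled-like-{nearest}>")
    return out

def pipeline(n_frames: int, key_every: int) -> list[str]:
    frames = render_cgi_frames(n_frames)
    # Only every key_every-th frame gets the expensive manual + AI pass;
    # the rest are filled in by propagation.
    keyframes = {i: ai_cleanup(hand_rotoscope(frames[i]))
                 for i in range(0, n_frames, key_every)}
    return propagate_style(keyframes, frames)
```

The point of the structure is the cost split: manual labor and AI cleanup only touch the keyframes, and propagation covers the in-betweens.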
This is inaccurate: it's much, much fuzzier than that, and is more about picking up and recombining concepts and aesthetics - it's weird and repetitive, but tends to be repetitive in the same way artists can be when they work out an approach to a pose and then just keep doing slight variations on it even when that doesn't make sense (like old comics did this a lot) or when they're sticking too close to a reference image. Toss in controlnet and guide it more and it breaks away from that.
I think the clearest refutation of the property angle is to look at two things: who has the power to claim ownership over the training data (hosting sites, major corporations, and social media sites) and whether or not the training data only being properly licensed ahead of time would make a difference in the harm the technology causes. Like who profits if we end up saying "AI trainers must pay royalties to the proper institutions"? Reddit, imgur, meta, deviantart, tumblr, etc, all of whom claim ownership over their users' posts and are already selling that access, because as far as they're concerned it's not the artists being infringed upon but a misuse of their hosting services. Similarly, if Disney or the like came out with an AI trained on its own private library of works and began replacing animators with it and renting it out to selected studios would that make it ethical? Of course not: it is unethical because of who uses it (techbros and corporations) and its consequences (devaluing skilled labor), not because it violates property rights.
Yep, and it's only going to get worse. Some solution to the AI art spam on social media will have to be found, but even worse is what the use of generative AI in professional environments is going to do. Animators are already overworked and underpaid, and that's only going to get worse when these tools get integrated into their workflows and one worker ends up expected to do the work of what now would be an entire team.
That's why I'm focusing on what this is: an extremely powerful and destructive piece of capital that already exists. We can't stop it from existing or stop capitalists from making things worse with it, all we can do is seize upon it and find ways to use it ourselves - that is, try to predict how it's going to be put to work professionally and use it to enable and empower smaller independent teams of artists to do with consumer-grade hardware what would previously have required a full studio with many millions of dollars worth of invested capital to accomplish.
In practice it's gonna be like a bigger version of what happened with the advent of easily accessible 3d rendering tools: that shit was truly awful and it infested everything, but gradually the low-grade stuff has become mostly filtered out and some professionals have emerged who actually use the medium well. Since there's no putting it back in the bag, the only thing left is to try to exploit it and springboard off it to new heights however possible.
A little tangentially, I've ironically found that it can be a good learning and practice tool in one particular way: spotting and fixing its mistakes. Like I originally learned how to edit images well about a decade ago when I went through and digitized and cleaned up hundreds of my grandfather's old slides, and trying to clean up an AI's mistakes has a similar feel to trying to clean up lines and mildew spots on an old photo. You have to think about why it's wrong and what technique you can do to fix it without making it more wrong, basically.
Yep. Everything capitalists will use it for is bad, and the response I've settled on is to encourage leftists and artists to try to take that ball and run with it, to exploit it the same way capitalists will but for the sake of independent works or agitprop instead. The only positive I see to it is that it's like if past skilled tradesmen being made obsolete by new tech could just summon up that capital for themselves, like if a simple hammer could have been made to also be an industrial press just by telling it how to be that, because that's what we can do with open-source AI so to speak - that wouldn't make the factories less bad, but it would have meant the factory owners lacked a monopoly on industrial capital.
dear fucking god please stop upholding capitalist ideas about IP rights.
Hot take: Artists should be able to not have their life's work automatically fed into the plagiarism machine without their compensation or consent. Like I'm not going to pretend that Mickey Mouse being copyrighted for a century is a normal thing, but people having their labor exploited for the profit of the wealthy is kinda the thing we're supposed to be against, no?
does art belong to everyone or not?
i think you shouldn't get to opt out of the remix machine but the corporations shouldn't ~~be able to exploit it for profit~~ exist. Having your work not become part of the commons is the same shit as a century of copyright. Anything we do about these generative models that allows corpos to continue to use them is a bandaid at best.
Well I disagree. You should have a fundamental right to opt out of these things. Even in a perfect world where everything is just and every artist can support themselves, I see no reason it shouldn't require the creator's consent. Surely, with no financial pressures to corrupt things, many creatives would willingly contribute to these models, and we wouldn't need to resort to this ugly, non-consensual scraping.
so i should be able to prevent my shit from entering the public domain? how is that different than the mouse?
I just think, fundamentally, there should be some level of control the artist has over these things. You asked me earlier if art should "belong to everyone", and I guess I don't think it should, at least not fully or without restriction. I'm not against stuff like fanart and fanfiction and things like that, not in the slightest, but the idea of having my work taken in that way, mechanistically, even in a non-artistic context, like the conversation we're having right now, feels so thoroughly violating that I just can't support it. It feels like in the minds of a lot of people, the only option an artist should have to avoid these things, to avoid being scraped, is to seclude themselves, or at least their work, and to completely shut people off from experiencing it. I don't want that, but I don't want to be scraped either. Is it so strange? Am I really the weird one for wanting a middle ground, where the humans are allowed to see me and the AI isn't?
i think that feeling is probably rooted in capitalism and precarity? whatever fan works you're imagining and fan works "with an advanced computer" are the same.
i do think we should have some protection against e.g. political candidates we don't endorse using our art, or corporations profiting from our work, but something automatic like how covers work in music seems pretty sane.
rare copyright law w.
if somebody wants to make art and not actually share it for metaphysical reasons i really don't respect that and don't think shit like city or asinine stunts should be validated, but that's a huge tangent.
Again, I'm not against any kind of voluntary arrangement, but the first part of this comment, the first two sentences, just doesn't feel right to me. I'm writing an effortpost as we speak, maybe I'll put that up later. Still gotta organize my thoughts on that.
This doesn't sound far off from a Marxist understanding of it, honestly. It's built off the uncompensated labor of millions of artists. Once in its mature form I'm sure it could be fairly accurately modeled as an enclosure of the commons at the very least.
???
how is it an enclosure? the "original" works are all exactly where they were when they were scraped. (and the "original" of a digital thing is a real fucken weird concept too, the first time that image existed was in the computer's RAM, or maybe the pixels in a particular state on the artist's monitor, the one that you see on the internet is like 5 generations of copy already)
like fuck these companies and the Peter Thiel types behind them to death, but Superman III-ing fractions of pennies from the take-a-penny tray isn't theft just because you do it a billion times, and copying isn't theft at all.
I think we're talking past each other. Your argument applies if we're talking about a liberal concept of ownership, or maybe judging the morality of ai, but that's unrelated to a material analysis of it. Generative technology requires massive datasets, which are the result of millions of hours of labor. This isn't a moral claim at all, it's simply trying to describe the mechanism at play.
Enclosure in the digital space isn't an exact parallel to the enclosure acts in early modern England. I usually see it applied to the Open Source ecosystem: the products of volunteer labor are enclosed or harvested or adopted by private corporations. Google and Microsoft and Apple all built empires on this mechanism.
I mentioned a "mature" stage because I think the next step is more forceful enclosure and hoarding of datasets. The usability of the internet is quickly decreasing in lockstep with the development of AI, a dialectical evolution. It's eating away at the foundation upon which it builds itself.