news
Welcome to c/news! Please read the Hexbear Code of Conduct and remember... we're all comrades here.
Rules:
-- PLEASE KEEP POST TITLES INFORMATIVE --
-- Overly editorialized titles, particularly if they link to opinion pieces, may get your post removed. --
-- All posts must include a link to their source. Screenshots are fine IF you include the link in the post body. --
-- If you are citing a Twitter post as news, please include not just the twitter.com link but also a nitter.net link (or another Nitter instance). There is also a Firefox extension that can redirect Twitter links to a Nitter instance: https://addons.mozilla.org/en-US/firefox/addon/libredirect/ or archive them as you would any other reactionary source, e.g. with https://archive.today . Twitter screenshots still need to be sourced or they will be removed --
-- Mass tagging comm moderators across multiple posts like a broken markov chain bot will result in a comm ban--
-- Repeated consecutive posting of reactionary sources, fake news, misleading / outdated news, false alarms over ghoul deaths, and/or shitposts will result in a comm ban.--
-- Posts or comments dealing with disturbing content that neglect to use content warnings or NSFW tags will be removed until they are brought into compliance. Users who are repeatedly reported for failing to use content warnings or NSFW tags when commenting on or posting disturbing content will be banned. --
-- Using April 1st as an excuse to post fake headlines, like the resurrection of Kissinger while he is still fortunately dead, will result in the poster being thrown in the gamer gulag and sentenced to play and beat trashy mobile games like 'Raid: Shadow Legends' in order to be rehabilitated back into general society. --
Nah, AI is cool and at some point it will be good. There will soon come a time when any Chinese netizen could make their own Marvel movie, and with a legal framework like this Hollywood will have no recourse.
Akshully any new technology that takes work away from artists is automatically bad. That's why nobody on Hexbear uses a camera and commissions portraits from local artists out of principle instead.
Yeah, using a camera is exactly the same as buzzword “AI” that’s scraped millions of images of art for data without the consent of the artist, to better replicate that human made art without having to pay a human, just to maximize profit. Exactly the same. Fuck off
It's pretty wild to me that buzzword AI has made so many leftists do a 180-degree pivot and become huge fans of copyrights and intellectual property. Good thing that none of us pirate software or anime or movies regularly, or else this sudden love for the consent of the artist would be really fucking nonsensical.
Piracy is not the same as IP fraud
IP should not exist. Period. Fuck the very concept of owning an idea.
Classes "should not exist," and yet the answer is not to simply abolish them but to use them for our purposes and reach a point where they no longer exist
I am for the regulation of AI, but I hate the "AI is stealing art" lie. What AI does is no different than a human looking at how other people draw to learn to draw like them. Nothing is being stolen.
The biggest difference is, when a human learns to draw, the new drawings they create were made by a human artist and express their human experience and perspective and emotions and ideas. There's an intelligent creator behind the new art that is being made.
These so-called "AI" have no thoughts. They have no ideas or perspectives. There's no more originality here than a funhouse mirror.
It's incredibly different, because humans can have experiences outside of the art they view and that becomes part of the art they make.
hmm so if the ai was trained on various e.g. stock photos in addition to people's art would u change your opinion
No? Stock photos are technically just other people's art? The point is that the "AI" we're currently talking about is INCAPABLE of anything other than reassembling other people's art.
If it could have its own experiences, it would be an entirely different thing and it would be unethical to exploit their labor. Current AI is just really efficient copying that covers its own tracks by copying A LOT at once. That's just what this technology is.
Typing in a prompt to "create art" with these is tantamount to image searching on google and claiming all the images are yours because you came up with the search term.
and I think you might be stretching the definition of copying here at least a bit. They're not copying pixels, they're identifying common features in images and encoding those into the network's internal relationships, and not only the features themselves but also how they relate to each other, etc.
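To put some rough numbers on "features, not pixel copies" (a toy sketch in PyTorch, nothing like Stable Diffusion's actual architecture; the sizes and names are made up for illustration): a model like this ends up with far fewer numbers in its weights than there are pixels in its training set, so it literally can't be storing pixel copies, only shared, compressed features.

```python
# Toy illustration only: a tiny autoencoder (NOT Stable Diffusion).
# The point is that training squeezes many images into shared weights
# ("features" and how they combine), not into stored pixel copies.
import torch
import torch.nn as nn

class TinyAutoencoder(nn.Module):
    def __init__(self, latent_dim=32):
        super().__init__()
        # Encoder: 28x28 grayscale image -> 32-number latent code
        self.encoder = nn.Sequential(
            nn.Flatten(),
            nn.Linear(28 * 28, 128), nn.ReLU(),
            nn.Linear(128, latent_dim),
        )
        # Decoder: latent code -> reconstructed 28x28 image
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(),
            nn.Linear(128, 28 * 28), nn.Sigmoid(),
        )

    def forward(self, x):
        z = self.encoder(x)                        # compressed features, not pixels
        return self.decoder(z).view(-1, 1, 28, 28)

model = TinyAutoencoder()
n_params = sum(p.numel() for p in model.parameters())
# ~210k parameters vs. ~47 million pixel values in a 60,000-image,
# 28x28 dataset: there is simply no room to memorize the inputs.
print(f"trainable parameters: {n_params}")
```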
also, point of order/etiquette: is it rude to respond with two comments to two different points?
A little, but we do it all the time
huh, what level of indirection would it require for photographs to not be art anymore? Would like, random street webcams do it?
I'm not sure I understand the question or how the scenario is comparable. A more apt comparison would be someone that goes around taking pictures of other people's art and starts claiming it as their own. You're free to take pictures of it, sure, but if you want to claim it as your own creation, you've crossed a boundary that I'm not willing to cross with you. That's how I see "AI" art.
i'm pretty sure you could in fact take pictures of paintings, with some connecting theme or context, & redisplay those photos as new art. the line between 'new art' and 'stolen art' is pretty difficult to define
Yeah we already had this particular debate 100 years ago tbh. there may have been a urinal involved
It's such a counter-productive property brained take too. Like no matter which way it swings it's a lose-lose: either the AI owner gets to functionally enclose (but in a non-exclusive way) the sum of available human art and profit off of an endless stream of low-grade procgen nonsense mimicking it, or they have to build their own private stables of training art and then they get to own and profit off their endless low-grade slop generator and it just takes a little longer and costs them a bit more.
Chasing the training data IP angle is just playing right into their hands, when what should be pushed for is to make generative AI a copyright poison pill that not only is inherently and immediately public domain itself, but also applies that to the entire work it's featured in and any licenses alongside it. Disney used a deepfake somewhere in a Star Wars movie? Boom, Star Wars in its entirety becomes public domain as punishment, as do any trademarks they stuck anywhere in the film like their fucking Mickey Mouse logo. Just straight up making using it at all completely untenable regardless of the ownership of the training data. Not because this is a logical way to set it up, but because taking a complete scorched earth approach to AI generated slop is the only acceptable solution under the capitalist system: let it be a fun toy for the average person to fuck around with, and a deadly poison to any corporate commodification.
Hell, apply that to the algorithm itself: any software providing generative AI becomes public domain, as do any patents that software uses as well. Just go fucking nuclear on the whole thing entirely.
Good thing no artist ever uses other art as inspiration to create their work... oh wait. Just like a human artist, AI looks at a shit ton of reference material that influences the thing it makes. AI might do it faster, but people do the same thing, and AI can only scrape digitized art whereas humans have the whole width and breadth of life and multiple senses they can use to influence their art.
If an AI can make art that can compete against you, then you shouldn't be a professional artist, just like if you can't make a better cup of coffee than a vending machine you shouldn't be a barista.
I would say the issue is a bit more Complex(TM).
In one case, if I use a generally available program like Stable Diffusion with a generally available model and generate images with it, should I be able to copyright the product because of the combination of prompts and configuration that I used? I am not certain on this, but I am leaning towards no.
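For reference, here is roughly what that "combination of prompts and configuration" looks like in practice. This is a minimal sketch using the open-source diffusers library with a publicly released Stable Diffusion checkpoint; the model name, prompt, and settings are just placeholder examples.

```python
# Minimal sketch of a "prompt + configuration" generation with diffusers.
# Model name, prompt, and settings are placeholder examples.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # example checkpoint name
    torch_dtype=torch.float16,
).to("cuda")

# The entirety of the human "creative input" is this handful of
# strings and numbers.
image = pipe(
    prompt="an oil painting of a lighthouse in a storm",
    negative_prompt="blurry, low quality",
    num_inference_steps=30,
    guidance_scale=7.5,
    generator=torch.Generator("cuda").manual_seed(42),
).images[0]

image.save("lighthouse.png")
```

That's part of why I lean towards no: the part I actually authored is one sentence and a few numbers, and everything else comes from the model.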
On the other hand, if an organization uses a proprietary algorithm, trains a model on it using stolen art, etc., then they would have more of a case for copyright in my opinion. My gut says "no" for a variety of reasons, but there is more of a case.
Tldr: abolish copyright
Try to make something with AI. You can get Stable Diffusion to make something neat easily. It is a toy. It takes significant work to make it do what you want. For any reasonably complex, specific task, it is probably easier to get the art you want commissioned than to make it with AI. Maybe future AIs will be more user friendly. Look at Corridor Digital's AI projects. They still take a professional team days of work to make something of middling quality.
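To give a sense of what "significant work" means in practice, this is the kind of thing you end up doing: brute-force sweeps over seeds and settings just to fish for one usable image. Same caveats as the sketch above: this uses the diffusers library, and the checkpoint name, prompt, and values are placeholders, not a recipe.

```python
# Sketch of the tedious part: sweeping seeds and guidance scales to
# fish for one usable image. Checkpoint, prompt, and values are placeholders.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

for seed in range(8):                      # try several random seeds
    for guidance in (5.0, 7.5, 12.0):      # and several guidance strengths
        image = pipe(
            prompt="an oil painting of a lighthouse in a storm",
            negative_prompt="blurry, low quality, extra limbs",
            num_inference_steps=30,
            guidance_scale=guidance,
            generator=torch.Generator("cuda").manual_seed(seed),
        ).images[0]
        image.save(f"candidate_seed{seed}_cfg{guidance}.png")

# ...and then you sift through 24 candidates by hand, tweak the prompt,
# and repeat until something is close enough to what you wanted.
```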