[–] [email protected] 89 points 9 months ago (5 children)

The tool's creators are seeking to make it so that AI model developers must pay artists to train on uncorrupted data from them.

That's not something a technical solution will work for. We need copyright laws to be updated.

[–] [email protected] 25 points 9 months ago (1 children)
[–] [email protected] 4 points 9 months ago (1 children)

Yeah, that's what I'm saying: our current copyright laws are insufficient to deal with AI art generation.

[–] [email protected] 11 points 9 months ago (1 children)

They aren't insufficient; they're working just fine. In the US, fair use balances the interests of copyright holders with the public’s right to access and use information. There are rights people can maintain over their work, and the rights they do not maintain have always been to the benefit of self-expression and discussion. We shouldn't be trying to make that any worse.

[–] [email protected] 11 points 9 months ago (2 children)

Yep. Copyright should not include "viewing or analyzing the picture" rights. Artists want to start charging you, or software, to even look at the art they literally put out for free. If you don't want your art seen by a person or an AI, then don't publish it.

[–] [email protected] 3 points 9 months ago (1 children)

Copyright should absolutely cover analysis when you're talking about AI, and for one simple reason: companies are profiting off of the work of artists without compensating them. People want the rewards of work without having to do the work. AI has the potential to be incredibly useful for artists and non-artists alike, but these kinds of people are ruining it for everybody.

What artists are asking for is ethical sourcing for AI datasets. We're talking about paying a licensing fee or using free art that's opt-in. Right now, artists have no choice in the matter: their rights to their works are being violated by corporations. Already the music industry has made it illegal to use songs in AI without the artist's permission. You can't just take songs, make your own synthesizer out of them, and then sell it. If you want music for something you're making, you either pay a licensing fee of some kind (like paying for a service) or use free-use songs. That's what artists want.

When an artist, who does art for a living, posts something online, it's an ad for their skills. People want to use AI to take the artist out of the equation. And doing so will result in creativity only being possible for people wealthy enough to pay for it. Much of the art you see online, and almost all the art you see in a museum, was paid for by somebody. Van Gogh died a poor man because people didn't want to buy his art. The Sistine Chapel was commissioned by a Pope. You take the artist out of the equation and what's left? Just AI art made as a derivative of AI art that was made as a derivative of other art.

[–] [email protected] 1 points 9 months ago (1 children)

You should check out this article by Kit Walsh, a senior staff attorney at the EFF. The EFF is a digital rights group that recently won a historic case: border guards now need a warrant to search your phone.

[–] [email protected] 3 points 9 months ago (1 children)

MidJourney is already storing pre-rendered images made from and mimicking around 4,000 artists' work. The derivative works infringement is already happening right out in the open.

[–] [email protected] 3 points 9 months ago (1 children)

Something being derivative doesn't mean it's automatically illegal or improper.

First, copyright law doesn’t prevent you from making factual observations about a work or copying the facts embodied in a work (this is called the “idea/expression distinction”). Rather, copyright forbids you from copying the work’s creative expression in a way that could substitute for the original, and from making “derivative works” when those works copy too much creative expression from the original.

Second, even if a person makes a copy or a derivative work, the use is not infringing if it is a “fair use.” Whether a use is fair depends on a number of factors, including the purpose of the use, the nature of the original work, how much is used, and potential harm to the market for the original work.

Even if a court concludes that a model is a derivative work under copyright law, creating the model is likely a lawful fair use. Fair use protects reverse engineering, indexing for search engines, and other forms of analysis that create new knowledge about works or bodies of works. Here, the fact that the model is used to create new works weighs in favor of fair use as does the fact that the model consists of original analysis of the training images in comparison with one another.

You are expressly allowed to mimic others' works as long as you don't substantially reproduce their work. That's a big part of why art can exist in the first place. You should check out that article I linked.

[–] [email protected] 2 points 9 months ago (1 children)

I actually did read it; that's why I specifically called out MidJourney here, as they're one I have specific problems with. MidJourney is currently caught up in a lawsuit partly because the devs were caught talking about how they launder artists' works through a dataset to then create prompts specifically for reproducing art that appears to be made by a specific artist of your choosing. You enter an artist's name as part of the generating parameters and you get a piece trained on their art. Essentially using an LLM to run an art-tracing scheme while skirting copyright violations.

I wanna make it clear that I'm not on the "AI evilllll!!!1!!" train. My stance is specifically about ethical sourcing for AI datasets. In short, I believe that AI specifically should have an opt-in requirement rather than an opt-out requirement or no choice at all. Essentially Creative Commons-style licensing for works used in datasets, to ensure that artists are duly compensated for their works being used. This would allow artists to license out their portfolios for a fee or make them openly available for use, however they see fit, while still ensuring that they have the ability to protect their job as an artist from stuff like what MidJourney is doing.
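
To make the opt-in idea concrete, here's a purely hypothetical sketch (the record fields and license whitelist are illustrative, not any existing standard or API) of what filtering a dataset down to consented or permissively licensed works could look like:

```python
# Hypothetical opt-in dataset filter; every name here is made up for illustration.
from dataclasses import dataclass

ALLOWED_LICENSES = {"CC0", "CC-BY", "CC-BY-SA"}  # example of licenses treated as permissive


@dataclass
class ArtworkRecord:
    url: str
    artist: str
    license: str            # e.g. "CC-BY" or "all-rights-reserved"
    opted_in: bool = False   # explicit consent flag the artist controls


def ethical_subset(records: list[ArtworkRecord]) -> list[ArtworkRecord]:
    """Keep only works the artist opted in, or that carry a permissive license."""
    return [r for r in records if r.opted_in or r.license in ALLOWED_LICENSES]
```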

[–] [email protected] 2 points 9 months ago (1 children)

> I actually did read it; that’s why I specifically called out MidJourney here, as they’re one I have specific problems with. MidJourney is currently caught up in a lawsuit partly because the devs were caught talking about how they launder artists’ works through a dataset to then create prompts specifically for reproducing art that appears to be made by a specific artist of your choosing. You enter an artist’s name as part of the generating parameters and you get a piece trained on their art. Essentially using an LLM to run an art-tracing scheme while skirting copyright violations.

I'm pretty sure that's all part of the discovery from the same case where Midjourney is named as a defendant along with Stability AI; it isn't its own distinct case. It's also not illegal or improper to do what they are doing. They aren't skirting copyright law; it's a feature explicitly allowed by it so that you can communicate without fear of reprisals. Styles are not something protected by copyright, nor should they be.

> I wanna make it clear that I’m not on the “AI evilllll!!!1!!” train. My stance is specifically about ethical sourcing for AI datasets. In short, I believe that AI specifically should have an opt-in requirement rather than an opt-out requirement or no choice at all. Essentially Creative Commons-style licensing for works used in datasets, to ensure that artists are duly compensated for their works being used. This would allow artists to license out their portfolios for a fee or make them openly available for use, however they see fit, while still ensuring that they have the ability to protect their job as an artist from stuff like what MidJourney is doing.

You can't extract compensation from someone doing their own independent analysis with the aim of making non-infringing novel works, and you don't need licenses or permission to exercise your rights. Singling out AI doesn't make sense because it isn't a special system in that regard. That would be like saying Dolphin's developers have to pay Nintendo every time someone downloads their emulator.

[–] [email protected] 1 points 9 months ago (3 children)

You do realize that you basically just confirmed every fear that artists have over AI, right? That they have no rights or protections to prevent anybody from coming along and using their work to train an LLM to create imitation works more cheaply than they could ever charge for their own, thereby putting them out of business? Because in the end, a professional in any field is nothing more than the sum of the knowledge and experience they've accrued over their career; a "style," as you and MidJourney put it. And so long as somebody isn't basically copy+pasting a piece, it's not violating copyright, because it's not potentially harming the market for the original piece, even if it is potentially harming the market for the creator of said piece.

The Dolphin analogy is also incorrect (though an interesting choice considering they got pulled from the Steam store after the threat of legal action by Nintendo, but I think you and I feel the same way on that issue: Dolphin has done nothing wrong). A better analogy would be if Unreal created an RPGMaker-style tool for generating an entire game of any genre you want in Unreal Engine at the push of a button by averaging a multitude of games across different genres to generate the script. If they didn't get permission to use said games, either by paying a one-time fee, an ongoing fee, or using games that expressly give permission for said use, I'm sure the developers/publishers would be rather unhappy with Unreal. Could it be incredibly beneficial and vastly improve the process of creating games for the industry? Absolutely. If they released it for free, could it be used by anybody and everybody to make imitation Ubisoft games, or any other developer's, and run the risk of strangling the industry with even more trash games with no soul in them? Also absolutely.

And a big AAA publisher has a lot more ability to deal with knock-offs and competition like that than your average starving artist. The indie game scene is the strongest it's ever been thanks to the rise of digital storefronts, but how many great indie game developers go under after producing their first game and never make a second? The vast majority. Because indie games almost never make a profit, meaning they can't afford to make another.

The issue with AI is that it opens a whole can of worms in the form of creating an industrial-scale imitation generator that anybody can use at the push of a button. And the general public has long made known its disdain for properly compensating artists for the work that they do, and has already been gleefully acting like a corporation by using AI to avoid having to hire artists. This runs the risk of creating a chilling effect in the field of creativity and the arts, as your average independent artist can no longer afford to keep doing art, thanks to the wonders of capitalism. There will always be people who do art as a hobby, but professional artists as we think of them today? Why go into debt training at an art school if all your job prospects have been replaced because people generate art for free with some form of LLM instead of hiring artists? I myself never went into art beyond a hobby level, despite wanting to, because of how abysmal the job prospects were even 15 years ago. And I simply cannot afford to do it as much as I'd like (if at all) between work, the time investment, and the expense of it. And that's not even getting into the issues of LLM-generated porn of people, advertisements generated using the voices of dead (and still living) celebrities, scams made using the voices of relatives, and all the other ethical issues.

I used to work at a fish market with a kid who was a trained electrician, set to follow in the footsteps of his dad, who had been one of the highest-paid electricians in the US, except he gave up on it because the thing he liked doing most in the field had been replaced with a machine by the time he graduated from technical school. Obviously the machine is more efficient (and probably safer), but instead of entering the field at all, he ended up working a job he hated and to this day has never found a job he has any passion for. What happens to art when the only professional artists are NEETs, who have minimal living expenses, and those hired by corporations and the wealthy? Are we going to get the fine art market on steroids, with the masses only having access to AI-generated art that will degrade in quality over time as the only new inputs are previous AI-generated pieces (unless there are enough hobby artists to provide sufficient new art), while the wealthy hold a monopoly on human-made art that the rest of us will probably never see?

This is all pure speculation, but it's the Jurassic Park question: "Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should."

[–] [email protected] 2 points 9 months ago* (last edited 9 months ago)

It's sad some people feel that way. That kind of monopoly on expression and ideas would only serve to increase disparities and divisions, manipulate discourse in subtle ways, and in the end, fundamentally alter how we interact with each other for the worse.

What they want would score a huge inadvertent home run for corporations and swing the doors open for them to hinder competition, stifle undesirable speech, and monopolize spaces like nothing we’ve seen before. There are very good reasons we have the rights we have, and there's nothing good that can be said about anyone trying to make them worse.

Also, rest assured they'd collude with each other and only use their new powers to stamp out the little guy. It'll be like American ISPs busting attempts at municipal internet all over again.

[–] [email protected] 7 points 9 months ago (1 children)

Disney lawyers just started salivating

[–] [email protected] 11 points 9 months ago (1 children)

Seems like Disney is as eager to adopt this technology as anyone

A few goofy Steamboat Willie knock-offs pale beside the benefit of axing half your art department every few years, until everything is functionally procedural generation.

[–] [email protected] 9 points 9 months ago (1 children)

They're playing both sides. Who do you think wins when model training becomes prohibitively expensive for regular people? Mega corporations already own datasets, and have the money to buy more. And that's before they make users sign predatory ToS allowing them exclusive access to user data, effectively selling our own data back to us.

Regular people, who could have had access to a competitive, corporate-independent tool for creativity, education, entertainment, and social mobility, would instead be left worse off and with less than where we started.

[–] [email protected] 2 points 9 months ago (1 children)

> Who do you think wins when model training becomes prohibitively expensive for regular people?

We passed that point at inception. It's always been more efficient for Microsoft to do its training at a 10,000-petaflop giga-plant in Iowa than for me to run Stable Diffusion on my home computer.

> Regular people, who could have had access to a competitive, corporate-independent tool for creativity, education, entertainment, and social mobility

Already have that. It's called a $5 art kit from Michael's.

This isn't about creation, it's about trade and propagation of the finished product within the art market. And it's here that things get fucked, because my beautiful watercolor that took me 20 hours to complete isn't going to find a buyer that covers half a week's worth of living expenses, so long as said marketplace is owned and operated by folks who want my labor for free.

AI generation serves to mine the market at near-zero cost and redistribute the finished works for a profit.

Copyright/IP serves to separate the creator of a work from its future generative profits.

But all this ultimately happens within the context of the market itself. The legal and financial mechanics of the system are designed to profit publishers and distributors at the expense of creatives. That's always been true and the latest permutation in how creatives get fucked is merely a variation on a theme.

> instead be left worse off and with less than where we started.

AI Art does this whether or not it's illegal, because it exists to undercut human creators of content by threatening them with an inferior-but-vastly-cheaper alternative.

The dynamic you're describing has nothing to do with AI's legality and everything to do with Disney's ability to operate as monopsony buyer of bulk artistic product. The only way around this is to break Disney up as a singular mass-buyer of artwork, and turn the component parts of the business over to the artists (and other employees of the firm) as an enterprise that answers to and profits the people generating the valuable media rather than some cartel of third-party shareholders.

[–] [email protected] 5 points 9 months ago

> We passed that point at inception. It's always been more efficient for Microsoft to do its training at a 10,000-petaflop giga-plant in Iowa than for me to run Stable Diffusion on my home computer.

You don't need industrial-level efficiency or insane overhead costs; that's why it's a big deal. It's something regular people can do at home.
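
For a sense of what "doing it at home" looks like in practice, here's a minimal sketch using the Hugging Face diffusers library (the checkpoint ID and prompt are just illustrative; a consumer GPU with a few GB of VRAM is enough):

```python
# Minimal local text-to-image sketch with Hugging Face diffusers.
# The checkpoint and prompt below are examples, not recommendations.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example checkpoint; swap in any compatible model
    torch_dtype=torch.float16,         # half precision keeps VRAM use modest
)
pipe = pipe.to("cuda")  # on a CPU-only machine, drop the float16 option and expect it to be slow

image = pipe("a watercolor of a lighthouse at dusk").images[0]
image.save("lighthouse.png")
```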

> Already have that. It’s called a $5 art kit from Michael’s.

An art set from Michaels can only do so much. Having access to the most cutting-edge tools and techniques has always propelled artists and art forward. Imagine not having access to digital art tools, computer animation, digital photography, digital sculpting, or interactive media tools that expand artistic expression and allow for the creation of new forms, styles, and genres of art that weren't possible before.

> Copyright/IP serves to separate the creator of a work from its future generative profits.
>
> But all this ultimately happens within the context of the market itself. The legal and financial mechanics of the system are designed to profit publishers and distributors at the expense of creatives. That’s always been true and the latest permutation in how creatives get fucked is merely a variation on a theme.

Fighting their fight for them won't help in the end; don't make it easier for them.

> AI Art does this whether or not it's illegal, because it exists to undercut human creators of content by threatening them with an inferior-but-vastly-cheaper alternative.

It isn't necessarily a competitor or a threat; the tools are open source and free for all artists to use to enhance their creative process, explore new possibilities, and imagine novel outcomes. You can use them to help you reach new audiences and discover new forms of expression. It's not a zero-sum game like you suggest.

> The dynamic you’re describing has nothing to do with AI’s legality and everything to do with Disney’s ability to operate as monopsony buyer of bulk artistic product. The only way around this is to break Disney up as a singular mass-buyer of artwork, and turn the component parts of the business over to the artists (and other employees of the firm) as an enterprise that answers to and profits the people generating the valuable media rather than some cartel of third-party shareholders.

That would still leave the baby Disneys with way more money than your average Joe, solving nothing. Training models isn't so expensive that they wouldn't have enough money to train their own; that cost is only prohibitive for the working man.

[–] [email protected] 7 points 9 months ago (3 children)

The issue is simply reproduction of original works.

Plenty of people mimic the style of other artists. They do this by studying the style of the artist they intend to mimic. Why is it different when a machine does the same thing?

[–] [email protected] 3 points 9 months ago

No, the issue is commercial use of copyrighted material as data to train the models.

[–] [email protected] 0 points 9 months ago

It's different because a machine can be replicated and can produce results at a rate that hundreds of humans can't match. If a human wants to replicate your art style, they have to invest a lot of time into learning art and practicing your style. A machine doesn't have to do these things.

This would be fine if we weren't living in a capitalist society, but since we do, this will only result in further transfer of assets towards the rich.

[–] [email protected] 0 points 9 months ago (2 children)

It's not. People are just afraid of being replaced, especially when they weren't that original or creative in the first place.

[–] [email protected] 3 points 9 months ago

Honestly, it extends beyond creative works.

OpenAI should not be held back from subscribing to a research publication, or buying college textbooks, etc. As long as the original works are not reproduced and the underlying concepts are applied, there are no intellectual property issues. You can't even say the commercial application of the text is the issue, because I can go to school and use my knowledge to start a company.

I understand that in some select scenarios, ChatGPT has been tricked into outputting training data. Seems to me they should focus on fixing that, as it would avoid IP issues moving forward.

[–] [email protected] 0 points 9 months ago* (last edited 9 months ago)

AI image creation tools are apparently both artistically empty (incapable of creating anything artistically interesting) and an existential threat to visual artists. Hmm, wonder what this says about the artistic merits of the work of furry porn commission artist #7302.

Retail workers can be replaced with self-checkout, translators can be replaced with machine translation, auto workers can be replaced with robotic arms, specialist machinists can be replaced with CNC mills. But illustrators must be where we draw the line.

[–] [email protected] 0 points 9 months ago (3 children)

copyright laws need to be abolished

[–] [email protected] 26 points 9 months ago (19 children)

That would make it harder for creative people to produce things and make money from them. Abolishing copyright isn't the answer. We still need a system like that.

A shorter copyright period would encourage more new content, as creative industries could no longer rely on old, outdated work.

[–] [email protected] 14 points 9 months ago (11 children)

That would be an update; I'm not sure it would be a good thing. As an artist, I want to be able to tell where my work is used and where it isn't. It would suck to find something of mine used in fascist propaganda or something.

[–] [email protected] 3 points 9 months ago* (last edited 9 months ago) (1 children)

Truly a "Which Way White Man" moment.

I'm old enough to remember people swearing left, right, and center that aggressive enforcement of copyright and IP law against social media content has helped corner the market and destroy careers. I'm also well aware of how often images from DeviantArt and other public art venues have been scalped and misappropriated even outside the scope of modern generative AI, and how production houses have outsourced talent to digital sweatshops in the Pacific Rim, Sub-Saharan Africa, and Latin America, where you can pay pennies for professional reprints and adaptations.

It seems like the problem is bigger than just "Does AI art exist?" and "Can copyright laws be changed?", because the real root of the problem is the exploitation of artists in general. When exploitation generates an enormous profit motive, what are artists to do?

[–] [email protected] 1 points 9 months ago

What is a "which way white man" moment?

[–] [email protected] -1 points 9 months ago* (last edited 9 months ago)

They dutifully note that; this is the next best thing.