Technology
This is a most excellent place for technology news and articles.
Our Rules
- Follow the lemmy.world rules.
- Only tech related content.
- Be excellent to each other!
- Mod approved content bots can post up to 10 articles per day.
- Threads asking for personal tech support may be deleted.
- Politics threads may be removed.
- No memes allowed as posts, OK to post as comments.
- Only approved bots from the list below are allowed; to ask if your bot can be added, please contact us.
- Check for duplicates before posting; duplicates may be removed.
Approved Bots
Poor man's voice was stolen, and now, while he cannot use it anymore, you make mean jokes :(
When I read "striking actor Stephen Fry," my brain responded, "Why yes, he is! Rather!"
Am I a bad person?
Found the Hugh Laurie!
Oh man I didn't realize that WASN'T what it meant until I read your comment...
I think it's important to remember how this used to happen.
AT&T paid voice actors to record phoneme groups in the 90s/2000s and has been using those recordings to train voice models for decades now. There are about a dozen AT&T voices we're all super familiar with because they're on all those IVR/PBX replacement systems we talk to instead of humans now.
The AT&T voice actors were paid for their time and not offered royalties, but they were told that their voices would be used to generate synthetic computer voices.
This was a consensual exchange of work. It's not super great long term, as there are no royalties or anything and it's really just a "work for hire" that turns into a product... but that aside, the people involved all agreed to what they were doing and what their work would be used for.
The problem at the root of all the generative tools is ultimately one of consent. We don't permit the arbitrary copying of things that are perceived to be owned by people, nor do we think it's appropriate to use someone's image, likeness, voice, or written works without their consent.
Artists tell politicians to stop using their music all the time, etc. But ultimately, until we get a ruling on what constitutes "derivative" works, nothing will happen. An AI is effectively a derivative work of all the content that makes up the vectors that represent it, so it seems a no-brainer, but because it's "radio on the internet," we're not supposed to be mad at Napster for building its whole business on breaking the law.
I don't think permission and consent alone can be relied on in a labor relationship, because of the unbalanced position of power between employees and employers. Could the workers really negotiate better working conditions? They really can't, not without a union anyway.
I think a more interesting (and less dubious) example of this would be Vocaloid and, to a greater extent, CeVIO AI.
Vocaloid is a synth bank where, instead of the notes being musical instruments, they're phonemes which have been recorded and then packaged into a product you pay for, which means royalties are involved (I think there might also be a thing with royalties for big performances and whatnot?). CeVIO AI takes this a step further by using AI to better smooth together the phonemes and make pitching sound more natural (or not - it's an instrument; you can break it in interesting ways if you try hard enough). And obviously, they consented to that specific thing and get paid for it. They gave Yamaha/Sony/the general public a specific character voice and permission to use that specific voice.
(There are FOSS voicebanks too, but that adds a different layer of complication; I think a lot of them were recorded before the idea of an "AI bank" was even a possibility. And while a paid voice bank is a proprietary thing, the open-source alternatives are literally just a big file of .WAVs, so it's much easier to take them outside their intended purposes.)
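To make the "synth bank of phonemes" idea concrete, here's a minimal sketch of the old-school concatenative approach: glue pre-recorded phoneme clips end to end into one output file. The clip file names and the word being "spoken" are hypothetical, just for illustration; a real voicebank also ships pitch, timing, and blending data that this toy example ignores entirely.

```python
# Minimal concatenative-synthesis sketch: stitch phoneme .WAV clips together.
# Assumes all clips share the same sample rate, channel count, and sample width.
import wave

# Hypothetical phoneme recordings spelling out "hello"
PHONEME_CLIPS = ["hh.wav", "eh.wav", "l.wav", "ow.wav"]

def concatenate_phonemes(clip_paths, out_path="hello.wav"):
    """Glue phoneme recordings end to end into a single WAV file."""
    frames = []
    params = None
    for path in clip_paths:
        with wave.open(path, "rb") as clip:
            if params is None:
                params = clip.getparams()  # sample rate, channels, sample width
            frames.append(clip.readframes(clip.getnframes()))
    with wave.open(out_path, "wb") as out:
        out.setparams(params)
        for chunk in frames:
            out.writeframes(chunk)

if __name__ == "__main__":
    concatenate_phonemes(PHONEME_CLIPS)
```

The raw output of something like this sounds choppy, which is exactly the gap the AI-based smoothing in CeVIO AI is meant to close.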
Studios basically want to own the personas of their actors so they can decouple the actual human from it and just use their images. There have been a lot of weird issues with this already in video games with body capture and voice acting: contracts aren't read through properly, the wording is vague, and not all agents know about this stuff yet. It's very dystopian to think your whole appearance and persona can be taken from you and commodified. I remember when Tupac's hologram performed at Coachella in 2012 and thinking how fucked up that was. You have these huge studios and event promoters appropriating his image to make money, and an audience effectively watching a performance of technological necromancy where a dead person is re-animated.
Did Tupac's estate agree? Or receive compensation?
Who cares if his estate agreed to it? HE didn't. His estate shouldn't have the right to make money off of things he never actually did.
Let the dead stay dead, it's just an excuse to not pay new, living artists.
It's just the beginning, for sure. This future will be the end of artists, and still everyone will be clapping for AI productions like fools.
"it wasn't me planning the terrorist attack over the phone, it was someone stealing my voice with an AI"
This is, unfortunately, the world we are about to be in.
This is the best summary I could come up with:
Among those warning about the technology’s potential to cause harm is British actor and author Stephen Fry, who told an audience at the CogX Festival in London on Thursday about his personal experience of having his identity digitally cloned without his permission.
Speaking at a news conference as the strike was announced, union president Fran Drescher said AI “poses an existential threat” to creative industries, and said actors needed protection from having “their identity and talent exploited without consent and pay.”
As AI technology has advanced, doctored footage of celebrities and world leaders—known as deepfakes—has been circulating with increasing frequency, prompting warnings from experts about artificial intelligence risks.
At a U.K. rally held in support of the SAG-AFTRA strike over the summer, Emmy-winning Succession star Brian Cox shared an anecdote about a friend in the industry who had been told “in no uncertain terms” that a studio would keep his image and do what they liked with it.
Oscar winner Matthew McConaughey told Salesforce CEO Marc Benioff during a panel event at this year’s Dreamforce conference that he had concerns about the rise of AI in Hollywood.
A spokesperson for the Alliance of Motion Picture and Television Producers (AMPTP), the entertainment industry’s official collective bargaining representative, was not available for comment when contacted by Fortune.
The original article contains 911 words, the summary contains 213 words. Saved 77%. I'm a bot and I'm open source!
His voice wasn't stolen, it's still right where he left it.
Fair enough. It's not theft, it's something else.
But that's just semantics, though.
The point is that his voice is being used without his permission, and that companies, profiteering people, and scammers will do so using his voice and the voices of others. He likely wants some kind of law against this kind of stuff.
If you made a painting for me, and then I started making copies of it without your permission and selling them off, while I might not have stolen the physical painting, I have stolen your art.
Just because they didn't rip his larynx out of his throat doesn't mean you can't steal someone's voice.
We're getting into semantics, but it's counterfeit, not stolen.
It would be more like if you made a painting for me, and I then used it to replicate your artistic style, made new paintings without your permission, and passed them off as your work.