this post was submitted on 18 Feb 2024
-152 points (21.9% liked)
Comic Strips
You’re purposefully downplaying and oversimplifying what AI models do. I’m not going to keep arguing with someone who won’t debate fairly.
Learning models don’t fucking collage shit. That’s not how the tech works.
I’m not going to debate this shit with someone whose bad-faith argumentation is as blatant as yours. Goodbye.
If anyone else wants to actually discuss or learn more about the tech in a civil way, lmk.
I know perfectly well how the tech works. It's given a bunch of images and randomly rolls dice to generate weights until it can generate things that approximate its training data, then continues in that general direction using a hill climbing algorithm to approximate that data as closely as possible. Every output of a generative neural network is a combination of random noise and a pattern of pixels that appeared in its training data (possibly across several input images, but that appeared nonetheless). You cannot get something out that did not, at some point, go in. Legally speaking, that makes them a collage tool.
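The "hill climbing" the comment describes can be shown with a toy sketch: randomly perturb a candidate, keep the change only if it moves closer to the target data. (This is a minimal illustration of hill climbing as characterized above, not how real neural networks are actually trained; the function names and parameters are hypothetical.)

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def hill_climb(target, steps=5000, step_size=0.1):
    """Toy hill climbing: perturb one coordinate at random and keep
    the perturbation only when it reduces the distance to the target."""
    candidate = [random.uniform(-1, 1) for _ in target]

    def loss(cand):
        # Sum of squared differences from the target ("training data").
        return sum((c - t) ** 2 for c, t in zip(cand, target))

    best = loss(candidate)
    for _ in range(steps):
        # Random "dice roll": nudge one coordinate.
        i = random.randrange(len(candidate))
        old = candidate[i]
        candidate[i] = old + random.uniform(-step_size, step_size)
        new = loss(candidate)
        if new < best:
            best = new          # keep the improvement
        else:
            candidate[i] = old  # revert the change
    return candidate, best

target = [0.3, -0.7, 0.5]
approx, final_loss = hill_climb(target)
# final_loss shrinks toward 0 as the candidate approximates the target
```

The key property the comment leans on is visible here: the process can only move toward whatever target data it is given, never toward anything outside it.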
I ask again: do you have an actual argument, or are you going to keep answering mine with appeals to ignorance?