Wouldn't whoever is running the bot just be able to do a quick .replace() before feeding the raw data into it? What would be funnier is to get the bot to generate the string itself and then stop, without it being obviously present in the input. :)
Just got here to post this link, I'm too slow!
Thanks for the replies, I guess the "good" was vague on purpose, to see how people interpret it...
This popped up in one of my feeds today and I saved it; can't remember from where. It's relevant to the above, so sharing here: https://oneproject.org/ai-commons/ (AI Commons: nourishing alternatives to Big Tech monoculture).
They talk about AI for good; at some point they mention how the term is sometimes used just for marketing.
Nuclear security? Should be fine, as long as it doesn't involve counting letters in any words.
Knowledge distillation is training a smaller model to mimic the outputs of a larger model. You don't need to use the same training set that was used to train the larger model (the whole internet or whatever they used for ChatGPT); you can use a transfer set instead.
Here's a reference: Hinton, Geoffrey, Oriol Vinyals, and Jeff Dean. "Distilling the Knowledge in a Neural Network." arXiv preprint arXiv:1503.02531 (2015). https://arxiv.org/pdf/1503.02531
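To make the idea concrete, here's a minimal sketch of the soft-target loss from that paper: the teacher's logits are softened with a temperature, and the student is trained to match those soft probabilities via cross-entropy. Pure Python, no framework; the logit values are made up for illustration.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T produces softer distributions,
    # exposing the teacher's relative confidence across wrong classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # Cross-entropy between the teacher's soft targets and the student's
    # soft predictions, both computed at the same temperature T.
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    return -sum(p * math.log(q) for p, q in zip(p_teacher, p_student))

# One transfer-set example (hypothetical logits):
teacher = [5.0, 1.0, 0.5]   # large model's logits
student = [4.0, 1.5, 0.2]   # small model's logits
loss = distillation_loss(student, teacher)
```

In practice this term is usually mixed with the ordinary hard-label cross-entropy when ground-truth labels are available for the transfer set, but the soft-target part above is the distillation-specific piece.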
Ran into these two videos today: AI Crash Report: The Money Furnace, AI Crash Report: The Physics of the Collapse.
Full of generated images, but seems to carry a good message.