[-] 200fifty@awful.systems 11 points 2 years ago* (last edited 2 years ago)

The bill mandates safety testing of advanced AI models and the imposition of “guardrails” to ensure they can’t slip out of the control of their developers or users and can’t be employed to create “biological, chemical, and nuclear weapons, as well as weapons with cyber-offensive capabilities.” It’s been endorsed by some AI developers but condemned by others who assert that its constraints will drive AI developers out of California.

Man, if I can't even build homemade nuclear weapons, what CAN I do? That's it, I'm moving to Nevada!

[-] 200fifty@awful.systems 10 points 2 years ago

Not even -- it's a simplified Civilization clone for mobile. (It actually sounds like a pretty neat little game, but, uh, chess it is not!)

[-] 200fifty@awful.systems 11 points 2 years ago

ngl his stuff always felt a bit cynical to me, in that it seemed to exist more to say "look, video games can have a deep message!" than it did to just have such a message in the first place. Like it existed more to gesture at the concept of meaningfulness rather than to be meaningful itself.

[-] 200fifty@awful.systems 11 points 2 years ago* (last edited 2 years ago)

I believe waitbutwhy came up before on old sneerclub, though in that case we were making fun of them for bad political philosophy rather than bad AI takes.

[-] 200fifty@awful.systems 11 points 2 years ago

It is always kind of bewildering to me though. Like, has no one ever explained to these people the health problems that highly-bred dogs tend to have? Have they never heard of 'hybrid vigor' or issues with smaller gene pools making populations more susceptible to disease? Were they just asleep during biology 101? I don't get how people who think they're so smart can have failed to consider even the most basic issues with planning to turn humanity into Gros Michel bananas.

[-] 200fifty@awful.systems 10 points 2 years ago* (last edited 2 years ago)

I mean they do throw up a lot of legal garbage at you when you set stuff up, I'm pretty sure you technically do have to agree to a bunch of EULAs before you can use your phone.

I have to wonder though if the fact Google is generating this text themselves rather than just showing text from other sources means they might actually have to face some consequences in cases where the information they provide ends up hurting people. Like, does Section 230 protect websites from the consequences of just outright lying to their users? And if so, um... why does it do that?

Even if a computer generated the text, I feel like there ought to be some recourse there, because the alternative seems bad. I don't actually know anything about the law, though.

[-] 200fifty@awful.systems 11 points 2 years ago

ok but for real... it's not great for finding actual answers to queries, but I find like 800x more interesting results with search.marginalia.nu than any other search engine. It's the only search engine that I find actively fun to just browse around on recreationally.

[-] 200fifty@awful.systems 11 points 2 years ago

I'm confused how this is even supposed to demonstrate "metacognition" or whatever? It's not discussing its own thought process or demonstrating awareness of its own internal state, it just said "this sentence might have been added to see if I was paying attention." Am I missing something here? Is it just that it said "I... paying attention"?

This is a thing humans already do sometimes in real life and discuss -- when I was in middle school, I'd sometimes put the word "banana" randomly into the middle of my essays to see if the teacher noticed -- so pardon me if I assume the LLM is doing this by the same means it does literally everything else, i.e. mimicking a human phrasing about a situation that occurred, rather than suddenly developing radical new capabilities that it has never demonstrated before even in situations where those would be useful.

[-] 200fifty@awful.systems 11 points 2 years ago

I feel like it was all over from the moment they made it talk in first person. No one had any illusions that Inferkit or NovelAI were general intelligences, because it was obvious that they were just language models autocompleting a sentence you typed in.

[-] 200fifty@awful.systems 10 points 2 years ago

It's like pickup artistry on a societal scale.

It really does illustrate the way they see culture not as, like, a beautiful evolving dynamic system that makes life worth living, but instead as a stupid game to be won, or a nuisance getting in the way of their world domination efforts.

[-] 200fifty@awful.systems 10 points 2 years ago

The problem is just transparency, you see -- if they could just show people the math that led them to determining that this would save X million more lives, then everyone would realize that it was actually a very good and sensible decision!

