"Fund my company and your child might live to adulthood and/or have sperm that glows green."
Am I reading this right? Is he suggesting doing experiments on the suicidally depressed?
One thing he failed to mention here is that all software fails eventually. So imagine one of those fancy graphics scenes he commissioned turning into a 100+ car pileup because a software glitch caused a car to make a sudden right-hand turn onto a street that doesn't exist.
This wasn't a free win, however. The reason Harris took over in the first place was that Biden's debate performance was poor enough that the Democrats thought he no longer had any chance, and that was already amid falling approval ratings. Had he stayed in, I bet Trump's margin of victory would have been even wider.
Awful.systems defaults to dark mode.
Newsom also signed AB 1008, which clarifies that any personal data fed to an AI model retains the same privacy rights it would otherwise — including the consumer’s right to correct and delete their personal information. That’ll be a fun one to implement.
I think what it actually clarifies is that personal information generated by an AI model is now covered under the law, instead of just what is used as training data.
Investors demand growth. The problem is that Microsoft has basically won Capitalism and has no real room left to grow into.
Obligatory note that, speaking as a rationalist-tribe member, to a first approximation nobody in the community is actually interested in the Basilisk and hasn’t been for at least a decade.
Sure, but that doesn't change the fact that the head EA guy wrote an op-ed for Time magazine arguing that a nuclear holocaust is preferable to a world that has GPT-5 in it.
60% of the time, it works 100% of the time.
Probably would have been easier when the context window wasn't 128k.
Though what the point would be, should someone actually achieve that, eludes me a bit.
ShakingMyHead
Also, if you're worried about digital clones being tortured, you could just... not build it. Like, it can't hurt you if it never exists.
Imagine that conversation:
"What did you do over the weekend?"
"Built an omnicidal AI that scours the internet and creates digital copies of people based on their posting history and whatnot and tortures billions of them at once. Just the ones who didn't help me build the omnicidal AI, though."
"WTF why."
"Because if I didn't the omnicidal AI that only exists because I made it would create a billion digital copies of me and torture them for all eternity!"
Like, I'd get it more if it were a "we accidentally made an omnicidal AI" thing. But this is supposed to be a very deliberate action taken by humanity: ensuring the creation of an AI designed to torture digital beings based on real people, in the specific hope that it doesn't torture digital beings based on them.