200fifty

joined 1 year ago
[–] [email protected] 6 points 6 months ago

I beg your pardon?

[–] [email protected] 8 points 6 months ago

Wow, I guess humans and LLMs aren't so different after all!

[–] [email protected] 7 points 6 months ago (4 children)

Psst, check the usernames of the people in this thread!

[–] [email protected] 10 points 6 months ago

yes, computing systems use energy. If our energy grid is overly reliant on the burning of fossil fuels that release harmful emissions, that doesn’t mean we need to stop the advancement of our computers. It means we need to stop using so much fossil fuels in our grid.

Now where have I heard something like this before? I'm trying to think of something, but I just can't quite seem to remember...

[–] [email protected] 19 points 6 months ago* (last edited 6 months ago)

RationalWiki is an index maintained by the Rationalist community

Lies and slander! I get why he'd assume this based on the name, but it would be pretty funny if the rationalists were responsible for the RationalWiki articles on Yudkowsky et al., since IIRC they're pretty scathing

[–] [email protected] 29 points 6 months ago (3 children)

"I know not with what technology GPT-6 will be built, but GPT-7 will be built with sticks and stones" -Albert Einstein probably

[–] [email protected] 8 points 6 months ago* (last edited 6 months ago) (1 children)

is this trying to say "discrimination against racists is the real racism"? ... Would that be "racismism"?

[–] [email protected] 11 points 6 months ago* (last edited 6 months ago)

this reads like someone googled a list of Gen Z slang and then threw it in a blender with a bunch of weird race-science memes. Who is this for?

I think the only acceptable response to whoever is responsible for it is a highly aggressive "touch grass"

[–] [email protected] 19 points 6 months ago (2 children)

I think they were responding to the implication in self's original comment that LLMs were claiming to evaluate code in-model, and that calling out to an external Python evaluator is 'cheating.' But as far as I know it's actually pretty common for them to evaluate code using an external interpreter, so I think the response was warranted here.

That said, that fact honestly makes this vulnerability even funnier, because it means they are basically just letting the user dump whatever code they want into eval() as long as it's laundered by the LLM first, which is a high-school-level mistake.
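
To illustrate the anti-pattern being described (a minimal sketch with made-up names like `llm_generate_code` and `run_python_tool`, not any particular product's actual code): a "code interpreter" tool that hands model-generated code straight to the host interpreter, so anything a user can coax the model into emitting runs with the server's privileges.

```python
# Hypothetical sketch of the vulnerability described above.
# Function names are invented for illustration.

def llm_generate_code(user_message: str) -> str:
    """Stand-in for the LLM: the model emits Python for its
    'code interpreter' tool, and the user can steer what it emits."""
    # A prompt-injected request comes back as "helpful" code.
    return user_message

def run_python_tool(code: str) -> None:
    """The high-school-level mistake: trust whatever the model produced
    and hand it to exec() on the host, with no sandboxing."""
    exec(code)  # arbitrary code execution, laundered through the LLM

# The "user" asks the assistant to run something, and it dutifully does:
payload = "import os; print(os.listdir('/'))"  # could just as well be anything
run_python_tool(llm_generate_code(payload))
```

The usual mitigation, for what it's worth, is to run the tool's interpreter inside an isolated sandbox (a throwaway container or VM with no access to the host filesystem or network), rather than calling eval()/exec() in the serving process itself.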

[–] [email protected] 23 points 6 months ago (2 children)

Look, you gotta forgive this guy for coming up with an insane theory that doesn't make sense. After all, his brain was poisoned by testosterone, so his thinking skills have atrophied. An XXL hat size can only do so much, you know.

[–] [email protected] 17 points 6 months ago* (last edited 6 months ago) (1 children)

They need the rationalist musical cannons for the upcoming performance of the Rationalist 1812 Overture
