I'm glad someone else was able to coherently discuss how ass-backwards Saltman's response has been. Like, if anything the fact that he responds to this moment by talking up the importance of democracy over emerging technologies should just be evidence before some distant future revolutionary tribunal that he knows his company is literally Sauron (okay, maybe more the Witch-King of Angmar than Sauron) and doesn't care because he wants to be the one wearing the ring at the end of the day.

From the second post:

A seasoned security leader would never build a defensive program and then measure offensive capability only, making remediation a second-class story. That is the kind of dog and pony show that any good security initiative would slam the door on. Or it’s like a surgeon telling you they have an even sharper scalpel to cut you deeper and faster. Yeah, so then what?

Dark and paranoid thought: given that Anthropic very recently ran into issues with their defense contracts, are they playing up their offensive capabilities targeting a notoriously tech- and security-illiterate political establishment to try and force their way back into those sweet government contracts as an impossible-to-ignore offensive tool? I mean we've talked about how the cash burn rate for all these companies is sufficiently absurd that it's going to take something truly crazy to turn these companies self-sustaining before the world runs out of investor money, and military and intelligence budgets are notorious for dragging ludicrous amounts of public money into a dark alley where nobody can see what's happening to it.

Now that's not fair. It's based on a third-derivative of Advanced Homework: The Game.

Psychoanalysis really does seem to push the most obnoxious boundary in academic language. On one hand, it is legitimately valuable to create a specific framework that enables experts to talk about technical elements of the field. It reminds me of the old IT rant about users who think "turn on the computer" means "turn the screen on, no need to touch the actual computer part". But at the extreme it creates opacity for its own sake and makes it hard for people who haven't devoted their careers to the field to understand what's being done. Particularly in a medical or psychiatric field, where the patient is by definition in a lower-information group than the person treating them, this amounts to making it hard for the patient to understand (and therefore consent to) what is being done to them. I am by no means immune to the simple pleasure of knowing something that other people don't, especially when the outside world reaffirms the value of that knowledge, and there is definitely a place for the specificity that this kind of jargon enables, but psychoanalysis seems to consistently stretch it too far.


To distract us from the ongoing cycle of violence and discourse about violence that neither cracks down on nor addresses its causes, may I offer the fruit of today's YouTube rabbit hole:

AI isn't the future. It's medieval alchemy.

This feels somehow tied to the whole "agentic" thing I've ranted about previously. Like, individual acts of violence are strictly destructive because the people doing it aren't sufficiently "agentic" to change things, even though American history is full of cases where (usually racist) vigilante violence had a huge impact on people's decision-making. But when the government does it, it's different, because people in government got there by proving their agency and ability to actually impact the world. It feels almost like he's offended that the NPCs might try to do something as drastic as killing someone without GM permission.

Meanwhile in reality, people legitimately do feel like they don't have a lot of options to protect themselves from the real harms this industry is doing, to say nothing of the people who buy his line about the oncoming class-K end-of-life scenario. Anger is an appropriate response to the circumstances we find ourselves in, and in a nation that has been quietly cultivating a culture of heroic violence for decades we shouldn't be surprised to see people trying to inflict that fear and rage upon the outside world.

Maybe it's just because I'm rolling back through Age of Mythology, but I died laughing at "it's like the centaur, Helen"

It's a willful refusal to actually consider the consequences of their beliefs, which is deeply ironic for a bunch that prides itself on its hardcore consequentialism. Like, even if you just mean "if anyone builds it, everyone dies" as a simple cause and effect, that should imply some kind of action, unless you don't think everyone dying would be bad actually.

I've never understood how these things are supposed to gain their abilities through statistical analysis of all kinds of random writing online, including social media, fanfic, reddit, etc., and yet simultaneously end up as experts rather than a much faster and more agreeable dumbass. Like, the training data may include all the great works of literature, all the scrapable scientific studies and textbooks they could steal, and so on. But it also includes every moron who ever shared conspiracy theories on Twitter, every confident-sounding business idiot on LinkedIn, and every stupid word that Scott or Yud ever wrote. Surely the bullshit has to exceed the expertise by raw volume, and if they took the time and energy to curate it out the way they would need to in order to correct that, they wouldn't be left with a large enough sample to actually scale off of.

Basically, either I'm dramatically misunderstanding something or the best we can hope for is the Average Joe on Reddit, who may not be a complete dumbass but definitely isn't a team of PhDs.

You know, I had let time blindness eat the context that Saltman has been saying the same doomer critihype about basically every product they've developed going back years. Like, I knew it but I hadn't really processed it.

I just... I have no words for how dumb literally everyone - apparently including myself - is that he keeps getting away with this shit.

The decision theory stuff itself ought to be called out more for playing pretty fast and loose with reality to begin with. "If you have a supercomputer that perfectly simulates blah blah blah" is such a fundamentally bad premise, because once you presume such a thing exists you're committing to the same basic metaphysical problems you would if you replaced the computer with God. In particular, I think it commits you to hard determinism, at which point there's no sense arguing about what the right action is, because the answer was set in stone not just before you entered the room but when the initial state of the universe was set up. Like, there's a version of this where the question is meaningful, in which case the premise is impossible, and a version where we accept the premise as given and render the question pointless. Why are you doing decision theory in a hypothetical world where nobody really makes decisions?

Or we could acknowledge that yudkowskian decision theory is just singularity apologetics and accept the impossible elements of the premise on faith.


Apparently we get a shout-out? Sharing this brings me no joy, and I am sorry for inflicting it upon you.


I don't have much to add here, but I know when she started writing about the specifics of what Democrats are worried about being targeted for their "political views" my mind immediately jumped to members of my family who are gender non-conforming or trans. Of course, the more specific you get about any of those concerns the easier it is to see that crypto doesn't actually solve the problem and in fact makes it much worse.

