this post was submitted on 09 Feb 2026
17 points (94.7% liked)
TechTakes
News story from 2015:
The letter was issued by the Future of Life Institute, which is now Max Tegmark and Toby Walsh's organization.
People have worked on the general pop culture that inspired TESCREAL, and on the current hype, but less on earlier attempts to present machine minds as a clear and present danger. This letter has the 'arms race' narrative and the 'research ban' as its proposed solution, but focuses on smaller dangers.
Oh hey. I remember this. I was confused at the time how it seemed to almost come out of left field, and how some of the names ended up on the same letter.
Now I recognise all those names from the Epstein files, although some were only mentions rather than direct participants.
lmao the shade
If you follow world politics, it has been obvious that Noam Chomsky has been a useful idiot since the 1990s, and probably since the 1970s. I wish he had learned from the Khmer Rouge that not everyone who the NYT says is a bad guy is a good guy!
Oh absolutely. It's frankly shocking how wrong he's been about so many things for so, so long. He's also managed to pen the most astonishingly Holocaust-denial-coded diatribe I've ever read from (ostensibly) a non-Holocaust-denier. I guess his overdeveloped genocide-denial muscle was twitching!
The point about heavy artillery is actually pretty salient, though a more thorough examination would also note that "Lethal Autonomous Weapons Systems" is a category that includes goddamn land mines. Of course this would serve to ground the discussion in reality and is thus far less interesting to people who start organizations like the Future of Life Institute.
I'm pretty sure LAWS exist right now, even without counting landmines. Automatic human targeting and friend/foe distinction aren't exactly cutting edge technologies.
The biggest joke to me is that none of these systems are anywhere near cost-efficient on the scale of a Kalashnikov. Ukraine is investing heavily into all kinds of drones, but that's because they're trying to be casualty-efficient, and it's all operator-based. No one wants the 2M€ treaded land-drone to randomly open fire on a barn and expose its position to a circling 5k€ kamikaze drone.