this post was submitted on 09 Dec 2024
30 points (100.0% liked)

TechTakes

1489 readers
79 users here now

Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

founded 2 years ago

Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Semi-obligatory thanks to @dgerard for starting this.)

top 50 comments
[–] [email protected] 20 points 2 weeks ago (1 children)

just delivered a commission on why bitcoin is very like hawk tuah

this season's word is: kleptokakistocracy

[–] [email protected] 19 points 1 week ago* (last edited 1 week ago) (12 children)

Can we all take a moment to appreciate this absolutely wild take from Google's latest quantum press release (bolding mine) https://blog.google/technology/research/google-willow-quantum-chip/

Willow’s performance on this benchmark is astonishing: It performed a computation in under five minutes that would take one of today’s fastest supercomputers 10^25 or 10 septillion years. If you want to write it out, it’s 10,000,000,000,000,000,000,000,000 years. This mind-boggling number exceeds known timescales in physics and vastly exceeds the age of the universe. It lends credence to the notion that quantum computation occurs in many parallel universes, in line with the idea that we live in a multiverse, a prediction first made by David Deutsch.

The more I think about it, the stupider it gets. I'd love it if someone with an actual physics background were to comment on it. But my layman take is that it reads as nonsense to the point of being irresponsible scientific misinformation, whether or not you believe in the many-worlds interpretation.
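Not that it takes a quantum computer to check the press release's arithmetic, at least. A throwaway Python sanity check (the ~1.38×10^10-year age of the universe is the standard estimate, not a figure from Google's post):

```python
# Sanity-check the press release's arithmetic, not its physics.
claimed_years = 10**25                      # "10 septillion years"
written_out = 10_000_000_000_000_000_000_000_000
age_of_universe_years = 1.38e10             # standard estimate, ~13.8 billion years

assert claimed_years == written_out
print(claimed_years / age_of_universe_years)  # ~7.2e14 ages of the universe
```

So yes, the number "vastly exceeds the age of the universe" by about fourteen orders of magnitude; the multiverse leap is doing all the work on its own.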

[–] [email protected] 20 points 1 week ago* (last edited 1 week ago) (4 children)

"Quantum computation happens in parallel worlds simultaneously" is a lazy take trotted out by people who want to believe in parallel worlds. It is a bad mental image, because it gives the misleading impression that a quantum computer could speed up anything. But all the indications from the actual math are that quantum computers would be better at some tasks than at others. (If you want to use the names that CS people have invented for complexity classes, this imagery would lead you to think that quantum computers could whack any problem in EXPSPACE. But the actual complexity class for "problems efficiently solvable on a quantum computer", BQP, is known to be contained in PSPACE, which is strictly smaller than EXPSPACE.) It also completely obscures the very important point that some tasks look like they'd need a quantum computer — the program is written in quantum circuit language and all that — but a classical computer can actually do the job efficiently. Accepting the goofy pop-science/science-fiction imagery as truth would mean you'd never imagine the Gottesman–Knill theorem could be true.

To quote a paper by Andy Steane, one of the early contributors to quantum error correction:

The answer to the question ‘where does a quantum computer manage to perform its amazing computations?’ is, we conclude, ‘in the region of spacetime occupied by the quantum computer’.

[–] [email protected] 17 points 1 week ago* (last edited 1 week ago) (3 children)

your regular reminder that the guy with de facto ownership over the entire Rust ecosystem outside of the standard library and core is very proud about being in Peter Thiel’s pocket (and that post is in reference to this article)

e: on second thought I’m being unfair — he owns the conferences and the compiler spec process too

[–] [email protected] 16 points 1 week ago

reddit just launched an LLM integrated into the site. hardly any point going through what garbage these things are by now, but of course it failed the first test I gave it

miscounting "r"s in "strawberry"
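For the record, the boring deterministic answer, no LLM required (plain Python):

```python
# The classic LLM trip-up question, answered the non-stochastic way.
word = "strawberry"
r_count = word.count("r")
print(r_count)  # 3
```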

[–] [email protected] 16 points 1 week ago (9 children)

So it turns out the healthcare assassin has some... boutique... views. (Yeah, I know, shocker.) Things he seems to be into:

  • Lab-grown meat
  • Modern architecture is rotten
  • Population decline is an existential threat
  • Elon Musk and Peter Thiel

How soon until someone finds his LessWrong profile?

[–] [email protected] 21 points 1 week ago (2 children)

the absolute state of american politics: rentseeker ceo gets popped by a libertarian

[–] [email protected] 17 points 1 week ago (3 children)

We should expect more of this to come. The ascendant right wing is pushing policies that only deliver for people who are already stinking rich. Even if 99% of those who vote that way go along with the propaganda line in the face of their own disappointment, that's still a lot of unhappy people, who are not known for intellectual consistency or calm self-reflection, in a country overflowing with guns. All it takes is one ammosexual who decides that his local Congressman has been co-opted by the (((globalists))), you know?

[–] [email protected] 14 points 1 week ago (28 children)

StRev was calling him TPOT adjacent and woodgrains was having a bit of a panic over it.

[–] [email protected] 20 points 1 week ago (4 children)

"Our righteous warriors are only supposed to kill brown people and women, not captains of industry!!"

[–] [email protected] 14 points 1 week ago

dust specks vs CEOs

[–] [email protected] 14 points 1 week ago

It's so embarrassing to watch upper-middle class (at best!) rationalists get their panties in a twist over Luigi. At least the right wing talking heads are getting paid, these guys are just mad he did things instead of tweeting about things.

[–] [email protected] 13 points 1 week ago

"My heavens, our self-regarding supremacist ideology can't possibly imply violence... can it???"

[–] [email protected] 14 points 1 week ago* (last edited 1 week ago) (1 children)

On the third day of OpenAI my true ~~love~~ enemy gave to me ~~three french hens~~ Sora.

The version of Sora we are deploying has many limitations. It often generates unrealistic physics and struggles with complex actions over long durations.

"12 days of OpenAI" lol. Such marketing.

Big eye roll to this part too:

We’re introducing our video generation technology now to give society time to explore its possibilities and co-develop norms and safeguards that ensure it’s used responsibly as the field advances.

[–] [email protected] 10 points 1 week ago

It's half-assed on purpose, and that's a good thing! We swear!

Credit where it's due to the writer who somehow polished that filthy turd. You can almost believe it's the normal sort of deceptive marketing.

[–] [email protected] 14 points 1 week ago (2 children)

OK so we're getting into deep rat lore now? I'm so sorry for what I'm about to do to you. I hope one day you can forgive me.

LessWrong diaspora factions! :blobcat_ohno:

https://transmom.love/@elilla/113639471445651398

if I got something wrong, please don't tell me. gods I hope I got something wrong. "it's spreading disinformation" I hope I am

[–] [email protected] 10 points 1 week ago* (last edited 1 week ago) (6 children)

My pedantic notes, colored by my own experiences and beliefs, so bla bla epistemic status, take with a grain of salt, etc. Please don't take this as a correction, just some minor notes and small things. As a general 'trick more people into staring into the abyss' guide it is a good post; mine is more an addition, I guess.

SSC / The Motte: Scott Alexander's devotees. once characterised by interest in mental health and a relatively benign, but medicalised, attitude to queer and especially trans people. The focus has since metastasised into pseudoscientific white supremacy and antifeminism.

This is a bit wrong tbh; SSC was always anti-feminist. Scott's old (now deleted) LiveJournal writings, where he talks about larger discussion/conversation tactics in a broad meta way, the meditations on superweapons, always had the object-level idea of attacking feminism. For example, using the Wayback Machine, the sixth meditation (this is the one I have bookmarked). He himself always seems to have had a bit of a love/hate relationship with his writings on anti-feminism and the fame and popularity they brought him.

The grey tribe bit is missing that guy (in Silicon Valley, I think) who called himself grey tribe and wanted to team up with the red tribe to get rid of all the progressives. Might be important to note, because they look like centrists but, shock horror, they team up with the right to do far-right stuff.

I think the extropians might even have different factions, like the one around Natasha Vita-More/Max More. But that one is a bit more LW-adjacent, and it predates LW rather than being a spinoff faction (the extropian mailing list came first, iirc). Singularitarians and extropians might be a bit closer together; Kurzweil wrote The Singularity Is Near, after all, which is the book all these folks seem to get their AI doom ideas from (if you ever see a line made up of S-curves, that is from that book). Kurzweil is also an exception among these people in that he actually has achievements: he built machines for the blind, image recognition things, etc.; he isn't just a writer. Nick Bostrom is also missing, it seems; he is one of those X-risk guys. Also missing is Robin Hanson, who created the great filter idea and the prediction markets thing, and whose Overcoming Bias is a huge influence on Rationalism; he could be considered the part of Rationalism less focused on science fiction ideas. But that was all a bit more 2013 (check the 2013 'map of the world of Dark Enlightenment' on the RationalWiki Neoreaction page).

"the Protestants to the rationalists' Catholicism" I lolled.

Note that a large part of sneerclubbers are (were) not ex-rationalists, nor people who were initially interested in it. It actually started on Reddit because badphil got so many rationalist submissions that they created a spinoff (at least so the story goes), so it was started by people who actually had some philosophy training. (That also makes us the most academic faction!)

Another minor thing in a long list of minor things: it might also be useful to mention that RationalWiki has nothing to do with these people and is more aligned with the sneerclub side.

There are also so many Scotts. Anyway, this post grew a bit out of my control, sorry for that; hope it doesn't come off too badly, and do note that my additions make a short post way longer, so they're prob not that useful. Don't think any of your post was misinformation btw. (I do think that several of these factions wouldn't call themselves part of LW, and there is a bit of a question of who influenced whom; the Mores seem to be outside of all this, for example, and a lot of extropians predate it, etc. But that kind of nitpicking is for people who want to write books on these people.)

E: reading the thread, this is a good post and good to keep in mind btw. I would add, beyond what you mentioned, not mocking people for personal tragedy, as some people end or lose their lives due to rationalism, or have MH episodes, and we should be careful to treat those topics well. Which we mostly try to do, I think.

[–] [email protected] 14 points 1 week ago (1 children)

That assassin in New York has turned out to be an EA hanger-on.

[–] [email protected] 25 points 1 week ago* (last edited 1 week ago)

The whole internet loves CEO Murderer, a handsome murderer that targets CEOs! *5 seconds later* We regret to inform you the murderer is into AI

[–] [email protected] 14 points 1 week ago (1 children)

the grok AI is now available to free twitter users, evidently not enough paying users were interested

it's somewhat more tedious than Gemini and that's saying something

[–] [email protected] 14 points 1 week ago* (last edited 1 week ago) (1 children)

Lots of developments in the UHC shooter case

  • They (probably) caught him

  • He's even hotter without his three identical jackets

  • He's a Thielhead

Not very happy with how this day's going

https://xcancel.com/pepmangione

[–] [email protected] 12 points 1 week ago* (last edited 1 week ago)

He was radicalized by Tim Urban, and is into AGI

[–] [email protected] 12 points 1 week ago* (last edited 1 week ago) (1 children)

Adam Christopher comments on a story in Publishers Weekly.

Says the CEO of HarperCollins on AI:

"One idea is a “talking book,” where a book sits atop a large language model, allowing readers to converse with an AI facsimile of its author."

Please, just make it stop, somebody.

Robert Evans adds,

there's a pretty good short story idea in some publisher offering an AI facsimile of Harlan Ellison that then tortures its readers to death

Kevin Kruse observes,

I guess this means that HarperCollins is getting out of the business of publishing actual books by actual people, because no one worth a damn is ever going to sign a contract to publish with an outfit with this much fucking contempt for its authors.

[–] [email protected] 12 points 1 week ago (10 children)

I ended up on Know Your Meme because folks are throwing around TPOT like we're all terminally online and all understand Twitter lingo.

For anyone else confused: it stands for "that part of Twitter". It's techbro "philosophists" like Peter Thiel.

[–] [email protected] 13 points 1 week ago (1 children)

yeah, TPOT is specifically the uwu smol bean neoreactionary race scientists

[–] [email protected] 11 points 1 week ago (1 children)

I ended up on Know Your Meme because folks are throwing around "uwu" like we’re all terminally online and all understand Twitter lingo.

[–] [email protected] 15 points 1 week ago (8 children)

looking up uwu on know your meme "uwu," what is this

[–] [email protected] 11 points 1 week ago* (last edited 1 week ago) (4 children)

Someone made a... thing where you run all your mastodon toots through a prompt before posting them, and thought it would be a great idea to let us lemmy users know. Comes with an ugly autoplag image on the GitHub. The names of the "bots" are hilarious: "ennui" for low-effort posts; I don't think that word means what they think it means. "Legion" for automatic spam posts, 'cause that's not a dogwhistle or anything.

load more comments (4 replies)
[–] [email protected] 11 points 1 week ago (6 children)

Saw something about "sentiment analysis" in text. While writers have discussed "death of the author" and philosophers and linguists have discussed what it even means to derive meaning from text, these fucking AI dorks are looking at text in a vacuum and concluding "this text expresses anger".

print("I'm angry!")

the above python script is angry, look at my baby skynet
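To belabor the joke: a toy bag-of-words "analyzer" (entirely made up here, not any real product) shows how token-matching in a vacuum flags quoted, negated, and printed anger alike:

```python
# A deliberately naive bag-of-words "sentiment analyzer": it reads tokens,
# not meaning, so quotation, negation, and irony are all invisible to it.
ANGRY_WORDS = {"angry", "furious", "mad"}

def naive_sentiment(text: str) -> str:
    # Split on whitespace, strip surrounding punctuation, lowercase.
    tokens = {t.strip("!\"'.,()").lower() for t in text.split()}
    return "anger" if tokens & ANGRY_WORDS else "neutral"

print(naive_sentiment("I'm not angry at all"))   # anger (negation ignored)
print(naive_sentiment('print("I\'m angry!")'))   # anger (it's just source code)
```

Baby skynet indeed.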

[–] [email protected] 10 points 1 week ago (4 children)

Openai are you angry? Yes -> it is angry. No -> it is being sneaky, and angry.
