this post was submitted on 16 Feb 2026
25 points (90.3% liked)
TechTakes
Tante.cc writes about Cory using a 'Drunk Uncle'-style argument to defend his LLM usage (and going after the left using strawmen).

(To counter one of Cory's arguments: if disliking LLMs were just about the people who run them, the people against them would have stayed in sneerclub).
That was a good read.
Cory Doctorow wrote:
Equating what LLMs do, and what goes into LLM web scraping, with "a search engine" is messed up. The article he links about scraping is mostly about how badly copyright works and how analysing trade-secret-walled data can be beneficial both to consumers and to science (but occasionally bad for citizen privacy), which you'll recognize as mostly irrelevant to the concerns people actually tend to have: LLM training-data providers DDoSing the fuck out of everything, and all the rest of the stuff tante does a good job of explaining.
Cory also provides this anecdote:
what the actual shit
edit: I mean, he tried transformer-powered voice-to-text, liked it, and now he's all in on the "LLMs are a rigorous and accurate tool, actually" bandwagon?
Also the web scraping article is from 2023 but CD linked it in the recent pluralistic post so I assume his views haven't changed.
This one hurts. Maybe CD can be brought back around but oof.
In the post he keeps referring to Ollama as an LLM (it's a desktop app that runs a local server, letting you download a local LLM and interface with it via CLI or HTTP API), so it's possible he's just so far behind in his technical understanding of LLMs that he's resorted to taking the wrong people's word for it.
The post certainly reads like he doesn't even know which local LLM he's using, let alone what it takes to make one.
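To make the Ollama-vs-LLM distinction concrete: Ollama is the runner, and the model is just a field in each request. A minimal sketch (the model name `llama3` and the prompt here are made-up examples; the `/api/generate` endpoint and JSON shape are from Ollama's documented REST API, which listens on localhost:11434 when the app is running):

```python
import json

def build_generate_request(model: str, prompt: str) -> str:
    """Build the JSON body Ollama's POST /api/generate endpoint expects.

    Ollama itself is just the server; which LLM answers is decided
    per-request by the "model" field.
    """
    return json.dumps({"model": model, "prompt": prompt, "stream": False})

# Same app, different LLMs -- only the "model" field changes:
body = build_generate_request("llama3", "Summarize this post.")
print(body)
```

Sending that body to `http://localhost:11434/api/generate` (with a model already pulled) is what actually invokes the LLM; the app and the model are separate things.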