this post was submitted on 10 Mar 2025
TechTakes
I actually like the argument here, and it's nice to see it framed in a new way that might avoid tripping the sneer detectors of people inside, or on the edges of, the bubble.

As I've said several times here, machine learning and AI are legitimately very good at pattern recognition and reproduction. So good, in fact, that a lot of their problems (including the confabulations of LLMs) come from identifying and reproducing the wrong pattern in the training data rather than whatever aspect of the real world the model was supposed to derive from that data.

But even granting that, there's a whole world of cognitive processes that a pattern-reproducer can imitate but not replicate. Under the industrial model of education we've built, a straight-A student is largely a really good pattern-reproducer, better than any extant LLM, while the sort of work that pushes the boundaries of science forward relies on entirely different processes.
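The "reproducing the wrong pattern" failure mode is easy to show at toy scale. Here's a minimal sketch (nothing to do with any real LLM; the corpus and names are invented for illustration): a bigram model that only learns which word follows which will splice locally valid fragments into fluent but false sentences, because it reproduces surface patterns with no model of the world behind them.

```python
import random

# Toy bigram "language model": it learns which word follows which
# in the training text, and nothing else.
corpus = (
    "the capital of france is paris . "
    "the capital of spain is madrid . "
    "paris is famous for the eiffel tower . "
    "madrid is famous for the prado museum ."
).split()

# Count bigram transitions: word -> list of observed next words.
transitions = {}
for a, b in zip(corpus, corpus[1:]):
    transitions.setdefault(a, []).append(b)

def generate(start, n=8, seed=0):
    """Reproduce learned word patterns starting from `start`."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(n):
        nxt = transitions.get(out[-1])
        if not nxt:
            break
        out.append(rng.choice(nxt))
    return " ".join(out)

# Every adjacent word pair in the output occurred in training, yet the
# whole sentence can still be a confabulation (e.g. it can produce
# "the capital of spain is paris"): the model recombines locally valid
# patterns with no notion of which combinations are true.
print(generate("the", n=6, seed=1))
```

The point of the toy: nothing in the output is "hallucinated" in the sense of coming from nowhere; every piece is a faithfully reproduced training pattern, just the wrong one for the context.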