this post was submitted on 09 Nov 2023

TechTakes


nitter archive

just in case you haven't done your daily eye stretches yet, here's a workout challenge! remember to count your reps, and to take a break between paragraphs! duet your score!

oh and, uh.. you may want to hide any loose keyboards before you read this. because you may find yourself wanting to throw something.

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago) (1 children)

tfw your drunken friday night shellscripts are someone else's whole startup model[0]

it is astounding just how convoluted and self-twisted these people manage to make their arguments, though. like, this thing: literally billions of investment over time, multiple decamillions (if not more) in actual in-the-rack compute, however fucking much else in supporting infra (power, IP, routing, and the rest of the cumulative layer cake of goog's investment over the years), all of it actual fucking computers that are probably already running virtual machines to run the containers that run your model code... and then you get this kind of fucking remark out of it

it takes some incredibly creative point-dodging

[0] - unironically I've had this feel a non-zero number of times

[–] [email protected] 3 points 1 year ago (1 children)

it’s fucking bizarre to see this many supposed computer science experts forget how much of the field is built on complexity theory and optimization, because things get fucking weird when you stop analyzing complexity (simulate-an-entire-alternate-universe weird, which is exactly where we’ve seen a lot of these AI fools end up)

of course my FPGA obsession lately has been designing a lambda calculus reducer that runs on hardware, a ridiculously inefficient (in terms of both cycles and bytes) way to do general computation. I don’t claim that work’s practical though, unlike these breathless AI fuckers pretending an LLM doing inefficient computation (on top of a multitude of layers of efficient computation) is revolutionary
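
(for anyone wondering what a lambda calculus reducer actually boils down to, here's a minimal software sketch of a normal-order beta reducer in Haskell. it's purely illustrative, has nothing to do with the FPGA design mentioned above, and every name in it is made up for the example)

```haskell
-- minimal sketch: untyped lambda terms with named variables
data Term
  = Var String
  | Lam String Term
  | App Term Term
  deriving (Show, Eq)

-- free variables of a term
freeVars :: Term -> [String]
freeVars (Var x)   = [x]
freeVars (Lam x b) = filter (/= x) (freeVars b)
freeVars (App f a) = freeVars f ++ freeVars a

-- capture-avoiding substitution: replace free occurrences of x with s
subst :: String -> Term -> Term -> Term
subst x s (Var y)
  | y == x    = s
  | otherwise = Var y
subst x s (App f a) = App (subst x s f) (subst x s a)
subst x s (Lam y b)
  | y == x              = Lam y b                      -- binder shadows x, stop
  | y `elem` freeVars s = Lam y' (subst x s (subst y (Var y') b))
  | otherwise           = Lam y (subst x s b)
  where
    -- rename the binder so it can't capture a free variable of s
    y' = head [v | n <- [(0 :: Int) ..]
                 , let v = y ++ show n
                 , v `notElem` freeVars s ++ freeVars b]

-- one normal-order (leftmost-outermost) beta step, if any redex exists
step :: Term -> Maybe Term
step (App (Lam x b) a) = Just (subst x a b)
step (App f a) = case step f of
  Just f' -> Just (App f' a)
  Nothing -> App f <$> step a
step (Lam x b) = Lam x <$> step b
step (Var _)   = Nothing

-- keep stepping until nothing reduces; diverges on terms with no normal form
normalize :: Term -> Term
normalize t = maybe t normalize (step t)

main :: IO ()
main = print (normalize (App applyTo (Lam "y" (Var "y"))))
  where
    -- (\f. \x. f x) applied to the identity function; prints Lam "x" (Var "x")
    applyTo = Lam "f" (Lam "x" (App (Var "f") (Var "x")))
```

normal order is the textbook "inefficient but always finds a normal form if one exists" strategy, which is roughly the point: a handful of lines of general computation, no data centers required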

[–] [email protected] 3 points 1 year ago

@self @froztbyte Crypto Dorks, Elon Stans, AI True Believers. All of these fuckers desperately wanting the object of their devotion to be Actually Useful