The demand is real. People have seen what an unrestricted personal digital assistant can do.
The demand is real. People have seen what crack cocaine can do.
Dang, I want to find this article more relatable than I do. Most software I have dev experience with doesn't have the problem of relying on automated tests too much; it has the exact opposite problem.
And while I very much write tests for the dopamine high and the false sense of security that green checkmarks provide, I still prefer that to the very real sense of insecurity of not having tests at all.
Context: Post #4 in my sequence of private Lightcone Infrastructure memos edited for public consumption.
(emphasis mine)
Look at this goober, trying to be THE cult leader.
the model was supposed to be trained solely on his own art
Much simpler models are practically impossible to train without an existing model to build upon, and with GenAI it's safe to assume that training that base model included large-scale scraping without consent.
"The Camp of the Saints is a 1973 French dystopian fiction novel by author and explorer Jean Raspail. A speculative fictional account, it depicts the destruction of Western civilization through Third World mass immigration to France and the Western world."
More of a train whistle than a dog whistle, this one.
Personally I think a Vietnamese invasion of all EA compounds would solve a lot of problems and disband a couple of startups.
But aren't they used to dealing with VC?
Man I don't need to be reminded of the sorry state of meat alternatives.
It's bitterly funny to me that fashoid governments started banning cultivated meat, as if the economic and technical issues weren't enough. Ignoramuses terrified of threats they made up in their heads, as always.
The promptfans testing OpenAI Sora have gotten mad that it's happening to them and (temporarily) leaked access to the API.
https://techcrunch.com/2024/11/26/artists-appears-to-have-leaked-access-to-openais-sora/
“Hundreds of artists provide unpaid labor through bug testing, feedback and experimental work for the [Sora early access] program for a $150B valued [sic] company,” the group, which calls itself “Sora PR Puppets,” wrote in a post ...
"Well, they didn't compensate actual artists, but surely they will compensate us."
“This early access program appears to be less about creative expression and critique, and more about PR and advertisement.”
OK, I could give them the benefit of the doubt: maybe they're new to the GenAI space, or the general ML space ... or IT.
But I'm not going to. Of course it's about PR hype.
That article gave me whiplash. First part: pretty cool. Second part: deeply questionable.
For example, these two paragraphs from the sections 'problem with code' and 'magic of data':
“Modular and interpretable code” sounds great until you are staring at 100 modules with 100,000 lines of code each and someone is asking you to interpret it.
Regardless of how complicated your program’s behavior is, if you write it as a neural network, the program remains interpretable. To know what your neural network actually does, just read the dataset
Well, "just read the dataset bro" sound great sounds great until you are staring at a dataset with 100 000 examples and someone is asking you to interpret it.
I am neither left nor right wing, as I’m a libertarian
Ah, yes, the classic "I'm not like the other girls" of politics.
Automattic... that's why there are two t's!? Jesus Christ.
I'm curious, though, how much vibe coding and AI mandates are responsible for the latest disaster of a patch, or whether it's all just the result of a dysfunctional company culture.