this post was submitted on 18 Nov 2023

Technology


A tech news sub for communists

founded 2 years ago

Their charter: https://openai.com/charter

OpenAI is the company behind ChatGPT, among other AI products. I try to keep myself out of the loop when it comes to AI because I end up hearing about it anyway, so I wasn't aware of this charter.

For the unaware, AGI stands for Artificial General Intelligence. It basically means a form of AI that is extremely advanced and general-purpose, like human intelligence. For contrast, ChatGPT and Stable Diffusion (for example) are highly specialised: the former generates text responses to text input, and the latter generates images in response to text input.

Despite both of these AI technologies being very impressive (even if their proprietors try to obscure the training and energy costs), the path to achieving AGI is pretty much inconceivable at present. Current AI technologies may have exploratory value on the way to AGI in some far future, but AGI is most likely not going to be built upon currently existing technology; it is going to be a different beast altogether, provided it ever exists in the first place.

Given this, I find it absolutely baffling that OpenAI talks about AGI the way they do. This is the same level of delusion as Elon Musk talking about Mars colonisation. But given that techbros see themselves as the stewards of the next step in civilisational evolution, I guess it should come as no surprise that they eat this shit up uncritically.

I'm not sure what role these generative AIs will play in the near future. I am trying to figure out whether they will primarily be sold to corporations to cut labour costs or to end users to boost productivity. But talk of AGI, AI singularity, and far-fetched shit like that is a pure marketing stunt.

[–] [email protected] 1 points 1 year ago (2 children)

The problem with current AI models is the training time and energy use.

As computers become more powerful and specialist hardware starts to mature, training on the fly will become a thing, and this will be a massive step towards AGI.

Our brain has different regions that all specialise in different things, and I expect AGI will be similar. Things like ChatGPT could be laying the foundation for the speech centres.

[–] [email protected] 8 points 1 year ago (1 children)

It could, but it also could not. It's too early to say and too up in the air, despite how impressive the technology is. That's why I think it's very arrogant of them to talk like they are gonna be the ones to usher in AGI.

[–] [email protected] -1 points 1 year ago

Every bit of research helps with these things.

ChatGPT is a pretty massive breakthrough on its own as well.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (2 children)

Moore's law (and apparently now Moore himself, lol) has been dead for a while, and cores can't really get much faster due to power dissipation. On the other hand, there are some really harsh physical and theoretical limits to parallelism (besides the points in that article, I've heard it gets incredibly slow to share a memory bus across the same memory once you go past about 16 cores). I wouldn't place any bet that we're going to get much more computing power than what we have.
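The classic theoretical limit being alluded to here is Amdahl's law: whatever fraction of a workload stays serial caps the speedup no matter how many cores you throw at it. A quick sketch (the 95%-parallel figure is just an illustrative assumption):

```python
# Amdahl's law: speedup on n cores when a fraction p of the work
# is parallelisable. The serial remainder (1 - p) caps the speedup
# at 1 / (1 - p), no matter how large n gets.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

for n in (2, 16, 1024):
    print(n, round(amdahl_speedup(0.95, n), 1))
# 2    -> 1.9
# 16   -> 9.1
# 1024 -> 19.6   (never exceeds 1 / 0.05 = 20)
```

Even with 95% of the work parallelisable, 1024 cores buy less than a 20x speedup, which is why just adding cores isn't a substitute for faster ones.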

The solution (to most of computing, not just AI) right now is to roll up our sleeves and start writing actually efficient software, because we can no longer expect the next generation of Intel processors to make our optimisations redundant.

That includes developing (possibly slightly worse but) more compute-efficient ML models, but the big AI players are allergic to that because they rely on funding from, or are directly owned by, the biggest cloud server providers.
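One standard example of the "slightly worse but more compute-efficient" trade-off is weight quantisation: storing model weights as 8-bit integers instead of 32-bit floats. A minimal NumPy sketch (the matrix here is random toy data, not a real model):

```python
import numpy as np

# Symmetric 8-bit quantisation of a toy "weight matrix":
# map the largest |weight| to 127 and round everything to int8.
rng = np.random.default_rng(0)
weights = rng.standard_normal((256, 256)).astype(np.float32)

scale = np.abs(weights).max() / 127.0          # one scale for the whole tensor
q = np.round(weights / scale).astype(np.int8)  # 1 byte per weight vs 4
dequant = q.astype(np.float32) * scale         # approximate reconstruction

print(q.nbytes / weights.nbytes)               # 0.25 -- a quarter of the memory
print(float(np.abs(weights - dequant).max()))  # worst-case error <= scale / 2
```

The model gets a little less precise (every weight moves by up to half a quantisation step) in exchange for 4x less memory traffic, which is exactly the kind of trade the comment is describing.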

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago)

They are looking at analogue chips as a potential solution.

https://www.newscientist.com/article/2388005-analogue-chips-can-slash-the-energy-used-to-run-ai-models/

I think fabricating chips for a specific AI model, rather than running it on a generic chip, will be a potential solution.

[–] [email protected] 2 points 1 year ago

> start writing actually efficient software

We should be writing efficient software regardless.