
top 7 comments
[email protected] 3 points 1 year ago

You've heard of RAM in a processor? Now you've got a processor in RAM.

[email protected] 2 points 1 year ago

"You got ram in my processor" "You got processor in my ram"

[email protected] 1 point 1 year ago

Yo dawg, I heard you like RAM with your processors, so we put processors with your RAM so you can process your RAM much faster and have more RAM.

[email protected] 1 point 1 year ago

Shit, soon we'll have to put RAM in the processor in the RAM, which will need its own processor to go in the RAM in the processor in the RAM...

[email protected] 2 points 1 year ago

Needs a RAM radiator with a big fan.

[email protected] 1 point 1 year ago

So glue like 8 GB of RAM to the inside of the CPU package... nifty.

[email protected] 1 point 1 year ago

Yup, seems like the move. Anything we can do to offload the work of loading and sifting through models will greatly increase their efficiency. Mythic AI has already succeeded in running models on analog chips, though their chips are pretty specific to one use case (Mythic targets AI recognition in real-time video; surveillance, basically). But the approach should be fairly easy to adapt to any type of model; it just needs to be popularized.
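
To make "analog chips working with models" a bit more concrete, here's a minimal sketch of the idea behind analog matrix math: weights sit on the chip as conductances, inputs arrive as voltages, and the output currents sum for free (Ohm's law plus Kirchhoff's current law), so the multiply-accumulate never touches a tensor core. This is not Mythic's actual design; the shapes, precision, and noise level below are made-up assumptions for illustration.

```python
import numpy as np

# Sketch of an analog crossbar doing one layer's matrix-vector multiply.
# All numbers here are illustrative assumptions, not Mythic's real specs.
rng = np.random.default_rng(0)

weights = rng.standard_normal((256, 784))   # a layer's weight matrix
x = rng.standard_normal(784)                # input activations (as voltages)

# Analog storage is low-precision: quantize weights to ~8-bit conductance levels.
levels = 256
w_min, w_max = weights.min(), weights.max()
steps = np.round((weights - w_min) / (w_max - w_min) * (levels - 1))
w_analog = steps / (levels - 1) * (w_max - w_min) + w_min

# Kirchhoff's law sums the per-cell currents along each line: electrically
# this is just I = G @ V. Add a little read noise to mimic the analog domain.
noise = rng.normal(scale=0.01, size=256)
y_analog = w_analog @ x + noise

y_digital = weights @ x                      # exact digital reference
print(f"max absolute error vs digital: {np.abs(y_analog - y_digital).max():.3f}")
```

The trade-off shown here is the whole point: you accept quantization and read noise in exchange for getting the multiply-accumulate essentially for free in the electrical domain.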

Analog looks really useful since it's built for one highly specific kind of math, and if we could pair an analog processor with a dedicated RAM module, the VRAM needed for AI could theoretically plummet. Right now we're brute-forcing it with Tensor cores: load a model, have its results delivered, and that's 400 watts for the duration... Mythic's analog chips use only 3.5 watts and deliver results more quickly.
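
Back-of-the-envelope version of that power claim: the 400 W and 3.5 W figures are from above, but the comment gives no latencies, so the per-inference times below are pure assumptions for illustration. Even so, the watt gap alone works out to roughly two orders of magnitude per inference.

```python
# Energy per inference = power * time. Power figures from the comment;
# latencies are ASSUMED for illustration, not measured numbers.
GPU_POWER_W = 400.0
ANALOG_POWER_W = 3.5

gpu_latency_s = 0.020      # assumed 20 ms per inference on the GPU
analog_latency_s = 0.010   # assumed 10 ms on the analog chip ("more quickly")

gpu_joules = GPU_POWER_W * gpu_latency_s            # 8.0 J
analog_joules = ANALOG_POWER_W * analog_latency_s   # 0.035 J

print(f"GPU:    {gpu_joules:.3f} J per inference")
print(f"Analog: {analog_joules:.3f} J per inference")
print(f"Ratio:  ~{gpu_joules / analog_joules:.0f}x less energy")
```

Under these assumed latencies that's roughly a 229x energy saving per inference; the exact ratio moves with the latencies, but the power gap dominates either way.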

Anyway, the future is looking promising. Current implementations of AI are mediocre, and their biggest hurdle has been the sheer amount of energy they take. The benefits go up immensely if the energy cost comes down, and the quality of AI is only going to get better from here. Rather than shun or abhor the technology, we should probably be looking at ways to embrace it without it needing its own electric grid.