26
submitted 3 months ago by [email protected] to c/[email protected]

Framework just announced their Desktop computer: an AI powerhouse?

Recently I've seen a couple of people online trying to use a Mac Studio (or clusters of Mac Studios) to run big AI models, since the GPU can directly access the RAM. It seemed like an interesting idea to me, but the price of a Mac Studio makes it more of a fun experiment than a viable option I would ever try.

Now, Framework just announced their Desktop computer with the Ryzen AI Max+ 395 and up to 128GB of shared RAM (of which up to 110GB can be used by the iGPU on Linux). It can be bought for slightly below €3k, which is far less than the over €4k of a Mac Studio with apparently similar specs (and a better OS for AI tasks).
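
For a rough sense of what could actually fit in that 110GB, here's some napkin math (illustrative only; real usage depends on the quantization format, context length and runtime overhead):

```python
# Back-of-envelope: approximate memory needed for a few dense models at ~4-bit quantization.
# Assumes ~0.5 bytes per parameter plus ~15% overhead for KV cache and runtime buffers;
# the sizes and overhead factor are illustrative assumptions, not measured values.

def est_gib(params_billion, bytes_per_param=0.5, overhead=1.15):
    return params_billion * 1e9 * bytes_per_param * overhead / 2**30

for name, params_b in [("8B model", 8), ("70B model", 70), ("123B model", 123)]:
    print(f"{name}: ~{est_gib(params_b):.0f} GiB at ~4-bit")
```

Even a 123B-class model at 4-bit lands somewhere around 65-70 GiB, comfortably inside the 110GB the iGPU can address.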

What do you think about it?

top 13 comments
[-] [email protected] 10 points 3 months ago

I think it has potential but I would like to see benchmarks to determine how much. The fact that they have 5Gbps Ethernet and TB4 (or was it 5?) is also interesting for clusters.
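
As a rough sketch of why those links could be workable for pipeline-parallel clustering (all numbers assumed, nothing measured): only the activations of the current token have to cross the wire at each stage boundary.

```python
# Per-token transfer time across the link for pipeline-parallel inference.
# Assumes an 8192-dim hidden state in fp16 (illustrative for a 70B-class model);
# tensor parallelism would move far more data per token and is a different story.

HIDDEN_DIM = 8192
BYTES_PER_VALUE = 2                      # fp16 activations
payload = HIDDEN_DIM * BYTES_PER_VALUE   # bytes per token per pipeline boundary

for name, gbit in [("5 GbE", 5), ("USB4/TB4-class (~40 Gb/s)", 40)]:
    t_us = payload / (gbit * 1e9 / 8) * 1e6
    print(f"{name}: ~{t_us:.0f} µs per token per hop (plus link latency)")
```

So raw link bandwidth is mostly a non-issue for that kind of split; it's the per-hop latency that adds up.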

[-] [email protected] 2 points 3 months ago

Indeed!

(It should be TB4 if I remember correctly)

[-] [email protected] 8 points 3 months ago

Memory bandwidth is 256 GB/s, much less than the M4 Max (546 GB/s) or M2 Ultra (800 GB/s). Expect performance to reflect that.
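
A crude way to see what that means in practice: for dense models, every generated token has to stream roughly the whole set of weights from memory, so bandwidth divided by model size gives an upper bound on tokens per second (napkin math with an assumed model size):

```python
# Rough ceiling on token generation speed: tokens/s <= memory bandwidth / model size.
# MODEL_GB is an assumption (roughly a 70B model at 4-bit); real throughput is lower
# once KV cache reads, runtime overhead, etc. are taken into account.

MODEL_GB = 40

for name, bw_gb_s in [("Ryzen AI Max+ 395", 256), ("M4 Max", 546), ("M2 Ultra", 800)]:
    print(f"{name}: <= {bw_gb_s / MODEL_GB:.1f} tok/s")
```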

[-] [email protected] 8 points 3 months ago

It's comparable to the M4 Pro in memory bandwidth but has way more RAM for the price.

[-] [email protected] 5 points 3 months ago

Good point. You can't even get an M* Pro with 128GB. Only the Max and Ultra lines go that high, and then you'll end up spending at least twice as much.

[-] [email protected] 7 points 3 months ago* (last edited 3 months ago)

It's absolutely positioned to be a cheap AI PC. Mac Studio started gaining popularity due to its shared RAM, Nvidia responded with their home server thing, and now AMD responds with this.

It being way cheaper and potentially faster is huge. The bad news is that it's probably going to be scalped and out of stock for some time.

[-] [email protected] 4 points 3 months ago
[-] [email protected] 3 points 3 months ago* (last edited 3 months ago)

'Wonder how it will compare to NVIDIA Digits on price

Update: Depending on specs, it seems to be priced similarly - approx. 3000 EUR for a decent complete prosumer build.

Update 2: Holy boaty, it's modular and supports third-party hardware: https://frame.work/dk/en/products/desktop-mainboard-amd-ai-max300?v=FRAMBM0006

[-] [email protected] 3 points 3 months ago

Perhaps I'm mistaken, but I read this as "Nvidia takes 3k just for the chip" and Framework "2.5k for the whole system"?

Either way, exciting news for self-hosters in the coming years!

[-] [email protected] 2 points 3 months ago

Awesome! We need benchmarks ASAP!

[-] [email protected] 2 points 3 months ago

How is this preferable to just building your own ITX machine?

I understand the appeal of a Framework laptop, where getting a chassis and mainboard with extended hardware support is otherwise impossible. But how is this better than the existing options?

[-] [email protected] 2 points 3 months ago

It's probably equivalent to a custom ITX machine. The thing that impressed me was the CPU they used, which combined with fast RAM could be very good for hosting local LLMs.
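
For example, here's a minimal sketch of what that hosting could look like with llama-cpp-python (the model path and settings are hypothetical, and it assumes a llama.cpp build with a GPU backend such as Vulkan or ROCm so layers can be offloaded to the iGPU):

```python
# Minimal local-LLM sketch with llama-cpp-python (hypothetical model file and settings).
from llama_cpp import Llama

llm = Llama(
    model_path="models/some-70b-instruct.Q4_K_M.gguf",  # hypothetical GGUF on disk
    n_gpu_layers=-1,   # offload all layers to the iGPU's share of the unified RAM
    n_ctx=8192,        # context window; longer contexts use more of that memory
)

resp = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Give me a one-line summary of unified memory."}]
)
print(resp["choices"][0]["message"]["content"])
```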

[-] [email protected] 1 points 2 months ago

Yeah, pretty good price.
