this post was submitted on 19 Nov 2023

Apple


A place for Apple news, rumors, and discussions.

[–] [email protected] 1 points 10 months ago (1 children)

An obvious question is whether they want to beef up their Neural Engine to better handle AI workloads, i.e., whether they're going to add on-device LLM runtime support?

[–] [email protected] 1 points 10 months ago (1 children)

LLM, emphasis on the Large. A standard LLM would take up so much storage space that the user wouldn't have much left, to say nothing of the continuous processing power needed.

More likely: Siri + a local "small" language model. It could have full access to on-device data, while more general queries are handled remotely.
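
The hybrid idea above could be sketched as a simple router: answer on-device when a query only touches local data, otherwise fall back to a remote model. This is purely a toy illustration; the topic list and function names are hypothetical, not any real Apple API.

```python
# Toy sketch of hybrid local/remote routing (all names hypothetical).
# A query mentioning on-device data stays local; anything else goes remote.

LOCAL_TOPICS = {"contacts", "calendar", "messages", "photos"}

def route_query(query: str) -> str:
    """Return 'local' if the query touches on-device data, else 'remote'."""
    words = set(query.lower().split())
    return "local" if words & LOCAL_TOPICS else "remote"

print(route_query("read my latest messages"))  # local
print(route_query("who won the world cup"))    # remote
```

A real system would presumably use a learned classifier rather than keyword matching, but the privacy split is the same: personal data never needs to leave the device.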

[–] [email protected] 1 points 10 months ago

That does seem far more likely, I would agree.