submitted 2 years ago by [email protected] to c/[email protected]
[-] [email protected] 1 points 2 years ago

An obvious possibility is that they want to beef up their Neural Engine to better handle AI workloads. Are they going to add on-device LLM runtime support?

[-] [email protected] 1 points 2 years ago

LLM, emphasis on the Large. A standard LLM would take up A LOT of storage space, to the point where the user wouldn't have much left, to say nothing of the continuous processing power needed.
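As a rough back-of-the-envelope sketch of the storage point (the 7B parameter count and precisions here are illustrative assumptions, not figures from the thread):

```python
# Back-of-the-envelope storage estimate for LLM weights.
# Assumption: a hypothetical 7B-parameter model, sized at two
# common storage precisions.

def weight_size_gb(num_params: float, bytes_per_param: float) -> float:
    """Approximate on-disk size of the model weights in gigabytes."""
    return num_params * bytes_per_param / 1e9

params = 7e9  # 7 billion parameters (illustrative mid-size model)

fp16 = weight_size_gb(params, 2.0)  # 16-bit floats
int4 = weight_size_gb(params, 0.5)  # 4-bit quantized

print(f"fp16: {fp16:.1f} GB, 4-bit: {int4:.1f} GB")
```

Even aggressively quantized, the weights alone eat several gigabytes of a phone's storage, before counting runtime memory.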

Siri + a local "Small" Language Model. It could have full access to on-device data, while more generalized queries are handled remotely.
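The hybrid idea above could be sketched as a simple router: personal-data queries go to the small on-device model, everything else falls back to a remote service. All names here are hypothetical illustrations, not any real Apple API:

```python
# Hypothetical sketch of hybrid query routing: a small on-device
# model handles queries touching personal data, while general
# queries go to a larger remote model. Keyword matching stands in
# for a real intent classifier.

ON_DEVICE_TOPICS = {"calendar", "contacts", "photos", "messages"}

def route_query(query: str) -> str:
    """Return 'local' for personal-data queries, 'remote' otherwise."""
    words = set(query.lower().split())
    if words & ON_DEVICE_TOPICS:
        return "local"   # small model with full access to device data
    return "remote"      # larger server-side model for general knowledge

print(route_query("show my calendar for tomorrow"))  # local
print(route_query("explain general relativity"))     # remote
```

A real system would use an intent classifier rather than keywords, but the privacy split is the same: personal data never leaves the device.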

[-] [email protected] 1 points 2 years ago

That does seem far more likely, I would agree.

this post was submitted on 19 Nov 2023