Apple’s In-House Technologies Team Still Has Plenty of Work to Do
(www.bloomberg.com)
An obvious question is whether they want to beef up the Neural Engine to better handle AI workloads, and whether they're going to add on-device LLM runtime support.
LLM, emphasis on the Large. A standard LLM would take up a LOT of storage space, to the point where the user wouldn't have much left, to say nothing of the continuous processing power needed.
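For a rough back-of-envelope (these are illustrative assumptions, not figures for any specific Apple model): weight storage scales directly with parameter count and precision, so even a mid-sized model eats a phone-sized chunk of flash unless it's heavily quantized.

```swift
// Back-of-envelope: on-disk size of model weights at a given precision.
// Parameter counts and byte widths below are illustrative assumptions.
func weightSizeGB(parameters: Double, bytesPerParameter: Double) -> Double {
    parameters * bytesPerParameter / 1_000_000_000
}

print(weightSizeGB(parameters: 7e9,  bytesPerParameter: 2))    // ~14 GB at 16-bit
print(weightSizeGB(parameters: 70e9, bytesPerParameter: 2))    // ~140 GB at 16-bit
print(weightSizeGB(parameters: 7e9,  bytesPerParameter: 0.5))  // ~3.5 GB at 4-bit quantization
```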
Siri + local "Small" Language Model. It could have full access to on-device data, while more generalized queries are done remotely.
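Something like this, hypothetically, where the device decides whether a query stays local or goes to a server (the routing function and its parameters are made up for illustration):

```swift
// Hypothetical router: keep queries that touch on-device data local,
// send generalized knowledge queries to a remote service.
enum QueryRoute {
    case onDeviceSmallModel   // small language model with access to local data
    case remoteLargeModel     // larger model in the cloud for general knowledge
}

func route(touchesPersonalData: Bool, isNetworkAvailable: Bool) -> QueryRoute {
    if touchesPersonalData || !isNetworkAvailable {
        return .onDeviceSmallModel
    }
    return .remoteLargeModel
}

// "When is my next calendar event?" -> stays on device
print(route(touchesPersonalData: true, isNetworkAvailable: true))
// "Who won the 1998 World Cup?" -> goes remote
print(route(touchesPersonalData: false, isNetworkAvailable: true))
```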
That does seem far more likely, I would agree.