A new report on Sunday reiterates the rumor that Apple's AI push in iOS 18 will focus on privacy, with processing done directly on the iPhone rather than in the cloud.
That’s what they all say. But a lot of these so-called AI features require more power than a phone has. Offloading to a server is sometimes a must.
Quantised models can be surprisingly small. And if Apple aren’t targeting general-purpose LLMs for local use, smaller, task-specific models absolutely can run on device.
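To illustrate why quantised models are so much smaller: the basic idea is mapping float32 weights down to int8 plus a shared scale factor, cutting memory roughly 4x. A minimal sketch (symmetric per-tensor quantization, plain Python for clarity; real frameworks do this per-channel with calibration):

```python
def quantize_int8(weights):
    """Map float weights to int8 values plus one float scale (symmetric quantization)."""
    scale = max(abs(w) for w in weights) / 127.0
    quantized = [round(w / scale) for w in weights]  # each value now fits in one byte
    return quantized, scale

def dequantize(quantized, scale):
    """Approximate the original floats at inference time."""
    return [q * scale for q in quantized]

weights = [0.42, -1.27, 0.05, 0.91]
q, scale = quantize_int8(weights)

# float32 uses 4 bytes per weight; int8 uses 1 — roughly a 4x reduction
fp32_bytes = len(weights) * 4
int8_bytes = len(q) * 1
```

The trade-off is precision: dequantized weights only approximate the originals, which is why aggressive quantization can degrade output quality on harder tasks.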
That said, given the precedent set by Siri, their next evolution of Siri into an LLM will almost certainly require a network connection and be executed server-side.
Sure, if you’re running large models like GPT you need a server, but smaller models tailored to specific use cases can absolutely run on phones. Whether or not they get their implementation right is a different story, though.