this post was submitted on 09 Nov 2023
Apple
A place for Apple news, rumors, and discussions.
They mean the 16 Pro, right? Because in all likelihood, the 16 will use the A17 Pro or something equivalent.
I'm also not confident that a phone chip will be strong enough to run an LLM like ChatGPT within the next 5 years. I'm aware of LLMs that can run on phones right now, but they're slow and incapable of tasks that require a moderate level of thought. I'm not optimistic about the capabilities of these exclusive features, if they are coming.
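For context, the on-device models people run today are small quantized builds served by runtimes like llama.cpp. Here's a minimal sketch of that kind of local inference using the llama-cpp-python bindings; the model filename and settings are just illustrative, and an actual phone deployment would go through a mobile runtime rather than this desktop API:

```python
# Minimal sketch of local inference with llama-cpp-python.
# The model filename is illustrative; any small quantized GGUF model works.
from llama_cpp import Llama

# Load a small 4-bit quantized model with a modest context window.
llm = Llama(model_path="phi-2.Q4_K_M.gguf", n_ctx=2048)

# Single-turn completion: small models handle this fine, but tend to
# fall apart on tasks that need several steps of reasoning.
out = llm("Explain in two sentences why small local models feel slow:", max_tokens=128)
print(out["choices"][0]["text"])
```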
Why not just use ChatGPT in the browser?
Latency? Privacy? Integration with on-device data? Server costs?