this post was submitted on 09 Nov 2023

Apple


A place for Apple news, rumors, and discussions.

[–] Silicon_Knight@alien.top 1 points 2 years ago (1 children)

X to doubt. Apple "usually" plans these changes ahead of time so other devices can take advantage of them. Not saying that's necessarily the case here, but there have been times when it was.

[–] mamimapr@alien.top 1 points 2 years ago

ChatGPT is just one year old, and it has only recently become feasible to run LLMs locally on-device. Apple couldn't really have planned for it hardware-wise.

A common requirement for these LLMs and Stable Diffusion models is a large amount of memory, which Apple has always been stingy with. Only with the newer hardware do I foresee them increasing base memory, if they want to make AI more accessible.
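To put a rough number on that memory point, here is a minimal back-of-the-envelope sketch in Python. The 7B/13B parameter counts and quantization levels are illustrative assumptions on my part, not figures from the thread, and the estimate covers only the weights (no KV cache, OS, or app overhead).

```python
def weights_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate memory needed just to hold the model weights, in GiB."""
    bytes_per_weight = bits_per_weight / 8
    return params_billions * 1e9 * bytes_per_weight / 1024**3

# Common local-LLM sizes (e.g. Llama-style 7B / 13B) at fp16, int8, and 4-bit quantization.
for params in (7, 13):
    for bits in (16, 8, 4):
        print(f"{params}B @ {bits}-bit: ~{weights_gb(params, bits):.1f} GiB")
```

Even a 7B model quantized to 4 bits comes out around 3.3 GiB for the weights alone, which is why an 8 GB base configuration feels tight once the OS and other apps are factored in.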