I think in the next couple of years, OS-shipped local models will replace the use of heavy cloud-based ones. Microsoft, Google, and soon Apple will be shipping devices with local LLMs, and it'll be cheaper for applications to target those APIs than to pay OpenAI or the like. This will also mean we'll get into a sort of "browser wars" of model functionality gated by hardware vendors.

I don't think cloud AI will fully go away, but I think it'll make less and less sense for consumer-facing use cases as small models become more viable through better training and better hardware acceleration.

For example, Chrome is working on shipping web APIs for LLM access. I'm planning to release something similar in @agregore in the next week or two.

github.com/explainers-by-googl
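
Roughly, the shape these browser LLM APIs are converging on looks something like the sketch below. The exact names (ai.languageModel, create, prompt) are still shifting in the explainer and will likely differ in Agregore, so treat every identifier here as a placeholder rather than a final API:

```ts
// Illustrative sketch only: identifiers are assumptions, not a shipped API.
async function summarize(text: string): Promise<string> {
  // Feature-detect: a browser without a local model simply won't expose the API.
  const ai = (globalThis as any).ai;
  if (!ai?.languageModel) throw new Error("No local language model available");

  // Create a session backed by the browser/OS-shipped model, then prompt it.
  const session = await ai.languageModel.create();
  const result = await session.prompt(`Summarize in one sentence:\n${text}`);
  session.destroy?.();
  return result;
}
```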
