I think in the next couple of years OS-shipped #LocalAI will replace much of the heavy cloud-based #AI in use today. Microsoft, Google, and soon Apple will be shipping devices with local LLMs, and it'll be cheaper for applications to target those APIs than to pay OpenAI or the like. This will also mean we get into a sort of "browser wars" over model functionality gated by hardware vendors.
For example, Chrome is working on shipping web APIs for LLM access. I'm planning to release something similar in @agregore in the next week or two.
https://github.com/explainers-by-googlers/prompt-api/blob/main/chrome-implementation-differences.md
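For a taste of what that looks like from a web app's side, here's a rough sketch based on my reading of the Prompt API explainer. The `LanguageModel` global and method names come from the proposal and could easily change before anything ships, so treat this as a hedged illustration, not the final API:

```javascript
// Hedged sketch of the proposed Prompt API (names per the explainer;
// subject to change). Feature-detect so non-supporting runtimes degrade.
async function summarize(text) {
  if (typeof LanguageModel === "undefined") {
    return null; // no local model API in this browser/runtime
  }
  const session = await LanguageModel.create(); // spin up a local-model session
  const reply = await session.prompt(`Summarize in one sentence: ${text}`);
  session.destroy(); // free the model resources when done
  return reply;
}
```

The nice part is that an app written against something like this doesn't care whether the model underneath is Gemini Nano, Phi, or whatever Agregore wires in — which is exactly the interop fight I expect.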
@hermeticvm @agregore Oh snap. Is their API stable? Have you tried it out?
@hermeticvm @agregore Ohhh I see. This is for the built-in LLM UI they have. I'm working on JavaScript APIs so web apps can have access.
@mauve Brave already supports custom Ollama endpoints.
Quite cool.
@agregore