🤖 v1.0.0-beta.15: Local LLMs!
You can now configure local LLMs in peersky://settings/llm
It comes with Qwen2.5-Coder 3B as the default model.
The LLM APIs are currently available to built-in apps such as peersky://p2p/editor/ and peersky://p2p/ai-chat/
Thanks to @agregore and @mauve for the support!
Docs: https://github.com/p2plabsxyz/peersky-browser/blob/main/docs/LLM.md
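For a feel of what calling the local model from an app could look like, here is a minimal sketch. The `window.llm` object, its `chat` method, and the response shape are all assumptions (modeled loosely on the API style of Agregore, credited above); the linked docs/LLM.md describes the real surface.

```javascript
// Hypothetical sketch of an app on a peersky:// page asking the local LLM.
// ASSUMPTION: the browser injects a `window.llm` object with a `chat` method
// that takes OpenAI-style messages and resolves to { choices: [...] }.
async function askLocalModel(prompt) {
  const response = await window.llm.chat({
    messages: [{ role: "user", content: prompt }],
  });
  // Return the text of the first completion choice.
  return response.choices[0].message.content;
}
```

An app like peersky://p2p/ai-chat/ could then wire this to a text input and render the returned string, with the default Qwen2.5-Coder 3B model answering locally.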
What’s next?
https://github.com/p2plabsxyz/peersky-browser/issues/97
@lutindiscret @peersky @agregore That could be done via an extension pretty easily. Do you want a pre-selected set of languages to translate to or just a prompt to pop up for you to fill in?