🤖 v1.0.0-beta.15: Local LLMs!
You can now configure local LLM models in peersky://settings/llm
It comes with Qwen2.5-Coder 3B as the default model.
The LLM APIs are currently available to apps such as peersky://p2p/editor/ and peersky://p2p/ai-chat/
Thanks to @agregore and @mauve for the support!
Docs: https://github.com/p2plabsxyz/peersky-browser/blob/main/docs/LLM.md
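For app developers, here's a rough idea of what prompting the local model from an app page could look like. This is only a minimal sketch assuming a window.llm.chat()-style API along the lines of Agregore's; the actual method names and options are in the LLM.md docs above.

```typescript
// Hypothetical window.llm surface -- the real API is documented in docs/LLM.md.
interface LLMApi {
  chat(opts: {
    messages: { role: 'system' | 'user' | 'assistant'; content: string }[];
  }): Promise<{ content: string }>;
}

// The local model (Qwen2.5-Coder 3B by default) is exposed to app pages,
// so no API keys or network round-trips are involved.
const llm = (window as unknown as { llm?: LLMApi }).llm;

async function summarize(text: string): Promise<string> {
  if (!llm) throw new Error('Local LLM API not available on this page');
  const { content } = await llm.chat({
    messages: [
      { role: 'system', content: 'Summarize the user text in one sentence.' },
      { role: 'user', content: text },
    ],
  });
  return content;
}
```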
What’s next?
https://github.com/p2plabsxyz/peersky-browser/issues/97
@lutindiscret @peersky @agregore That could be done via an extension pretty easily. Do you want a pre-selected set of languages to translate to or just a prompt to pop up for you to fill in?
@mauve @lutindiscret @peersky @agregore Yeah! We'll be shipping extension support this month or in January.
Pretty excited to work on the P2P extensions store: https://github.com/p2plabsxyz/peersky-browser/issues/42
@mauve @peersky @agregore A submenu in the context menu: Translate to >
English
Français
Other
The last one opens a prompt to choose from all supported languages.
I guess the default languages listed in the menu should be the ones from the Accept-Language header, but maybe a config panel would be useful.
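For what it's worth, here's a rough sketch of how that submenu could be wired up once extension support lands, assuming it exposes the standard WebExtensions contextMenus and i18n APIs. The language labels and the final translate step are placeholders, not an actual implementation.

```typescript
// Sketch of a "Translate to >" context-menu extension (background script).
// Assumes standard WebExtensions APIs; `browser` comes from the runtime
// (or webextension-polyfill types when compiling).
declare const browser: any;

// Placeholder labels for the pre-selected languages shown directly in the menu.
const LABELS: Record<string, string> = { en: 'English', fr: 'Français' };

async function buildMenu(): Promise<void> {
  // Default the listed languages to the user's Accept-Language preferences.
  const accepted: string[] = await browser.i18n.getAcceptLanguages(); // e.g. ['en-US', 'fr']
  const presets = Array.from(new Set(accepted.map((l) => l.split('-')[0])))
    .filter((l) => LABELS[l]);

  browser.contextMenus.create({ id: 'translate-to', title: 'Translate to', contexts: ['selection'] });
  for (const lang of presets) {
    browser.contextMenus.create({
      id: `translate-${lang}`,
      parentId: 'translate-to',
      title: LABELS[lang],
      contexts: ['selection'],
    });
  }
  // "Other" would open a prompt listing all supported languages.
  browser.contextMenus.create({
    id: 'translate-other',
    parentId: 'translate-to',
    title: 'Other…',
    contexts: ['selection'],
  });
}

browser.contextMenus.onClicked.addListener((info: any) => {
  const target =
    info.menuItemId === 'translate-other'
      ? null // placeholder: show a picker for all supported languages
      : String(info.menuItemId).replace('translate-', '');
  // Placeholder: hand info.selectionText and the target language to the local LLM.
  console.log('translate', info.selectionText, 'to', target ?? 'user choice');
});

buildMenu();
```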