🤖 v1.0.0-beta.15: Local LLMs!

You can now configure local LLM models in peersky://settings/llm.
It ships with Qwen2.5-Coder 3B as the default model.
The APIs are currently available to apps such as peersky://p2p/editor/ and peersky://p2p/ai-chat/.
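
Here's a minimal sketch of how a page like peersky://p2p/ai-chat/ might call the local model, assuming a window.llm-style bridge similar to Agregore's. The method name and response shape below are assumptions, not the documented API, so check the docs linked below for the real surface.

```typescript
// Hypothetical sketch (not the documented API): assumes Peersky injects a
// window.llm bridge, similar to Agregore's, into pages like peersky://p2p/ai-chat/.
async function askLocalModel(prompt: string): Promise<string> {
  const llm = (window as any).llm;
  if (!llm) {
    throw new Error('Local LLM API is not available on this page');
  }
  // Assumed chat-style call; the default model is Qwen2.5-Coder 3B.
  const response = await llm.chat({
    messages: [{ role: 'user', content: prompt }],
  });
  // Assumed response shape; adjust to whatever the real API returns.
  return typeof response === 'string' ? response : response?.content ?? '';
}

askLocalModel('Summarize this note in one sentence.').then(console.log);
```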

Thanks to @agregore and @mauve for the support!

Docs: github.com/p2plabsxyz/peersky-

What’s next?
github.com/p2plabsxyz/peersky-


@lutindiscret @peersky @agregore That could be done via an extension pretty easily. Do you want a pre-selected set of languages to translate to or just a prompt to pop up for you to fill in?


@mauve @peersky @agregore A submenu in the context menu: Translate to >

English
Français
Other

The last one opens a prompt to choose from all supported languages.

I guess the languages listed in the menu should default to the ones in the Accept-Language header, but maybe a config panel would be useful too (see the sketch below).
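
For what it's worth, here's a rough sketch of that submenu as a WebExtension background script using the standard contextMenus API. It assumes Peersky's upcoming extension support will expose contextMenus and the same window.llm bridge to extensions, which isn't confirmed yet; the "Other…" picker UI is left out.

```typescript
// Sketch only: assumes the upcoming extension support exposes the standard
// WebExtensions contextMenus API and a window.llm bridge to extensions.
declare const browser: any; // provided by the extension runtime

// Pre-selected targets; ideally these would default to the Accept-Language header.
const LANGUAGES = ['English', 'Français'];

browser.contextMenus.create({ id: 'translate-to', title: 'Translate to', contexts: ['selection'] });
for (const lang of LANGUAGES) {
  browser.contextMenus.create({
    id: `translate:${lang}`,
    parentId: 'translate-to',
    title: lang,
    contexts: ['selection'],
  });
}
browser.contextMenus.create({
  id: 'translate:other',
  parentId: 'translate-to',
  title: 'Other…',
  contexts: ['selection'],
});

browser.contextMenus.onClicked.addListener(async (info: any) => {
  const id = String(info.menuItemId);
  if (!id.startsWith('translate:') || !info.selectionText) return;
  // "Other…" should open a picker over all supported languages; omitted here.
  if (id === 'translate:other') return;
  const target = id.slice('translate:'.length);
  // Hypothetical call into the local LLM bridge to do the actual translation.
  const result = await (window as any).llm.chat({
    messages: [{ role: 'user', content: `Translate to ${target}:\n\n${info.selectionText}` }],
  });
  console.log(result);
});
```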

@mauve @lutindiscret @peersky @agregore Yeah! We'll be shipping extension support this month or in January.

Pretty excited to work on the P2P extensions store: github.com/p2plabsxyz/peersky-
