🤖 v1.0.0-beta.15: Local LLMs!

You can now configure local LLM models at peersky://settings/llm.
It ships with Qwen2.5-Coder 3B as the default model.
The APIs are currently available to apps such as peersky://p2p/editor/ and peersky://p2p/ai-chat/.

Thanks to @agregore and @mauve for the support!

Docs: github.com/p2plabsxyz/peersky-

What’s next?
github.com/p2plabsxyz/peersky-


@lutindiscret @peersky @agregore That could be done via an extension pretty easily. Do you want a pre-selected set of languages to translate to or just a prompt to pop up for you to fill in?
