Cool, I guess I now have hands-on experience getting LLMs to interact with the internet and system resources, in case folks want to hire me to do stuff like that.

Fully offline and local, with open source models, and without high-end hardware or GPUs.

@fleeky using open source language models and a "harness" which lets the model invoke external functions, which then pipe results back into its context so it can resume generation
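
Roughly, that loop looks like the sketch below. This is a minimal Python illustration, assuming a generic run_model(prompt) callable standing in for whatever local open source model is used; the CALL name(json_arg) convention and the toy tools are made up for illustration, not the actual setup described in the thread.

```python
# Minimal "harness" sketch: generate, detect a tool call, run it,
# pipe the result back into the model's context, resume generation.
# run_model() and the CALL ... convention are assumptions for illustration.
import json
import re
from typing import Callable, Dict

TOOL_CALL = re.compile(r"CALL\s+(\w+)\((.*)\)")

def harness(
    user_prompt: str,
    run_model: Callable[[str], str],          # wrapper around the local model
    tools: Dict[str, Callable[[str], str]],   # external functions the model may invoke
    max_steps: int = 5,
) -> str:
    context = user_prompt
    output = ""
    for _ in range(max_steps):
        output = run_model(context)
        match = TOOL_CALL.search(output)
        if match is None:
            return output                      # no tool call: the model is finished
        name, raw_arg = match.groups()
        arg = json.loads(raw_arg)              # assume a single JSON-encoded argument
        result = tools[name](arg) if name in tools else f"unknown tool: {name}"
        # Append the tool result so the model can continue from it.
        context += output[: match.end()] + f"\nRESULT: {result}\n"
    return output

# Toy usage with a fake model stub, just to show the control flow:
if __name__ == "__main__":
    toy_tools = {"echo": lambda text: text.upper()}
    fake_model = lambda ctx: "final answer" if "RESULT" in ctx else 'CALL echo("hi")'
    print(harness("say hi loudly", fake_model, toy_tools))
```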
