@BenHouston3D Is the Claude dependency due to specific features, or is there a chance we could use it with OpenAI-compatible APIs like ollama or deepseek?
@mauve Have you tried this tool? Did you get any usable results?
@BenHouston3D I use local models via ollama exclusively :P Usually I assume an OpenAI compatible backend and allow users to specify a URL/API key.
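For reference, a minimal sketch of that pattern, assuming the official `openai` TypeScript client and ollama's default local OpenAI-compatible endpoint (`http://localhost:11434/v1`); the env var names and the model are just illustrative placeholders, not anything from this tool:

```typescript
// Sketch: point the standard OpenAI client at any OpenAI-compatible server,
// with the URL, key, and model supplied by the user via env vars.
import OpenAI from "openai";

const client = new OpenAI({
  // ollama serves an OpenAI-compatible API at this URL by default
  baseURL: process.env.LLM_BASE_URL ?? "http://localhost:11434/v1",
  // ollama ignores the API key, but the client still requires a value
  apiKey: process.env.LLM_API_KEY ?? "ollama",
});

const response = await client.chat.completions.create({
  model: process.env.LLM_MODEL ?? "llama3", // placeholder model name
  messages: [{ role: "user", content: "Say hello in one sentence." }],
});

console.log(response.choices[0].message.content);
```

The same config works for deepseek or any other hosted OpenAI-compatible API: the user just swaps the base URL, key, and model name.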
@BenHouston3D I use it with aider, for example. It's a billion times dumber than Claude, but it can do small tasks entirely offline, which I think is good.
@mauve I did write an LLM-agnostic backend initially, but I removed it because Claude just seemed so good that the extra complexity and choice didn't seem worth it.
But if there is demand, I can add it back in.