LLM, AI
Been figuring out how to integrate an LLM into my flow. So far I'm thinking of running one of the Llama-based models locally. I tried one and it kinda sucks at code generation, but at least it runs entirely locally on my Steam Deck.
My UX flow is going to focus on using selected text in an X11 session: I'm using nerd-dictation to speak prompts to the model, fetch the "input" from the selected text, then have the generated "output" written back over the selection. A rough sketch of that loop is below.
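Here's a minimal sketch of what I mean, assuming xclip and xdotool are installed and a llama.cpp-style server is listening on localhost:8080 (the /completion endpoint, its JSON shape, and the port are assumptions, not a finished implementation):

```python
#!/usr/bin/env python3
"""Sketch: grab the X11 selection, run it through a local model,
and type the result back over the selection."""

import json
import subprocess
import sys
import urllib.request


def get_selection() -> str:
    # Read the PRIMARY selection (whatever text is currently highlighted).
    return subprocess.run(
        ["xclip", "-o", "-selection", "primary"],
        capture_output=True, text=True, check=True,
    ).stdout


def generate(prompt: str, selection: str) -> str:
    # POST the spoken prompt plus the selected text to a locally running
    # llama.cpp server (endpoint and response field are assumptions).
    body = json.dumps({
        "prompt": f"{prompt}\n\n{selection}",
        "n_predict": 256,
    }).encode()
    req = urllib.request.Request(
        "http://127.0.0.1:8080/completion",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["content"]


def type_over_selection(text: str) -> None:
    # Type the generated text into the focused window; since the text is
    # still selected, typing replaces it in most X11 apps.
    subprocess.run(["xdotool", "type", "--clearmodifiers", text], check=True)


if __name__ == "__main__":
    prompt = sys.argv[1] if len(sys.argv) > 1 else "Rewrite this:"
    type_over_selection(generate(prompt, get_selection()))
```

The idea is that nerd-dictation would hand the spoken prompt to something like this script, so the whole loop (dictate, fetch selection, generate, overwrite) stays local.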