I'm considering getting myself a new computer to replace my #SteamDeck.
It seems the GPD Win 4 finally has decent #linux support via #ChimeraOS, and I saw someone on Reddit got the GSM module working, so I could potentially replace my phone with it. The smaller form factor would be great for when I'm walking around with my head-mounted display.
@skryking I defs don't think they're "smart" in the human sense. But the auto-complete on steroids can do some funky stuff if you set it up just right. It's like using language to program. Unlike programming, though, instead of doing exactly what you want, it does sorta kinda what you might want. It's amazing that it's more than useless though.
@staltz @nonlinear @viticci @mixmix Hell yes. Been waiting for this to materialize for a while. V excited to fork Firefox or Chromium or Webkit to make an iOS version of @agregore
@skryking For me it was more that I can finally make this stuff work-related and potentially find clients to pay me to mess with it. :P Sadly, my hand pain makes computer touching less appealing off the clock.
@skryking Nice. I've been wanting to get into Rust for years but didn't have much of a use case. Now with the candle library from HuggingFace and my latest adventures with LLMs I've had an actual reason to write something in it. :) https://github.com/huggingface/candle/
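For anyone curious, candle's "hello world" is roughly this, a minimal sketch based on the README at the link above (exact API details may shift between versions):

```rust
// Minimal candle sketch: multiply two random tensors on the CPU.
// Assumes `candle-core` is in Cargo.toml; the API may differ across versions.
use candle_core::{Device, Tensor};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let device = Device::Cpu;

    // Two random matrices, 2x3 and 3x4.
    let a = Tensor::randn(0f32, 1.0, (2, 3), &device)?;
    let b = Tensor::randn(0f32, 1.0, (3, 4), &device)?;

    // Matrix multiply and print the resulting 2x4 tensor.
    let c = a.matmul(&b)?;
    println!("{c}");
    Ok(())
}
```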
@lutindiscret I'm putting together a Matrix one here: https://matrix.to/#/#userless-agents:mauve.moe
@skryking This post by @simon is what exposed me to the idea for the first time: https://til.simonwillison.net/llms/python-react-pattern
I also have a slightly improved prompt here: https://gist.github.com/RangerMauve/19be7dca9ced8e1095ed2e00608ded5e
I'll likely be publishing any new work as open source on Github. :) Probably with Rust.
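To give a sense of what that ReAct pattern looks like in code, here's a rough Rust sketch of the loop. The `call_llm` stub and the `lookup` tool are placeholders I made up; the real thing would talk to an actual local model:

```rust
// Rough sketch of a ReAct-style loop: Thought -> Action -> Observation -> repeat.
// `call_llm` and the tool names here are placeholders, not a real API.

fn call_llm(transcript: &str) -> String {
    // Placeholder for a real model call; answers once it has seen an observation.
    if transcript.contains("Observation:") {
        "Answer: Local-first apps keep your data on your device and sync peer to peer.".to_string()
    } else {
        "Thought: I should look that up.\nAction: lookup: local-first software".to_string()
    }
}

fn run_tool(name: &str, input: &str) -> String {
    // Placeholder tools.
    match name {
        "lookup" => format!("(pretend search results for '{input}')"),
        _ => format!("unknown tool: {name}"),
    }
}

fn main() {
    let mut transcript = String::from("Question: What is a local-first app?\n");

    for _ in 0..5 {
        let reply = call_llm(&transcript);
        transcript.push_str(&reply);
        transcript.push('\n');

        // Stop once the model produces a final answer.
        if reply.contains("Answer:") {
            break;
        }

        // Look for an "Action: tool: input" line and run the tool.
        if let Some(action) = reply.lines().find(|l| l.starts_with("Action:")) {
            let rest = action.trim_start_matches("Action:").trim();
            if let Some((tool, input)) = rest.split_once(':') {
                let observation = run_tool(tool.trim(), input.trim());
                transcript.push_str(&format!("Observation: {observation}\n"));
            }
        }
    }

    println!("{transcript}");
}
```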
@jonny I liked Dunkey's coverage of the gameplay :P https://www.youtube.com/watch?v=buRSN13jH3E
@jonny Are you gonna play the pokemon with guns game?
@skryking it has less innate knowledge of facts but it is pretty good at "reasoning". I'm gonna teach it to make function calls and traverse datasets + summarize stuff. 😁
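The "traverse datasets + summarize" part is basically map-reduce over chunks. Something like this sketch, where `summarize` is just a stand-in for a real model call:

```rust
// Sketch of map-reduce summarization: summarize each record, then summarize the summaries.
// `summarize` is a stand-in for an actual LLM call.

fn summarize(text: &str) -> String {
    // Placeholder: a real version would prompt a local model with the text.
    format!("summary({} chars)", text.len())
}

fn main() {
    let dataset = vec![
        "first document in the dataset...",
        "second document in the dataset...",
        "third document in the dataset...",
    ];

    // Map: summarize each record individually.
    let partials: Vec<String> = dataset.iter().map(|doc| summarize(doc)).collect();

    // Reduce: summarize the concatenated partial summaries.
    let combined = summarize(&partials.join("\n"));

    println!("{combined}");
}
```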
@skryking Nice. I only do CPU workloads. Try running Phi 2 some time! It's super low in resource usage. Particularly the Q4 quantized models.
@laskov Oh yeah, I read their release but haven't used it yet. Was there anything specific they excelled at?
@staltz I've already got the programming socks. :P
@skryking What have you been using to run the models? I find LM Studio really nice for tinkering. https://lmstudio.ai/
I find Q4 quantized models work pretty well on my steam deck.
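If it helps, LM Studio can also expose a local OpenAI-compatible server (by default on http://localhost:1234/v1), so you can hit it from code. Here's a rough Rust sketch; it assumes reqwest (with the "blocking" and "json" features) and serde_json in Cargo.toml:

```rust
// Rough sketch: calling LM Studio's local OpenAI-compatible server from Rust.
// Assumes the server is running on the default port and that Cargo.toml has
// reqwest (with "blocking" + "json" features) and serde_json.
use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = reqwest::blocking::Client::new();

    let body = json!({
        "model": "local-model", // LM Studio serves whatever model is loaded
        "messages": [
            { "role": "user", "content": "Say hi in five words." }
        ],
        "temperature": 0.7
    });

    // Default LM Studio endpoint; change it if you've configured a different port.
    let resp: serde_json::Value = client
        .post("http://localhost:1234/v1/chat/completions")
        .json(&body)
        .send()?
        .json()?;

    // Pull the assistant's reply out of the OpenAI-style response.
    let reply = resp["choices"][0]["message"]["content"]
        .as_str()
        .unwrap_or("(no reply)");
    println!("{reply}");
    Ok(())
}
```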
Hmm, after testing the raw Phi 2 within LM Studio instead of the examples provided by HuggingFace candle, I think it's actually pretty decent after all.
Specifically, I got the Phi 2 Q4_K_S GGUF from TheBloke working.
Can't get that model running with candle, since it can't seem to load the model weights.
Occult Enby that's making local-first software with peer to peer protocols, mesh networks, and the web.
Exploring what a local-first cyberspace might look like in my spare time.