@jonmsterling Hard agree regarding just getting the prompt
@atax1a That's reasonable! I don't mean to say that everyone should do as I want. Only that I appreciate when folks CW stuff.
@atax1a I love it when people CW US pol so I can get a break from the constant doom posting if I want. Setting up word filters for all the various actors and euphemisms is exhausting. I'd rather not have to block/mute people entirely
Sometimes I almost miss having a regular laptop but it's been like a decade since those days and the controller setup suits me way better
hot take, burning hot take, like, not to you who probably agree but to the companies who disagree: not only should it be easy and legal to jailbreak all your devices ... the "jail" shouldn't even exist, you should be able to install and uninstall any software without jailbreaking the device
this includes phones, tablets, consoles, smart appliances, and obviously computers
@burnitdown @xgranade Also for repairability and upgrades the @frameworkcomputer computers are decently priced and so easy to use https://frame.work/ca/en/laptop13
@burnitdown @xgranade @system76 Less than a new MacBook, but I agree it'd be better if folks reused hardware. I think businesses reusing hardware would have a bigger impact than household consumers.
It is strange to read an article detailing exactly why and how Microsoft has lost its way, leading to products that really just aren't good anymore, all the way through to academic critiques of Microsoft's AI push (though, I'll note, the article goes out of its way not to mention Emily Bender)... only to read in the closing lines that the author is completely unwilling to even try Linux.
@burnitdown @xgranade IMO the safest path for regular users is to buy a preloaded computer from one of the many vendors out there like @system76 No need to do anything fancy and it comes preloaded with an approachable desktop environment.
Thinking of ditching Ollama and running llama.cpp directly in a systemd user service. Ollama is great for tinkering, but it adds overhead and can be slower to adopt new llama.cpp features. Thanks to Arch Linux it's easy to build it straight from GitHub using this AUR package.
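Roughly what I have in mind for the unit file, as a sketch: I'm assuming the AUR package drops a llama-server binary in /usr/bin, and the model path and flags here are placeholders to swap for your own setup.

```ini
# ~/.config/systemd/user/llama-server.service
[Unit]
Description=llama.cpp server (user service)

[Service]
# %h expands to the user's home directory; model path and flags are placeholders.
ExecStart=/usr/bin/llama-server --model %h/models/model.gguf --host 127.0.0.1 --port 8080
Restart=on-failure

[Install]
WantedBy=default.target
```

Then `systemctl --user daemon-reload` and `systemctl --user enable --now llama-server.service` should keep it running across logins without any Ollama layer in between.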
@django Yeah 😅 Been finding them all over the place after we made a core part of the codebase that used to be sync into an async worker. Glad we have tests to catch this stuff!
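For the curious, this is the classic shape of the bug that sort of conversion surfaces. A hypothetical Python/asyncio sketch (not our actual code, and the post doesn't name the stack): a caller that forgets the await gets a coroutine object back instead of the value, which a decent test catches immediately.

```python
import asyncio

async def fetch_count() -> int:
    # Formerly a plain sync function; now a coroutine after the async rework.
    await asyncio.sleep(0)  # stand-in for real async I/O
    return 42

async def report_buggy() -> str:
    count = fetch_count()  # BUG: missing await, count is a coroutine object
    return f"count is {count}"

async def report_fixed() -> str:
    count = await fetch_count()  # the await the failing test forces you to add
    return f"count is {count}"

print(asyncio.run(report_buggy()))  # count is <coroutine object fetch_count at ...>
print(asyncio.run(report_fixed()))  # count is 42
```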
These lil digivice things seem perfect for my local assistant tinkering. Maybe once I get better hardware for inference or get streaming working locally.
https://www.seeedstudio.com/SenseCAP-Watcher-W1-A-p-5979.html
Occult Enby that's making local-first software with peer-to-peer protocols, mesh networks, and the web.
Yap with me and send me cool links relating to my interests. 👍