I think my philosophy when making software is that it should work for people with zero money and no bank account or credit card.
I know it's not a popular mindset to be in since money and profit are everything in the tech world.
I think it comes from growing up as a kid with no disposable income or access to anything but my shitty computer.
I'd rather support people with almost nothing than people with latest and greatest tech gizmos and spare cash for subscription services. 😅
Lmao the AI in The Amazing Digital Circus is canonically written in #lisp
I saw a post recently wherein someone used LLM tools to analyze someone else’s software, which eventually led them to a conclusion that was completely wrong. Not only that, the LLM drew conclusions about the *authors* behind the code that were borderline character assassination. Nevertheless, this person posted this output as though it were some kind of deep insight.
These LLM outputs are not independent thoughts. The LLM probably ingested hints of (maybe unconscious) biases in the user’s prompts within its context window, and regurgitated something that confirmed those biases. The user was pleased that their biases were confirmed (Independently! By an impartial LLM!), and they posted the output, maybe as vindication of their insight.
These models’ sycophancy can be subtle. They don’t have to state “You’re absolutely right!” to blow smoke up your ass. Sometimes they seem to confirm your preconceived notion after they supposedly “evaluate” information “independently”.
How (racist/sexist/whatever) harassment on Mastodon works:
1. Harasser replies to their target's post, with the reply set to "followers only", saying the most vile stuff you can imagine.
2. All the harasser's followers join in on the harassment, posting more vile stuff.
3. Nobody but the target and the harassment crew can see the vile stuff that was said.
4. Target is traumatized. Nobody else can see why.
5. Everybody says "I don't see it so it's not happening."
@ai6yr "The major cost of most corporations is labor."
*HARD* disagree.
The cost that they perceive as "disposable" is labor. The highest actual cost, by far, is the people at the top, most of whom do practically nothing (or even make willfully bad decisions, like integrating fake "AI" into systems that shouldn't be using an LLM and will ultimately cause a huge loss later) while being paid many, many millions...
And those are the positions that are most disposable, and the ones an LLM could fake best...
WHAT
ARE YOU INSANE, GOOGLE
CAN WE PLEASE START SUPPORTING ALTERNATIVES LIKE SAILFISH OS PLEASE
#google #android #privacy #opensource #sailfishos #jollaphone #enshittification
Occult Enby that's making local-first software with peer to peer protocols, mesh networks, and the web.
Yap with me and send me cool links relating to my interests. 👍