Where does it get the URLs from?
Some of the URLs must be coming from external sources, because I get requests like /tales/27-years-of-linux/ringe/disobligingly/agronomy/downweigh/toxicohemia/ultrarefined/. The first two parts are valid, real pages, but I never served that URL to this particular bot. So it must have seen a reference to it elsewhere and entered the maze there. Most crawled URLs have parts made up from my wordlist, so... I guess those are coming from previous scans.
In any case, this strongly suggests that at least some of the bots keep track of URLs seen, and that's going to be very wasteful for them.
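The post describes a tarpit that spins endless URL paths out of a wordlist. A minimal sketch of how such a maze could work, statelessly (the wordlist, fanout, and hashing scheme here are my own assumptions, not the actual implementation):

```python
import hashlib

# Hypothetical wordlist; a real maze would use a much larger one.
WORDLIST = ["ringe", "disobligingly", "agronomy", "downweigh",
            "toxicohemia", "ultrarefined", "chandlery", "oscitant"]

def child_links(path, fanout=3):
    """Derive `fanout` deterministic child paths from the current path.
    Hashing the path means the same URL always yields the same links,
    so the server needs no stored state -- but a crawler that remembers
    every URL it has seen burns memory on an effectively infinite site."""
    links = []
    for i in range(fanout):
        digest = hashlib.sha256(f"{path}#{i}".encode()).digest()
        word = WORDLIST[digest[0] % len(WORDLIST)]
        links.append(f"{path.rstrip('/')}/{word}/")
    return links

print(child_links("/tales/27-years-of-linux/"))
```

Because the links are a pure function of the path, the cost asymmetry favors the defender: the server does one hash per link, while a deduplicating crawler must track an ever-growing URL set.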
Another win!
Wo Fie by Angel Maxine is still fire.
1. Guarantee a right to repair for appliances by requiring manufacturers to provide manuals and replacement parts;
2. Establish a 15% tax credit for appliance repairs, up to a maximum of $500 per year;
3. Inform consumers about the environmental impacts of household appliances so that they can make informed choices; and
4. Modify Canada’s Copyright Act to remove the legal barriers to repairing digital devices.
Petition e-5245 - Petitions
https://www.ourcommons.ca/petitions/en/Petition/Details?Petition=e-5245
typical dev: i'm gonna wire it up. dynamo db to ec2 to lambdas to nextjs to react to redux to graphql, and it'll only cost me $30+/mo to host this website
me: *living in the walls of michaelsoft, publishing web apps that cost nothing to run and nothing to use, because github forgot that github pages exists*
Attention Citizens:
The concept of "time off" is misleading.
What you think of as leisure is, in fact, mandatory recreation for overall increased productivity.
Failure to achieve maximum relaxation during relaxation hours is considered treasonous laziness.
Thank you for your compulsory leisure.
#Paranoia #TTRPG #games #relaxation #mandatoryPTO #lazy #rpg #PSA
Be suspicious of anything that requires a cloud service to operate where the server software is not self-hostable. Especially one that costs over 1000 dollarydoos.
I run into trouble every time I try to use generative AI for anything beyond generating a vague outline, summarizing some content, or translating text from one format to another. I just don't think it'll save us from doing the thinking and processing that humans are best suited for. Otherwise, outside a few small tasks, you'd spend more time trying to direct it than you would doing it yourself (assuming you could do it yourself, I guess?).
Maybe that's a good thing though?
Ok, a question for #database #dbms nerds. Many years ago I worked on a join order optimizer that used a heuristic formula to estimate the number of rows produced by an equijoin, based on the number of rows in the two tables being joined and the number of distinct values in the join keys. This formula came from an academic paper, but I can't remember what the formula was or find the paper. Does this ring a bell for anyone? Boosts appreciated...
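For what it's worth, the estimate matching this description in standard query-optimization literature (it may or may not be the formula from the paper in question) is the containment-of-value-sets rule, as used in System R-style optimizers:

```python
def estimate_join_rows(n_r, n_s, v_r, v_s):
    """Classic equijoin cardinality estimate:
        |R join S| ~ |R| * |S| / max(V(R, a), V(S, a))
    where V(T, a) is the number of distinct values of join key a in T.
    Assumes the smaller set of key values is contained in the larger,
    and that values are uniformly distributed."""
    return (n_r * n_s) / max(v_r, v_s)

# e.g. a 10,000-row table joined with a 2,000-row table,
# with 100 and 50 distinct join-key values respectively:
print(estimate_join_rows(10_000, 2_000, 100, 50))  # → 200000.0
```

The `max` in the denominator is the key feature to search for; it comes from assuming each row in the table with fewer distinct values matches n/V rows on the other side.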
Allcaps
STOP DOING UPDATES
• SOFTWARE WAS NOT SUPPOSED TO CHANGE
• YEARS OF PROGRAMMING yet NO REAL-WORLD USE FOUND for PATCHES
• Wanted your software to change anyway for a laugh? We had a tool for that: It was called "INSTALLING SOMETHING NEW"
• "Yes please give me A DIFFERENT UI for no reason. Please LOSE ALL MY SETTINGS and re-enable all the telemetry." - Statements dreamed up by the utterly Deranged
Occult Enby who's making local-first software with peer-to-peer protocols, mesh networks, and the web.
Exploring what a local-first cyberspace might look like in my spare time.