Question for y'all. We all know AI scrapers are really fuckin' grody and will hammer your site from a huge range of IP addresses, like some kind of malware.

I know there are projects out there to trap AI bots attempting to scrape a site, but do any of you know if there's a way to allowlist particular user-agents, like, say, the Internet Archive?

I can probably disallow access to the directory on a per-user-agent basis. Hm...
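Something like this in robots.txt, maybe (rough sketch, not tested: "archive.org_bot" is the Wayback Machine's crawler, and "/my-directory/" is just a stand-in for whatever path you're protecting):

    # Wayback Machine crawler: full access (empty Disallow = allow everything)
    User-agent: archive.org_bot
    Disallow:

    # Everyone else stays out of the directory
    User-agent: *
    Disallow: /my-directory/

Crawlers are supposed to obey the most specific matching User-agent group, so the archive bot skips the * rule. Of course robots.txt only works on bots polite enough to read it; the ones hammering from a zillion IPs won't care, which is what those trap projects are for.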
