Message-ID: <67e477e8@news.ausics.net>
From: not@telling.you.invalid (Computer Nerd Kev)
Subject: Re: bad bot behavior
Newsgroups: comp.misc
References: <20250318182006.00006ae3@dne3.net>
User-Agent: tin/2.0.1-20111224 ("Achenvoir") (UNIX) (Linux/2.4.31 (i586))
NNTP-Posting-Host: news.ausics.net
Date: 27 Mar 2025 07:55:52 +1000
Organization: Ausics - https://newsgroups.ausics.net
Lines: 24
X-Complaints: abuse@ausics.net
Path: ...!weretis.net!feeder9.news.weretis.net!news.bbs.nz!news.ausics.net!not-for-mail
Bytes: 1818

D Finnigan wrote:
> https://tech.slashdot.org/story/25/03/26/016244/open-source-devs-say-ai-crawlers-dominate-traffic-forcing-blocks-on-entire-countries
>
> They're abusing everyone. If you have a web site, don't allow it to be
> abused this way.

On the other hand, I run websites on the cheapest VPSs available and
they have no load problems, even without any robots.txt rules to block
bots, let alone active blocking.

Yet solutions like the "Anubis" proof-of-work thing mentioned in the
link require Javascript, which blocks the JS-less web browsers I like
to use for browsing other people's websites (and Firefox with NoScript
too, unless I decide to allow the random JS).

So basically those websites are making their slow server-side code
_my_ problem by forcing me to pass bot-tests which I fail (sometimes
in Firefox even with NoScript disabled!). Don't abuse _users_ that
way just to block bots from your too-slow website!

-- 
__          __
#_ < |\| |< _#