Path: ...!eternal-september.org!feeder3.eternal-september.org!news.eternal-september.org!.POSTED!not-for-mail
From: legg
Newsgroups: sci.electronics.design
Subject: Re: Chinese downloads overloading my website
Date: Sun, 10 Mar 2024 13:47:48 -0400
Organization: A noiseless patient Spider
Lines: 77
Message-ID:
References: <7qujui58fjds1isls4ohpcnp5d7dt20ggk@4ax.com> <6lekuihu1heui4th3ogtnqk9ph8msobmj3@4ax.com>
MIME-Version: 1.0
Content-Type: text/plain; charset=us-ascii
Content-Transfer-Encoding: 7bit
Injection-Info: dont-email.me; posting-host="6b3a8df777a17fe1b6174ba9839775e0"; logging-data="3250460"; mail-complaints-to="abuse@eternal-september.org"; posting-account="U2FsdGVkX19ne2qnvsqYeOrS0gLHFH73"
Cancel-Lock: sha1:hVQEJifbEM4EL4Lbfaou0aI9x/c=
X-Newsreader: Forte Agent 4.2/32.1118
Bytes: 3864

On Sun, 10 Mar 2024 06:08:15 GMT, Jan Panteltje wrote:

>On a sunny day (Sat, 09 Mar 2024 20:59:19 -0500) it happened legg
> wrote in :
>
>>On Fri, 08 Mar 2024 06:43:49 GMT, Jan Panteltje
>>wrote:
>>
>>>On a sunny day (Thu, 07 Mar 2024 17:12:27 -0500) it happened legg
>>> wrote in <6lekuihu1heui4th3ogtnqk9ph8msobmj3@4ax.com>:
>>>
>>>>A quick response from the ISP says they're blocking
>>>>the three hosts and 'monitoring the situation'.
>>>>
>>>>All the downloading was occurring between certain
>>>>hours of the day, in sequence - first one host
>>>>between 11 and 12 pm, one day's rest, then the
>>>>second host at the same time on the third day,
>>>>then the third host on the fourth day.
>>>>
>>>>Same files, 262 times each, 17 GB each.
>>>>
>>>>Not normal web activity, as I know it.
>>>>
>>>>RL
>>>
>>>Many sites have an 'I'm not a bot' sort of thing you have to go through to get access.
>>
>>Any idea what's involved - preferably anything that doesn't
>>owe to Google?
>>...
>>I'd like to limit traffic data volume by any host to <500 MB,
>>or <50 MB in 24 hrs. It's all FTP.
>
>I no longer run an ftp server (for many years now),
>the old one here needed a password.
>Some parts of my website used to be password protected.
>When I ask google for "how to add a captcha to your website"
>I see many solutions, for example this:
> https://www.oodlestechnologies.com/blogs/create-a-captcha-validation-in-html-and-javascript/
>
>Maybe some html guru here knows?

That looks like it's good for protecting access to an html page. So far
the Chinese hosts are accessing the top-level index, where files are
offered for download at a click. Ideally, if they can't access the top
level, direct address access to the files might be prevented?

The website's down after a fifth excursion pushed volumes above 85 GB
on a 70 GB temporary extension. What's the bet it was 17 GB accumulated
in 262 'visits'. Can't ID that final host's IP address while I'm locked
out.

Luckily (~) for users, you can still access most of the useful files,
updated in January 2024, through the Wayback Machine.

https://web.archive.org/web/20240000000000*/http://www.ve3ute.ca/

Probably the best place for it, in some people's opinion, anyways.

YOU can make stuff available to others, in the future, by 'suggesting'
relevant site addresses to the Internet Archive, if they're not already
being covered.

Once a 'captcha' or other security device is added, you can kiss
Wayback updates goodbye, as most bots will get the message. I don't
mind bots - they can do good work.

Pity you can't just put stuff up in the public domain without this
kind of bullshit.

RL
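[The per-host volume cap asked about up-thread (<500 MB, or <50 MB in
24 hrs) is the kind of thing a server-side hook could enforce. A minimal
sketch in Python, assuming you can intercept each completed transfer -
the names LIMIT_BYTES and record_transfer are illustrative only, not any
real FTP or web server's API:]

```python
import time
from collections import defaultdict

LIMIT_BYTES = 500 * 1024 * 1024   # assumed cap: 500 MB per host per 24 h
WINDOW = 24 * 60 * 60             # rolling window, in seconds

# host -> list of (timestamp, bytes) records within the window
_transfers = defaultdict(list)

def record_transfer(host, nbytes, now=None):
    """Log one transfer; return True if the host is still under its cap."""
    now = time.time() if now is None else now
    _transfers[host].append((now, nbytes))
    # Keep only records from the last 24 hours.
    cutoff = now - WINDOW
    _transfers[host] = [(t, b) for t, b in _transfers[host] if t >= cutoff]
    total = sum(b for _, b in _transfers[host])
    return total <= LIMIT_BYTES
```

[A host that pulled 17 GB in 262 visits would trip this on roughly its
eighth 65 MB download; the caller would then drop or refuse the
connection. Real servers usually do this in config rather than code -
e.g. per-IP rate limits in the server or firewall - so this is only a
sketch of the bookkeeping involved.]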
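[On 'suggesting' addresses to the Internet Archive: the archive exposes
a "Save Page Now" endpoint of the form https://web.archive.org/save/<url>.
A small Python helper that builds such a request URL - fetching it is
left to the reader, and the helper name is made up for illustration:]

```python
from urllib.parse import quote

def save_page_now_url(target):
    """Build the Wayback Machine 'Save Page Now' URL for a target page."""
    # Keep ':' and '/' literal so the embedded URL stays readable.
    return "https://web.archive.org/save/" + quote(target, safe=":/")

# e.g. save_page_now_url("http://www.ve3ute.ca/")
#   -> "https://web.archive.org/save/http://www.ve3ute.ca/"
```

[Visiting the resulting URL in a browser, or fetching it with any HTTP
client, asks the archive to capture a fresh snapshot of the page.]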