
Path: ...!Xl.tags.giganews.com!local-1.nntp.ord.giganews.com!news.giganews.com.POSTED!not-for-mail
NNTP-Posting-Date: Wed, 15 May 2024 20:05:11 +0000
Subject: Re: Are these the only 3 known updated Usenet archives left?
Newsgroups: news.admin.peering
References: <v10bch$1pqq$1@nnrp.usenet.blueworldhosting.com>
 <66409b65$0$24768$882e4bbb@reader.netnews.com>
 <v216ke$j4d$1@nnrp.usenet.blueworldhosting.com>
From: Ross Finlayson <ross.a.finlayson@gmail.com>
Date: Wed, 15 May 2024 13:05:13 -0700
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:38.0) Gecko/20100101
 Thunderbird/38.6.0
MIME-Version: 1.0
In-Reply-To: <v216ke$j4d$1@nnrp.usenet.blueworldhosting.com>
Content-Type: text/plain; charset=utf-8; format=flowed
Content-Transfer-Encoding: 7bit
Message-ID: <ctucnePm--3qiNj7nZ2dnZfqnPWdnZ2d@giganews.com>
Lines: 74
X-Usenet-Provider: http://www.giganews.com
X-Trace: sv3-eUr4VBgHg4DQaFv0XXkBCWrSEPjAOH6llNd5EIH0WRW1P4laZ2C/Up2d/txYo19LX/ThTY4Xuipl190!zXv4K9Mq2nzbPEaHUem00FE1kTNZ+N1TAY0wtV6D1El4HZf0NVRc+Pk8xK4AGZIb7JaXjsVii3rE
X-Complaints-To: abuse@giganews.com
X-DMCA-Notifications: http://www.giganews.com/info/dmca.html
X-Abuse-and-DMCA-Info: Please be sure to forward a copy of ALL headers
X-Abuse-and-DMCA-Info: Otherwise we will be unable to process your complaint properly
X-Postfilter: 1.3.40
Bytes: 4363

On 05/14/2024 07:28 PM, Andrew wrote:
> Billy G. (go-while) wrote on Sun, 12 May 2024 13:23:47 +0200 :
>
>> On 02.05.24 17:27, Andrew wrote:
>>> Are these the only 3 known updated Usenet archives left?
>>>
>>>    <https://news.admin.peering.narkive.com/>
>>>    <https://cmacleod.me.uk/ng/news.admin.peering>
>>>    <https://www.novabbs.com/computers/search.php?group=news.admin.peering>
>>>    <https://groups.google.com/g/news.admin.peering> No updates after 22Feb24
>>
>> i've imported and deduped all available usenet backups/mbox files from
>> archive.org (several TB compressed) without any filtering and accepted
>> every group that showed up which results in 471k groups so far at the
>> Full-Node.
>>
>> 'news.software.nntp' for example dates back to 1987.
>>
>> it should be readable via NNTP
>>
>> Part-Node (111k groups)
>> file: http://104.244.74.85/usenet/active/part.active.txt
>> host: 104.244.74.85:11119
>> user: freefree
>> pass: freefree
>>
>> Full-Node (471k groups)
>> file: http://104.244.74.85/usenet/active/full.active.txt
>> host: 104.244.74.85:11120
>> user: freefree
>> pass: freefree
>
> Thanks for pitching in to help the team as the one thing Google Groups was
> good for was that it was an updated web based search engine which didn't
> require special tools, retention rules, accounts, or a newsreader.
>
> All you needed was a browser and anyone could be sent a URL which they
> could read on their browser even if they couldn't even spell Usenet.
>
> However... I'm confused by your post.
>
> For a layperson such as I am, how would I use it to look up a post from,
> oh, say, yesterday in news.admin.peering?
>

I'm curious whether the HTTP source mentioned supports range requests,
and whether a HEAD request for those resources also returns the size.
The plan would be to make a HEAD request to get the size, then download
these files, which should be constant by now, in batches of range
requests and so on.
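The HEAD-then-ranges plan above might be sketched like this (a minimal Python sketch, assuming the server sets Content-Length and answers `Accept-Ranges: bytes`; the URL is the part-node active file from the quoted post, and the 1 MiB chunk size is an arbitrary choice):

```python
# Sketch: HEAD a resource to learn its size, then fetch it in
# fixed-size Range requests. Assumes the server honors Range;
# the URL is from the quoted post.
import urllib.request

URL = "http://104.244.74.85/usenet/active/part.active.txt"
CHUNK = 1 << 20  # 1 MiB per range request

def head_size(url):
    """HEAD the resource; return (size in bytes, whether ranges are accepted)."""
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        size = int(resp.headers.get("Content-Length", 0))
        ranged = resp.headers.get("Accept-Ranges", "") == "bytes"
    return size, ranged

def range_headers(size, chunk=CHUNK):
    """Plan the batches: one 'bytes=start-end' header value per chunk."""
    return [f"bytes={s}-{min(s + chunk, size) - 1}"
            for s in range(0, size, chunk)]

def fetch_ranged(url, size, chunk=CHUNK):
    """Download the resource chunk by chunk and reassemble it."""
    parts = []
    for rng in range_headers(size, chunk):
        req = urllib.request.Request(url, headers={"Range": rng})
        with urllib.request.urlopen(req) as resp:
            parts.append(resp.read())
    return b"".join(parts)
```

The same loop would apply to the large archive files themselves, which is the point of planning the batches from the HEAD response first.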

It would be very much appreciated if these are indeed archives of
pretty much all of text Usenet since the land before time.


As for how to search these: you break them out into whatever makes
for a summary of the files, a pyramidal sort of organization, then
generate summaries, which these days usually means the
"inverse-document-frequency" pattern as much as anything, alongside
summaries and links to document IDs, so one can search or query for
documents of a kind and get back the Message-IDs of the relevant
documents, or "hits".
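In miniature, the inverse-document-frequency pattern gestured at above might look like the following (a hypothetical Python sketch, not anything the archive actually implements: a toy inverted index over article bodies keyed by Message-ID, scored with tf-idf):

```python
# Sketch: a tiny inverted index over Usenet articles keyed by
# Message-ID, with tf-idf scoring of queries. Illustrative only.
import math
from collections import Counter, defaultdict

def tokenize(text):
    return [w.lower() for w in text.split()]

def build_index(docs):
    """docs: {message_id: body text}. Returns (inverted index, idf table)."""
    index = defaultdict(dict)  # term -> {message_id: term frequency}
    for mid, body in docs.items():
        for term, n in Counter(tokenize(body)).items():
            index[term][mid] = n
    n_docs = len(docs)
    # Rarer terms carry more weight: idf = log(N / document frequency).
    idf = {t: math.log(n_docs / len(postings)) for t, postings in index.items()}
    return index, idf

def search(query, index, idf):
    """Score each Message-ID by summed tf-idf over the query terms;
    return the hits, best first."""
    scores = defaultdict(float)
    for term in tokenize(query):
        for mid, tf in index.get(term, {}).items():
            scores[mid] += tf * idf.get(term, 0.0)
    return sorted(scores, key=scores.get, reverse=True)
```

The "pyramidal" part would then be building such indexes per file or per group and merging their summaries upward, so a query touches only the levels it needs.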

If these really amount to something like "Archive Any and All Text
Usenet", that would be pretty great, together with the Internet
Archive holdings and the other archives mentioned over the past few
months since Google quit Usenet (one imagines it was a bit too
interesting to its latest/greatest knowledge gobble).


Well then warm regards and I shall so hope that such a resource,
as this portends to be, finds a good and fair usage. If so, good show.