Path: ...!weretis.net!feeder9.news.weretis.net!news.nk.ca!rocksolid2!i2pn2.org!.POSTED!not-for-mail
From: fir
Newsgroups: comp.lang.c
Subject: Re: program to remove duplicates
Date: Sun, 22 Sep 2024 14:46:12 +0200
Organization: i2pn2 (i2pn.org)
Message-ID: <66F01194.5030706@grunge.pl>
References: <8630bec343aec589a6cdc42bb19dae28120ceabf@i2pn2.org> <66EF8293.30803@grunge.pl> <66EFF046.8010709@grunge.pl>
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 7bit
Injection-Info: i2pn2.org;
logging-data="3000139"; mail-complaints-to="usenet@i2pn2.org";
posting-account="+ydHcGjgSeBt3Wz3WTfKefUptpAWaXduqfw5xdfsuS0";
User-Agent: Mozilla/5.0 (Windows NT 5.1; rv:27.0) Gecko/20100101 Firefox/27.0 SeaMonkey/2.24
To: Bart
In-Reply-To:
X-Spam-Checker-Version: SpamAssassin 4.0.0
Bytes: 2649
Lines: 34
Bart wrote:
> On 22/09/2024 11:24, fir wrote:
>> Paul wrote:
>
>>> The normal way to do this, is do a hash check on the
>>> files and compare the hash. You can use MD5SUM, SHA1SUM, SHA256SUM,
>>> as a means to compare two files. If you want to be picky about
>>> it, stick with SHA256SUM.
>
>
>> the code i posted works ok, and if someone has windows and mingw/tdm
>> they may compile it and check the application if they want
>>
>> hashing is not necessary imo, though it probably could speed things up -
>> im not strongly convinced that the probability of a mistake in this
>> hashing is strictly zero (as i have never used it and would probably
>> need to write my own hashing).. probably it is mathematically proven
>> to be almost zero, but for now at least it is more interesting for me
>> whether the code i posted is ok
>
> I was going to post similar ideas (doing a linear pass working out
> checksums for each file, sorting the list by checksum and size, then
> candidates for a byte-by-byte comparison, if you want to do that, will
> be grouped together).
>
> But if you're going to reject everyone's suggestions in favour of your
> own already working solution, then I wonder why you bothered posting.
>
> (I didn't post after all because I knew it would be futile.)
>
>
i want to discuss, not to do everything that is mentioned .. is that hard
to understand? so i may read up on the options, but i literally have no
time to implement even the good ideas - this program i wrote has been
shown to work and im now using it