Article <v7rf50$56o$1@nnrp.usenet.blueworldhosting.com>

Path: ...!weretis.net!feeder9.news.weretis.net!usenet.blueworldhosting.com!diablo1.usenet.blueworldhosting.com!nnrp.usenet.blueworldhosting.com!.POSTED!not-for-mail
From: Andrew <andrew@spam.net>
Newsgroups: misc.phone.mobile.iphone
Subject: Re: Apple accused of underreporting suspected CSAM on its platforms
Date: Wed, 24 Jul 2024 17:54:41 -0000 (UTC)
Organization: BWH Usenet Archive (https://usenet.blueworldhosting.com)
Message-ID: <v7rf50$56o$1@nnrp.usenet.blueworldhosting.com>
References: <v7mup4$7vpf$1@solani.org> <v7o49q$16cpi$1@dont-email.me> <v7obdd$17fqi$2@dont-email.me> <v7olme$19gq3$1@dont-email.me> <lga95bF8qq0U4@mid.individual.net> <v7q92v$1l4i5$1@dont-email.me> <lgcl7oFbd6U1@mid.individual.net> <v7rda8$1r5n5$1@dont-email.me>
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Injection-Date: Wed, 24 Jul 2024 17:54:41 -0000 (UTC)
Injection-Info: nnrp.usenet.blueworldhosting.com;
	logging-data="5336"; mail-complaints-to="usenet@blueworldhosting.com"
User-Agent: NewsTap/5.5 (iPad)
Cancel-Lock: sha1:wwcdId4zbbp+BbDD1DbpxdHOuqI= sha256:8zRawhCGrWITXEbBv6td3vIbRwJr5ZACrcG4gdxr9Cs=
	sha1:18gbZtr+EaMeGJn2xPiS208GLpI= sha256:iVwxM3Q7trIwgToRkNeqd1QLAg6OvlWuAOny/1xe72Q=
X-Face: VQ}*Ueh[4uTOa]Md([|$jb%rw~ksq}bzqA;z-.*8JM`4+zL[`N\ORHCI80}]}$]$e5]/i#v  qdYsE`yh@ZL3L{H:So{yN)b=AZJtpaP98ch_4W}
Bytes: 2494
Lines: 28

Chris wrote on Wed, 24 Jul 2024 17:23:20 -0000 (UTC) :

>> Tell that to the people whose private photos are scanned and are falsely
>> accused of a crime they didn't commit because an imperfect algorithm got
>> it wrong.
> 
> It's already happening. Is it better that 30m images were used to violate
> god knows how many people currently or a better method where a far tinier
> amount which are highly enriched for true positive?

Chris,

You can't make that assessment without fabricating the percentage of
convictions, which, let's be clear, is the most important metric of all.

The people who reported all this CSAM bullshit *know* that the percentage
of convictions is the most important metric, without which everything is BS.

Given they didn't bother to report that metric, we must assume it's 0.

Hence, as far as we know, out of 30 million images reported, exactly zero
convictions resulted - which means every report was a false positive.
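To make the arithmetic behind that claim explicit, here is a minimal sketch
(Python) of the precision calculation the argument rests on. The 30 million
figure is from the thread above; the zero conviction count is the assumption
being made because no such number was reported, not a published statistic.

    # Sketch of the precision argument, under the stated assumptions.
    reports = 30_000_000   # images reported as suspected CSAM (figure from thread)
    convictions = 0        # assumed zero, since no conviction count was reported

    # Precision (positive predictive value): confirmed true positives / total reports.
    precision = convictions / reports if reports else 0.0
    false_positive_share = 1.0 - precision

    print(f"precision: {precision:.2%}")                        # 0.00%
    print(f"false-positive share: {false_positive_share:.2%}")  # 100.00%

Any nonzero conviction count would change those percentages, which is exactly
why reporting it matters.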

Think about that.

Everyone was harmed.
Nobody was protected.

It's a classic case of pure bullshit.