Article <v85qi7$iro$1@dont-email.me>


Path: ...!eternal-september.org!feeder3.eternal-september.org!news.eternal-september.org!.POSTED!not-for-mail
From: Chris <ithinkiam@gmail.com>
Newsgroups: misc.phone.mobile.iphone
Subject: Re: Apple accused of underreporting suspected CSAM on its platforms
Date: Sun, 28 Jul 2024 16:10:47 -0000 (UTC)
Organization: A noiseless patient Spider
Lines: 47
Message-ID: <v85qi7$iro$1@dont-email.me>
References: <v7mup4$7vpf$1@solani.org>
 <v7o49q$16cpi$1@dont-email.me>
 <v7obdd$17fqi$2@dont-email.me>
 <v7olme$19gq3$1@dont-email.me>
 <lga95bF8qq0U4@mid.individual.net>
 <v7q92v$1l4i5$1@dont-email.me>
 <lgcl7oFbd6U1@mid.individual.net>
 <v7rda8$1r5n5$1@dont-email.me>
 <v7rf50$56o$1@nnrp.usenet.blueworldhosting.com>
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Injection-Date: Sun, 28 Jul 2024 18:10:47 +0200 (CEST)
Injection-Info: dont-email.me; posting-host="12793de90456d67421d6281200f5c82d";
	logging-data="19320"; mail-complaints-to="abuse@eternal-september.org";	posting-account="U2FsdGVkX19HBCtdwVmkMWzXESH3aEntzb+CK6se5Ro="
User-Agent: NewsTap/5.5 (iPhone/iPod Touch)
Cancel-Lock: sha1:eSuwZRjRFj+HYZv781ONdSLJDTs=
	sha1:kI5vPjqZAB2HvKL3I0Q5Mcgj6ro=
Bytes: 3148

Andrew <andrew@spam.net> wrote:
> Chris wrote on Wed, 24 Jul 2024 17:23:20 -0000 (UTC) :
> 
>>> Tell that to the people whose private photos are scanned and are falsely
>>> accused of a crime they didn't commit because an imperfect algorithm got
>>> it wrong.
>> 
>> It's already happening. Is it better that 30m images were used to violate
>> god knows how many people's privacy, or a method where a far smaller
>> number of reports is highly enriched for true positives?
> 
> Chris,
> 
> You can't make that assessment without fabricating the percentage of
> convictions, which, let's be clear, is the most important metric of all.

It is the end game, obviously, but it is not the most important metric. As
you've been banging the drum about forever, people's privacy is important,
so we need to avoid as many false positives as possible. That is what Apple
tried to achieve with their novel method.

> The people who reported all this CSAM bullshit *know* that the percentage
> of convictions is the most important metric without which everything is BS.

The NSPCC are a UK charity and are not party to that data. They do not know
that information. 

> Given they didn't bother to report that metric, we must assume it's 0.

False assumption. We know there are many convictions, and almost all involve
digital images which have been collected and/or shared via the interwebs.

> Hence, as far as we know, out of 30 million images reported, exactly zero
> convictions resulted - which means every report was a false positive.
> 
> Think about that.

Apple did, and came up with a better solution. However, people like you, who
would never be impacted, grabbed their tinfoil hats and screamed "foul".

> Everyone was harmed.
> Nobody was protected.
> 
> It's a classic case of pure bullshit.

Just like you spout 99% of the time.