Article <v7rda8$1r5n5$1@dont-email.me>

Path: ...!weretis.net!feeder9.news.weretis.net!2.eu.feeder.erje.net!feeder.erje.net!eternal-september.org!feeder3.eternal-september.org!news.eternal-september.org!.POSTED!not-for-mail
From: Chris <ithinkiam@gmail.com>
Newsgroups: misc.phone.mobile.iphone
Subject: Re: Apple accused of underreporting suspected CSAM on its platforms
Date: Wed, 24 Jul 2024 17:23:20 -0000 (UTC)
Organization: A noiseless patient Spider
Lines: 37
Message-ID: <v7rda8$1r5n5$1@dont-email.me>
References: <v7mup4$7vpf$1@solani.org>
 <v7o49q$16cpi$1@dont-email.me>
 <v7obdd$17fqi$2@dont-email.me>
 <v7olme$19gq3$1@dont-email.me>
 <lga95bF8qq0U4@mid.individual.net>
 <v7q92v$1l4i5$1@dont-email.me>
 <lgcl7oFbd6U1@mid.individual.net>
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Injection-Date: Wed, 24 Jul 2024 19:23:21 +0200 (CEST)
Injection-Info: dont-email.me; posting-host="41645c29659ca468712d490fbdfbd01c";
	logging-data="1939173"; mail-complaints-to="abuse@eternal-september.org";	posting-account="U2FsdGVkX18eZ8/Ht6qWY09QCGPjXym8Wb3OJxMTmiA="
User-Agent: NewsTap/5.5 (iPhone/iPod Touch)
Cancel-Lock: sha1:ew1mEhZi47VOyPp9NdJ98B0cd3I=
	sha1:PsmS77zMaU126mT2P8R5YMu3iMo=
Bytes: 2399

Jolly Roger <jollyroger@pobox.com> wrote:
> On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
>> Jolly Roger <jollyroger@pobox.com> wrote:
>>> 
>>> Apple's proposal was to match personal photos with *known* CSAM
>>> images.
>> 
>> Correct. 
>> 
>>> It would do nothing to detect *new* CSAM images. 
>> 
>> Also correct. 
>> 
>>> And it could not prevent false positive matches. 
>> 
>> Incorrect.
> 
> Nope. I am correct. It absolutely could not prevent false matches. 
> 
>> It is designed to avoid false positives, although nothing is 100%
>> perfect. 
> 
> If it has even .1 % false matches, then someone's privacy has been
> violated.
> 
>>> Everyone on this planet should have a right to basic privacy. 
>> 
>> And they do.
> 
> Tell that to the people whose private photos are scanned and are falsely
> accused of a crime they didn't commit because an imperfect algorithm got
> it wrong.

It's already happening. Which is better: the status quo, where 30m images get
scanned and god knows how many people's privacy is violated, or a method where
a far smaller set of images, highly enriched for true positives, is ever
reviewed at all?
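
For what it's worth, the threshold design we keep arguing about is easy to
sketch. Below is a toy model in Python, not Apple's implementation: the real
system used NeuralHash (a perceptual hash) matched on-device via private set
intersection, whereas this sketch uses SHA-256 purely to stay self-contained,
and every name in it (KNOWN_HASHES, MATCH_THRESHOLD, should_review) is mine.
The reported threshold was around 30 matches before any human review.

import hashlib

# Stand-in database of hashes of *known* CSAM images (dummy value).
# The real system used NeuralHash, a perceptual hash robust to resizing
# and recompression; SHA-256 here only keeps the sketch runnable.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

MATCH_THRESHOLD = 30  # reported matches required before any human review

def image_hash(data: bytes) -> str:
    # Stand-in for a perceptual hash of the image bytes.
    return hashlib.sha256(data).hexdigest()

def count_matches(images: list[bytes]) -> int:
    # Only membership in the known-image set counts; nothing here tries
    # to detect *new* material, which is the limitation conceded above.
    return sum(1 for img in images if image_hash(img) in KNOWN_HASHES)

def should_review(images: list[bytes]) -> bool:
    # A single false match never flags an account; only crossing the
    # threshold does, which pushes the account-level false-positive
    # rate far below the per-image rate.
    return count_matches(images) >= MATCH_THRESHOLD

That threshold is the whole point: if each image has an independent
false-match probability p, the chance of an innocent library crossing t
matches shrinks roughly like p^t, which is how Apple arrived at its claimed
one-in-a-trillion-accounts-per-year figure. Nothing is 100% perfect, as I
said, but that is what "highly enriched for true positives" means.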