Path: ...!fu-berlin.de!uni-berlin.de!individual.net!not-for-mail
From: Jolly Roger <jollyroger@pobox.com>
Newsgroups: misc.phone.mobile.iphone
Subject: Re: Apple accused of underreporting suspected CSAM on its platforms
Date: 24 Jul 2024 15:47:08 GMT
Organization: People for the Ethical Treatment of Pirates
Lines: 63
Message-ID: <lgclvsFbd6U3@mid.individual.net>
References: <v7mup4$7vpf$1@solani.org> <lg8ea1Fa94U1@mid.individual.net>
 <xn0oonlp4azqw16000@reader443.eternal-september.org>
 <lga2k1F7uk8U1@mid.individual.net>
 <xn0oonrftb7hazk002@reader443.eternal-september.org>
 <v7olut$19iie$1@dont-email.me> <lga8vfF8qq0U3@mid.individual.net>
 <v7q9vj$1l9co$1@dont-email.me>
Mime-Version: 1.0
Content-Type: text/plain; charset=utf-8
Content-Transfer-Encoding: 8bit
X-Trace: individual.net t7GGWF8dHiC5fTXyRgNbZgMMX0agta3Mxf5cuJrFTgavjBoggT
Cancel-Lock: sha1:qrqscMZqcMtg9GSr69pEnLwqqeQ= sha256:TdAwQaqMCjjP3VBCZcGrrn19+723BNG/pCXEObfNCrc=
Mail-Copies-To: nobody
X-Face: _.g>n!a$f3/H3jA]>9pN55*5<`}Tud57>1<n@LQ!aZ7vLO_nWbK~@T'XIS0,oAJcU.qLM
 dk/j8Udo?O"o9B9Jyx+ez2:B<nx(k3EdHnTvB]'eoVaR495,Rv~/vPa[e^JI+^h5Zk*i`Q;ezqDW<
 ZFs6kmAJWZjOH\8[$$7jm,Ogw3C_%QM'|H6nygNGhhl+@}n30Nz(^vWo@h>Y%b|b-Y~()~\t,LZ3e
 up1/bO{=-)
User-Agent: slrn/1.0.3 (Darwin)
Bytes: 4017

On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
> Jolly Roger <jollyroger@pobox.com> wrote:
>> 
>> True. Unlike others, Apple's proposal was to only scan images on
>> device that were being uploaded to Apple's servers, and only match
>> hashes of them to a database of hashes matching known CSAM images.
>> And only after multiple matches reached a threshold would Apple
>> investigate further.
>
> All correct. 
>
>> Yet even with those precautions, there was still a realistic chance
>> of false positives 
>
> The rate was deterministic and tunable. 

If the rate was anything other than ZERO, then people's privacy was at
risk.
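
To put rough numbers on that, here is a sketch (Python; every name and
parameter below is invented for illustration -- Apple never published
its exact values) of the scheme described above: on-device hash
matching plus a review threshold, and the chance that an entirely
innocent library crosses that threshold anyway.

KNOWN_CSAM_HASHES: set[str] = set()  # stand-in for the hash database
REVIEW_THRESHOLD = 30                # illustrative, not Apple's value

def count_matches(image_hashes: list[str]) -> int:
    """On-device: count how many of a library's hashes hit the DB."""
    return sum(1 for h in image_hashes if h in KNOWN_CSAM_HASHES)

def flag_probability(p: float, n: int,
                     t: int = REVIEW_THRESHOLD) -> float:
    """P(an innocent library of n images crosses threshold t), assuming
    each image false-matches independently with probability p."""
    term = (1.0 - p) ** n   # P(exactly 0 false matches)
    below = 0.0
    for k in range(t):      # accumulate P(fewer than t matches)
        below += term
        term *= (n - k) / (k + 1) * p / (1.0 - p)
    return 1.0 - below

# A 1-in-10,000 per-image rate over 10,000 photos with a threshold of
# 10 works out to roughly 1 in 10 million per account:
print(flag_probability(1e-4, 10_000, 10))

You can tune p and t to make that number as small as you like, but a
nonzero per-image rate always means a nonzero account-level rate.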

>> and invasion of privacy, which is why they scrapped the proposal.
>
> No.

Yes.

> They scrapped it because it wasn't worth pursuing. As a business it
> was of no benefit to them and the noisy reaction was enough to put
> them off.  There wasn't any "invasion of privacy". At least no more
> than there currently is in the US. 

Incorrect. Apple's statement makes it clear that their decision to scrap
CSAM scanning was based on the feedback they received from security and
privacy professionals:

---
“After extensive consultation with experts to gather feedback on child
protection initiatives we proposed last year, we are deepening our
investment in the Communication Safety feature that we first made
available in December 2021,” the company told WIRED in a statement. “We
have further decided to not move forward with our previously proposed
CSAM detection tool for iCloud Photos. Children can be protected without
companies combing through personal data, and we will continue working
with governments, child advocates, and other companies to help protect
young people, preserve their right to privacy, and make the internet a
safer place for children and for us all.”
---

For those unaware, the Communication Safety feature is not the same
thing at all: rather than scanning photos being uploaded to iCloud to
match against known CSAM photo hashes, Communication Safety for Messages
is opt-in and analyzes image attachments users send and receive on their
devices to determine whether a photo contains nudity. The feature is
designed so Apple never gets access to the messages, the end-to-end
encryption that Messages offers is never broken, and Apple doesn't even
learn that a device has detected nudity. Everything happens on device,
and the feature is only available on children's devices, where a parent
can optionally enable it.
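
In a rough Python sketch, that design looks something like this (all
names here are hypothetical -- Apple has not published the
implementation):

from dataclasses import dataclass

@dataclass
class SafetySettings:
    is_child_account: bool
    parent_opted_in: bool  # off by default; a parent must enable it

def screen_attachment(image: bytes, settings: SafetySettings) -> bool:
    """Runs entirely on the device. The result drives a local warning
    UI and is never transmitted, so Apple learns nothing."""
    if not (settings.is_child_account and settings.parent_opted_in):
        return False  # feature disabled: attachment passes through
    return on_device_nudity_model(image)  # local ML; no network access

def on_device_nudity_model(image: bytes) -> bool:
    # Placeholder for the on-device classifier. The image is decrypted
    # only on the endpoint, so the end-to-end encryption of the message
    # itself is never broken.
    return False

The key design choice is that nothing about the analysis ever leaves
the device.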

-- 
E-mail sent to this address may be devoured by my ravenous SPAM filter.
I often ignore posts from Google. Use a real news client instead.

JR