From: Jolly Roger
Newsgroups: misc.phone.mobile.iphone
Subject: Re: Apple accused of underreporting suspected CSAM on its platforms
Date: 24 Jul 2024 21:35:02 GMT
Organization: People for the Ethical Treatment of Pirates
User-Agent: slrn/1.0.3 (Darwin)

On 2024-07-24, Chris wrote:
> Andrew wrote:
>> Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC):
>>
>>> The NSPCC should really be complaining at how ineffectual the tech
>>> companies are rather than complaining at Apple for not sending
>>> millions of photos to already overwhelmed authorities.
>>
>> For all that is in the news stories, it could be that ZERO
>> convictions resulted.
>>
>> Think about that.
>>
>> Is it worth everyone's loss of privacy for maybe zero gain in child
>> safety?
>
> Apple's solution wouldn't have resulted in any additional loss of
> privacy

Actually, Apple could not guarantee that, and there was a non-zero
chance that false positive matches would result in privacy violations.

> plus it only affected customers of iCloud. Don't like it? Don't use
> iCloud. Simple.

That much is true. Only images uploaded to iCloud would have been
examined by the algorithm.
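
To illustrate the false-positive point: schemes like the one Apple
proposed compare perceptual hashes of photos against a database of
known-image hashes, and perceptual hashes of merely similar-looking
images can land close together. Below is a minimal sketch of
threshold-based hash matching in Python; the hash values, database,
and threshold are made up for illustration, and this is not Apple's
actual NeuralHash or its private set intersection protocol.

  # Sketch only: perceptual-hash matching by Hamming distance,
  # showing why any similarity threshold admits false positives.

  def hamming(a: int, b: int) -> int:
      """Count the bits that differ between two 64-bit hashes."""
      return bin(a ^ b).count("1")

  # Hypothetical database of known-image hashes (illustrative values).
  KNOWN_HASHES = {0xF0F0F0F0F0F0F0F0, 0x123456789ABCDEF0}

  def is_match(photo_hash: int, threshold: int = 4) -> bool:
      # Perceptual hashes of visually similar images differ in only a
      # few bits, so matching needs a distance threshold -- and any
      # threshold greater than zero also admits some unrelated images
      # whose hashes happen to fall nearby.
      return any(hamming(photo_hash, h) <= threshold
                 for h in KNOWN_HASHES)

  # An innocent image whose hash happens to differ from a database
  # entry by only 3 bits gets flagged -- a false positive.
  print(is_match(0xF0F0F0F0F0F0F0F7))  # True

Apple's published design did layer mitigations on top of raw matching
(a threshold number of matched images before any human review), but as
the exchange above notes, that reduces the false-positive probability
without making it zero.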