Path: ...!3.eu.feeder.erje.net!feeder.erje.net!weretis.net!feeder8.news.weretis.net!fu-berlin.de!uni-berlin.de!individual.net!not-for-mail
From: Jolly Roger
Newsgroups: misc.phone.mobile.iphone
Subject: Re: Apple accused of underreporting suspected CSAM on its platforms
Date: 24 Jul 2024 15:34:16 GMT
Organization: People for the Ethical Treatment of Pirates
Lines: 37
Message-ID: 
References: 
X-Trace: individual.net neELEdhwrhynP+ZtLLSaRAXFn4W84NYqrhmO1dmTIoMvaormse
Cancel-Lock: sha1:RCC8lyS2J6dcPh6eQ6OpXYQTBW4= sha256:kJMm33UBn6FXkjZ+9n8sCSNiCO6Y4C/PpSVNl6braO0=
Mail-Copies-To: nobody
X-Face: _.g>n!a$f3/H3jA]>9pN55*5<`}Tud57>1Y%b|b-Y~()~\t,LZ3e up1/bO{=-)
User-Agent: slrn/1.0.3 (Darwin)
Bytes: 2167

On 2024-07-24, Chris wrote:
> Jolly Roger wrote:
>>
>> Apple's proposal was to match personal photos with *known* CSAM
>> images.
>
> Correct.
>
>> It would do nothing to detect *new* CSAM images.
>
> Also correct.
>
>> And it could not prevent false positive matches.
>
> Incorrect.

Nope. I am correct. It absolutely could not prevent false matches.

> It is designed to avoid false positives, although nothing is 100%
> perfect.

If it has even a 0.1% false-match rate, then someone's privacy has been
violated.

>> Everyone on this planet should have a right to basic privacy.
>
> And they do.

Tell that to the people whose private photos are scanned and who are
falsely accused of a crime they didn't commit because an imperfect
algorithm got it wrong.

-- 
E-mail sent to this address may be devoured by my ravenous SPAM filter.
I often ignore posts from Google. Use a real news client instead. JR
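
[Aside for readers unfamiliar with how this kind of scanning works: below is
a minimal sketch of threshold-based perceptual-hash matching. All names,
hash widths, and values are invented for illustration; Apple's actual
NeuralHash system differs in its details. The point being argued in the
thread is visible here: any nonzero match threshold, which is needed so
re-encoded or resized copies of a known image still match, also makes it
possible for an unrelated image to land within the threshold.]

```python
# Toy illustration of threshold-based perceptual-hash matching.
# Real systems use learned hashes over image content; the 16-bit
# values and function names here are hypothetical.

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hash values."""
    return bin(a ^ b).count("1")

def matches(photo_hash: int, known_hashes: set[int], threshold: int = 2) -> bool:
    """Flag a photo if its hash is within `threshold` bits of any known hash.

    A nonzero threshold tolerates re-encoding and resizing of known
    images -- and is also exactly why an unrelated image can collide
    with the database (a false positive)."""
    return any(hamming(photo_hash, k) <= threshold for k in known_hashes)

known = {0b1010_1100_0011_0101}   # database of known-image hashes

# An innocent photo whose hash happens to differ in only 2 bits:
innocent = 0b1010_1100_0011_0110
print(matches(innocent, known))   # True -- flagged despite being unrelated
```

Tightening the threshold to zero reduces such collisions but then misses
trivially altered copies of known images, which is the trade-off the
thread is arguing about.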