From: Jolly Roger
Newsgroups: misc.phone.mobile.iphone
Subject: Re: Apple accused of underreporting suspected CSAM on its platforms
Date: 27 Jul 2024 16:55:11 GMT
Organization: People for the Ethical Treatment of Pirates
Message-ID:
References:
Mail-Copies-To: nobody
User-Agent: slrn/1.0.3 (Darwin)

On 2024-07-26, Alan wrote:
> On 2024-07-26 15:14, Jolly Roger wrote:
>> On 2024-07-26, Alan wrote:
>>> On 2024-07-26 09:11, Jolly Roger wrote:
>>>> On 2024-07-26, Chris wrote:
>>>>> On 24/07/2024 22:35, Jolly Roger wrote:
>>>>>> On 2024-07-24, Chris wrote:
>>>>>>> Andrew wrote:
>>>>>>>> Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :
>>>>>>>>
>>>>>>>>> The NSPCC should really be complaining at how ineffectual the
>>>>>>>>> tech companies are rather than complain at Apple for not
>>>>>>>>> sending millions of photos to already overwhelmed authorities.
>>>>>>>>
>>>>>>>> For all that is in the news stories, it could be ZERO
>>>>>>>> convictions resulted.
>>>>>>>>
>>>>>>>> Think about that.
>>>>>>>>
>>>>>>>> Is it worth everyone's loss of privacy for maybe zero gain in
>>>>>>>> child safety?
>>>>>>>
>>>>>>> Apple's solution wouldn't have resulted in any additional loss
>>>>>>> of privacy
>>>>>>
>>>>>> Actually, Apple could not guarantee that, and there was a
>>>>>> non-zero chance that false positive matches would result in
>>>>>> privacy violations.
>>>>>
>>>>> True. The balance of risk was proportionate, however. Much moreso
>>>>> than the current system.
>>>>
>>>> Absolutely. I'm just of the opinion if one innocent person is
>>>> harmed, that's one too many. Would you want to be that unlucky
>>>> innocent person who has to deal with charges, a potential criminal
>>>> sexual violation on your record, and all that comes with it? I
>>>> certainly wouldn't.
>>>
>>> Except that Apple's system wouldn't automatically trigger charges.
>>>
>>> An actual human would review the images in question...
>>
>> And at that point, someone's privacy may be violated. Do you want a
>> stranger looking at photos of your sick child? What if that stranger
>> came to the conclusion that those photos are somehow classifiable as
>> sexual or abusive in some way? Would you want to have to argue your
>> case in court because of it?
>
> Yes. At that point...
>
> ...if and only if the person is INNOCENT...
>
> ...someone's privacy is unnecessarily violated.
>
> And it's a stretch to imagine that:
>
> 1. Innocent pictures would be matched with KNOWN CSAM images, AND;

No, it's not. There was a margin of error in the proposed matching
algorithms (see the sketch further down).

> (the logical AND)
>
> 2. A person reviewing those images after they've been flagged
> wouldn't notice they don't actually match; AND

That decision is a human one, and humans make mistakes and have biased
beliefs that can lead them to make faulty decisions.

> 3. The owner of those images at that point would be charged when they
> could then show that they were in fact innocent images.

Innocent people shouldn't have to prove anything to anyone.

>> Yes, but one is one too many in my book.
>
> And yet you are fine with innocent people's privacy being violated
> when a search warrant is issued erroneously.
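To make that margin-of-error point concrete, here's a toy sketch in
Python. To be clear about what's assumed: this is *not* Apple's
NeuralHash, just a simple average hash standing in for any perceptual
hash, and the pixel values and threshold are invented for
illustration. Perceptual hashes deliberately throw away detail so
near-duplicates still match, and that's exactly why unrelated images
can collide:

  # Toy illustration of why threshold-based perceptual hash matching
  # has a nonzero false positive rate. A simple "average hash" stands
  # in for any real perceptual hash here (it is NOT NeuralHash).

  def average_hash(pixels):
      # pixels: flat list of grayscale values (0-255). Each output bit
      # records whether a pixel is brighter than the image's mean.
      mean = sum(pixels) / len(pixels)
      return [1 if p > mean else 0 for p in pixels]

  def hamming(a, b):
      # Number of differing bits between two equal-length hashes.
      return sum(x != y for x, y in zip(a, b))

  # Hash of a hypothetical image from the "known images" database.
  known = average_hash([10, 200, 30, 220, 15, 210, 25, 205])

  # A completely different, innocent image whose light/dark pattern
  # happens to line up. Its hash comes out identical.
  innocent = average_hash([90, 180, 70, 190, 80, 185, 60, 195])

  THRESHOLD = 1  # the matcher's "close enough" cutoff (invented)

  if hamming(known, innocent) <= THRESHOLD:
      print("flagged as a match (false positive)")

Two different inputs, identical hashes, one bogus "match". That's the
nonzero false positive rate in miniature. A real system uses a far
better hash function and a tuned threshold, but the tradeoff (match
near-duplicates without ever flagging non-duplicates) is inherent.

Now, about the warrant comparison: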
Search warrants require probable cause and are signed by a judge.
Totally different scenario.

-- 
E-mail sent to this address may be devoured by my ravenous SPAM filter.
I often ignore posts from Google. Use a real news client instead. JR