From: Jolly Roger <jollyroger@pobox.com>
Newsgroups: misc.phone.mobile.iphone
Subject: Re: Apple accused of underreporting suspected CSAM on its platforms
Date: 26 Jul 2024 22:14:13 GMT
Organization: People for the Ethical Treatment of Pirates
Message-ID: <lgildlFtal2U1@mid.individual.net>
User-Agent: slrn/1.0.3 (Darwin)

On 2024-07-26, Alan <nuh-uh@nope.com> wrote:
> On 2024-07-26 09:11, Jolly Roger wrote:
>> On 2024-07-26, Chris <ithinkiam@gmail.com> wrote:
>>> On 24/07/2024 22:35, Jolly Roger wrote:
>>>> On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
>>>>> Andrew <andrew@spam.net> wrote:
>>>>>> Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :
>>>>>>
>>>>>>> The NSPCC should really be complaining at how ineffectual the
>>>>>>> tech companies are rather than complain at Apple for not sending
>>>>>>> millions of photos to already overwhelmed authorities.
>>>>>>
>>>>>> For all that is in the news stories, it could be ZERO convictions
>>>>>> resulted.
>>>>>>
>>>>>> Think about that.
>>>>>>
>>>>>> Is it worth everyone's loss of privacy for maybe zero gain in
>>>>>> child safety?
>>>>>
>>>>> Apple's solution wouldn't have resulted in any additional loss of
>>>>> privacy
>>>>
>>>> Actually, Apple could not guarantee that, and there was a non-zero
>>>> chance that false positive matches would result in privacy
>>>> violations.
>>>
>>> True. The balance of risk was proportionate, however. Much moreso
>>> than the current system.
>>
>> Absolutely. I'm just of the opinion if one innocent person is harmed,
>> that's one too many. Would you want to be that unlucky innocent
>> person who has to deal with charges, a potential criminal sexual
>> violation on your record, and all that comes with it? I certainly
>> wouldn't.
>
> Except that Apple's system wouldn't automatically trigger charges.
>
> An actual human would review the images in question...

And at that point, someone's privacy may be violated. Do you want a
stranger looking at photos of your sick child? What if that stranger
concluded that those photos are somehow classifiable as sexual or
abusive? Would you want to have to argue your case in court because of
it?

> ...AND since they were comparing images against KNOWN CSAM, false
> positives would naturally be very few to begin with.

Yes, but even one is one too many in my book. Apple was wise to shelve
this proposal. And I am happy to see that they embraced more private
features such as Communication Safety, which works without violating
customers' privacy.

-- 
E-mail sent to this address may be devoured by my ravenous SPAM filter.
I often ignore posts from Google. Use a real news client instead.

JR