Path: ...!weretis.net!feeder8.news.weretis.net!fu-berlin.de!uni-berlin.de!individual.net!not-for-mail
From: Jolly Roger <jollyroger@pobox.com>
Newsgroups: misc.phone.mobile.iphone,alt.privacy
Subject: Re: Apple accused of underreporting suspected CSAM on its platforms
Date: 25 Jul 2024 15:41:59 GMT
Organization: People for the Ethical Treatment of Pirates
Lines: 35
Message-ID: <lgfa27Fcg0lU3@mid.individual.net>
References: <v7mup4$7vpf$1@solani.org> <v7o49q$16cpi$1@dont-email.me>
 <v7obdd$17fqi$2@dont-email.me> <v7olme$19gq3$1@dont-email.me>
 <lga95bF8qq0U4@mid.individual.net> <v7q92v$1l4i5$1@dont-email.me>
 <v7qnha$bnp$1@nnrp.usenet.blueworldhosting.com>
 <lgclidFbd6U2@mid.individual.net>
 <v7revb$182$1@nnrp.usenet.blueworldhosting.com>
 <lgda13F3c6aU1@mid.individual.net>
 <v7tm0h$1nmm$1@nnrp.usenet.blueworldhosting.com>
Mime-Version: 1.0
Content-Type: text/plain; charset=utf-8
Content-Transfer-Encoding: 8bit
X-Trace: individual.net Df32AJU4+0Jfz1MUQFMcYAMYhAaIOQ1KFeFr9aZEMYTQfXbYuF
Cancel-Lock: sha1:4fGTIwofLtMPsiHyWYlGFRLNVMk= sha256:aR67LeQlunCfM8vRf/1jDb+SJx43Nvdc2rr5zbGEj/4=
Mail-Copies-To: nobody
X-Face: _.g>n!a$f3/H3jA]>9pN55*5<`}Tud57>1<n@LQ!aZ7vLO_nWbK~@T'XIS0,oAJcU.qLM
 dk/j8Udo?O"o9B9Jyx+ez2:B<nx(k3EdHnTvB]'eoVaR495,Rv~/vPa[e^JI+^h5Zk*i`Q;ezqDW<
 ZFs6kmAJWZjOH\8[$$7jm,Ogw3C_%QM'|H6nygNGhhl+@}n30Nz(^vWo@h>Y%b|b-Y~()~\t,LZ3e
 up1/bO{=-)
User-Agent: slrn/1.0.3 (Darwin)
Bytes: 2815

On 2024-07-25, Andrew <andrew@spam.net> wrote:
> Jolly Roger wrote on 24 Jul 2024 21:29:07 GMT :
>
>>> What matters is the percentage
>>
>> No, words have meanings, and zero means zero. And there is a
>> higher-than-zero number of pedophiles who have been caught due to
>> CSAM scanning. Unfortunately, there is also a higher-than-zero number
>> of innocent people whose privacy was violated in the process.
>
> While I support blah blah blah

Nothing you can say will change the fact that a greater-than-zero
number of people have been convicted as a result of CSAM scanning -
just like nothing you can say will convince me that CSAM scanning can
be done without violating the privacy of innocent people. Things like
this should not happen:

Google AI flagged parents' accounts for potential abuse over nude
photos of their sick kids
<https://www.theverge.com/2022/8/21/23315513/google-photos-csam-scanning-account-deletion-investigation>

And while Apple did their best to prevent such things from happening
with their proposal, they could not guarantee it would not happen,
which is why they ended up scrapping the proposal.

> Nothing else has any meaning.

Nah.

-- 
E-mail sent to this address may be devoured by my ravenous SPAM filter.
I often ignore posts from Google. Use a real news client instead.

JR