Path: ...!fu-berlin.de!uni-berlin.de!individual.net!not-for-mail
From: Jolly Roger <jollyroger@pobox.com>
Newsgroups: misc.phone.mobile.iphone
Subject: Re: Apple accused of underreporting suspected CSAM on its platforms
Date: 29 Jul 2024 23:35:29 GMT
Organization: People for the Ethical Treatment of Pirates
Lines: 100
Message-ID: <lgqna1F5ej4U1@mid.individual.net>
References: <v7mup4$7vpf$1@solani.org> <lg8ea1Fa94U1@mid.individual.net>
 <xn0oonlp4azqw16000@reader443.eternal-september.org>
 <lga2k1F7uk8U1@mid.individual.net>
 <xn0oonrftb7hazk002@reader443.eternal-september.org>
 <v7olut$19iie$1@dont-email.me> <lga8vfF8qq0U3@mid.individual.net>
 <v7q9vj$1l9co$1@dont-email.me>
 <v7qn3b$2hg0$1@nnrp.usenet.blueworldhosting.com>
 <v7rclq$1r24r$2@dont-email.me> <lgdac6F3c6aU4@mid.individual.net>
 <v80bju$2s7ns$2@dont-email.me> <lgi05eFq6vhU2@mid.individual.net>
 <v80j93$2nqsm$5@dont-email.me> <lgildlFtal2U1@mid.individual.net>
 <v85r6s$mgr$1@dont-email.me> <lgni5gFl7shU3@mid.individual.net>
 <v87gf2$d1eu$1@dont-email.me> <lgpqjfF17veU1@mid.individual.net>
 <v88jf8$j1j9$1@dont-email.me>
X-Trace: individual.net /9gU1QamcWaXMUmCv0drsAmLwNDPBnGr6p8/nIt6Xyy/g9NgQd
Cancel-Lock: sha1:7JTgZB3SzNaAvG4m77disYSHiZU= sha256:WIfcuoQRAgM3T4Hu9XC09wouvA21GVzzKwzA10w7sFg=
Mail-Copies-To: nobody
X-Face: _.g>n!a$f3/H3jA]>9pN55*5<`}Tud57>1<n@LQ!aZ7vLO_nWbK~@T'XIS0,oAJcU.qLM
 dk/j8Udo?O"o9B9Jyx+ez2:B<nx(k3EdHnTvB]'eoVaR495,Rv~/vPa[e^JI+^h5Zk*i`Q;ezqDW<
 ZFs6kmAJWZjOH\8[$$7jm,Ogw3C_%QM'|H6nygNGhhl+@}n30Nz(^vWo@h>Y%b|b-Y~()~\t,LZ3e
 up1/bO{=-)
User-Agent: slrn/1.0.3 (Darwin)
Bytes: 5401

On 2024-07-29, Chris <ithinkiam@gmail.com> wrote:
> Jolly Roger <jollyroger@pobox.com> wrote:
>> 
>> Yes, but even in Apple's case, there's a small chance of a false
>> positive match. And were that to happen, there would be a danger of an
>> innocent person's privacy being violated.
>
> In every case there's a chance of FPs. Apple would have had a lower
> FPR than *the current* system.
>
> Given the choice I'm in favour of the better, evidence-based method. 

You're wrong. The choices now are:

- Use systems and services where CSAM scanning disregards your privacy.

- Use systems and services that do no CSAM scanning of private content.

The latter happens to be Apple's systems and services (with the singular
exception of email).
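
For what it's worth, here is why per-photo error rates matter at scale.
A quick Python sketch; every number in it is hypothetical, not Apple's:

    # All figures are made up for illustration; none are Apple's numbers.
    photos_scanned = 1_000_000_000  # hypothetical photos scanned per day
    fpr_per_photo = 1e-6            # hypothetical per-photo false-positive rate

    # Even a one-in-a-million error rate flags ~1000 innocent photos a day:
    print(photos_scanned * fpr_per_photo)  # -> 1000.0

    # Apple's proposal reportedly flagged an account for human review only
    # after roughly 30 matches; assuming independent errors, the
    # per-account false-positive rate collapses:
    print(fpr_per_photo ** 30)  # -> 1e-180, effectively zero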

> You're in favour of the worse system

Nope. I have never said that. I'm in favor of no CSAM scanning of
private content.

>> Nope. I don't support any scanning of private content. 
>
> Yet it's already happening so why not support the better method?

Speak for yourself. It's certainly not happening to my private content.

>> I agree - still not good enough for me though.
>
> "Perfect is the enemy of the good"
>
> By seeking perfection you and others are allowing and enabling child
> abuse.

Nah. There is no child abuse occurring in my private content, and my
decision not to use or support privacy-violating technology isn't
harming anyone.

>>> Apple only shelved it for PR reasons, which is a real shame. 
>> 
>> You don't know all of Apple's motivations. What we know is Apple shelved
>> it after gathering feedback from industry experts. And many of those
>> experts were of the opinion that even with Apple's precautions, the risk
>> of violating people's privacy was too great.
>
> That wasn't the consensus. The noisy tin-foil brigade drowned out any
> possible discussion.

Not true. There was plenty of collaboration and discussion. Here's what
Apple said about their decision:

---
"Child sexual abuse material is abhorrent and we are committed to
breaking the chain of coercion and influence that makes children
susceptible to it," Erik Neuenschwander, Apple's director of user
privacy and child safety, wrote in the company's response to Heat
Initiative. He added, though, that after collaborating with an array of
privacy and security researchers, digital rights groups, and child
safety advocates, the company concluded that it could not proceed with
development of a CSAM-scanning mechanism, even one built specifically to
preserve privacy.

"Scanning every user's privately stored iCloud data would create new
threat vectors for data thieves to find and exploit," Neuenschwander
wrote. "It would also inject the potential for a slippery slope of
unintended consequences. Scanning for one type of content, for instance,
opens the door for bulk surveillance and could create a desire to search
other encrypted messaging systems across content types."
---

> Apple should have simply implemented it like Google is doing, even if
> badly.

No thanks. I like the Apple that protects my privacy. Apparently you
don't.

>>> What you're disregarding is that the alternative hasn't been more
>>> privacy for people, it's been less privacy and more errors. A
>>> lose-lose situation. 
>>> 
>>> You want to live in a world that doesn't exist. 
>> 
>> Nah. I simply choose not to engage with those alternatives for things
>> I consider private. 
>
> Head-in-sand mentality. 

My private photos are private, and that's the way I like it. There is no
sand here.

-- 
E-mail sent to this address may be devoured by my ravenous SPAM filter.
I often ignore posts from Google. Use a real news client instead.

JR