Path: ...!feeds.phibee-telecom.net!2.eu.feeder.erje.net!feeder.erje.net!fu-berlin.de!uni-berlin.de!individual.net!not-for-mail
From: Jolly Roger <jollyroger@pobox.com>
Newsgroups: misc.phone.mobile.iphone
Subject: Re: Apple accused of underreporting suspected CSAM on its platforms
Date: 29 Jul 2024 15:25:36 GMT
Organization: People for the Ethical Treatment of Pirates
Lines: 120
Message-ID: <lgpqjfF17veU1@mid.individual.net>
References: <v7mup4$7vpf$1@solani.org> <lg8ea1Fa94U1@mid.individual.net>
 <xn0oonlp4azqw16000@reader443.eternal-september.org>
 <lga2k1F7uk8U1@mid.individual.net>
 <xn0oonrftb7hazk002@reader443.eternal-september.org>
 <v7olut$19iie$1@dont-email.me> <lga8vfF8qq0U3@mid.individual.net>
 <v7q9vj$1l9co$1@dont-email.me>
 <v7qn3b$2hg0$1@nnrp.usenet.blueworldhosting.com>
 <v7rclq$1r24r$2@dont-email.me> <lgdac6F3c6aU4@mid.individual.net>
 <v80bju$2s7ns$2@dont-email.me> <lgi05eFq6vhU2@mid.individual.net>
 <v80j93$2nqsm$5@dont-email.me> <lgildlFtal2U1@mid.individual.net>
 <v85r6s$mgr$1@dont-email.me> <lgni5gFl7shU3@mid.individual.net>
 <v87gf2$d1eu$1@dont-email.me>
Mime-Version: 1.0
Content-Type: text/plain; charset=utf-8
Content-Transfer-Encoding: 8bit
X-Trace: individual.net c7mcV9dpP+nbWGBhnGXEUADTNB+TNSPsQLrAvPeELqLXSBFOqC
Cancel-Lock: sha1:YdZHOHsJXwCakEDmZpo9qMlBX+k= sha256:iMMLImpi46Q7pFHJ8vAo8AsE5HwW+qQ1eE1tLgr6uhE=
Mail-Copies-To: nobody
X-Face: _.g>n!a$f3/H3jA]>9pN55*5<`}Tud57>1<n@LQ!aZ7vLO_nWbK~@T'XIS0,oAJcU.qLM
 dk/j8Udo?O"o9B9Jyx+ez2:B<nx(k3EdHnTvB]'eoVaR495,Rv~/vPa[e^JI+^h5Zk*i`Q;ezqDW<
 ZFs6kmAJWZjOH\8[$$7jm,Ogw3C_%QM'|H6nygNGhhl+@}n30Nz(^vWo@h>Y%b|b-Y~()~\t,LZ3e
 up1/bO{=-)
User-Agent: slrn/1.0.3 (Darwin)
Bytes: 6908

On 2024-07-29, Chris <ithinkiam@gmail.com> wrote:
> Jolly Roger <jollyroger@pobox.com> wrote:
>> On 2024-07-28, Chris <ithinkiam@gmail.com> wrote:
>> 
>>> No-one is going to be charged for a dubious photo of their own
>>> child. There are much bigger fish to fry and put in jail. 
>> 
>> You're wrong. It has already happened:
>>
>> A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged
>> Him as a Criminal <https://archive.is/78Pla#selection-563.0-1075.217>
>
> I explicitly said "charged". No-one got charged. The law is working
> just fine. It's the tech, as I've been arguing all along, that's the
> problem. 

So it's okay that these parents and their child had their privacy
violated, their child's naked photos added to the CSAM database, and
their accounts (along with years of emails, photos, and so on)
revoked/deleted, simply because no one was ever officially charged?

>> Read the whole article to get a glimpse of what innocent people who
>> fall victim to this invasive scanning go through. 
>> 
>> Do you think these parents and their child consider their privacy to
>> be violated? How would you feel if your intimate photos were added to
>> the PhotoDNA CSAM database because they were incorrectly flagged?
>
> This wasn't PhotoDNA, which is what Apple's system was similar to. It
> was Google's AI method, designed to "recognize never-before-seen
> exploitative images of children", which is where the real danger sits. 
>
> It is designed to identify new abuse images based only on the pixel
> data, so all hits will be massively enriched for things that look like
> abuse. A human reviewer won't be able to accurately identify the
> (likely innocent) motivation for taking the photo and, "to be safe",
> will pass it on to someone else to make the decision, i.e. law
> enforcement. Law enforcement will have access to much more information
> and will see that it's an obvious mistake, as seen in your article. 

Actually, a human being does review it with Google's system:

--- 
A human content moderator for Google would have reviewed the photos
after they were flagged by the artificial intelligence to confirm they
met the federal definition of child sexual abuse material. When Google
makes such a discovery, it locks the user’s account, searches for other
exploitative material and, as required by federal law, makes a report to
the CyberTipline at the National Center for Missing and Exploited
Children.  
---
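
In rough Python, the flow that excerpt describes looks something like
the sketch below. To be clear, every name and the threshold here are
invented for illustration; this is a sketch of the described process,
not Google's actual code or API:

    # Sketch of the flag -> human review -> report flow quoted above.
    # All names and values are invented stand-ins.

    FLAG_THRESHOLD = 0.9

    def classifier_score(image: bytes) -> float:
        """Stand-in for the AI model that scores raw pixel data."""
        return 0.0  # a real model would return a learned score

    def moderator_confirms_csam(image: bytes) -> bool:
        """Stand-in for the human review step the article describes."""
        return False

    def handle_upload(owner: str, image: bytes) -> None:
        if classifier_score(image) < FLAG_THRESHOLD:
            return  # most photos never reach a human
        if moderator_confirms_csam(image):
            print(f"Locking {owner}'s account")
            print("Searching account for other exploitative material")
            print("Reporting to NCMEC CyberTipline (required by law)")

Note the ordering: the human review happens before the account lock and
the report, which is exactly why the case in the article is so damning.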

> Apple's system was more like hashing the image data and comparing
> hashes, where false positives are due to algorithmic randomness. The
> pixel data, when viewed by a human, won't look anything like CSAM, and
> an easy decision can be made. 
>
> What's crucial here is that Google are looking for new material -
> which is always problematic - whereas Apple's system was not. The
> search space when looking for existing images is far smaller, and the
> impact of false positives much, much smaller. 

Yes, but even in Apple's case, there's a small chance of a false
positive match. And were that to happen, there is a danger of an
innocent person's privacy being violated.
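
To illustrate the difference, here's a toy sketch of hash-database
matching. The hash values and threshold are invented, and a toy
Hamming-distance check stands in for a real perceptual hash; actual
systems like PhotoDNA and NeuralHash are far more sophisticated:

    # Toy sketch of matching against a database of *known* hashes.

    KNOWN_HASHES = {0b10110110110000111010, 0b01001101001111000101}
    MAX_DISTANCE = 2  # perceptual hashes tolerate small differences

    def hamming_distance(a: int, b: int) -> int:
        return bin(a ^ b).count("1")

    def is_match(photo_hash: int) -> bool:
        # A false positive is an unrelated image whose hash happens
        # to land within MAX_DISTANCE of a known hash - rare, but
        # possible, which is the risk described above.
        return any(hamming_distance(photo_hash, h) <= MAX_DISTANCE
                   for h in KNOWN_HASHES)

The point is that a match only says "this hash is close to a known
hash", nothing about the actual content - hence the small but nonzero
false-positive risk.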
  
>>> How many children are you prepared to be abused to protect YOUR
>>> privacy? 
>> 
>> Now you're being absurd. My right to privacy doesn't cause any
>> children to be abused. 
>
> That's what you'd like to think, yet the reality is that not only is
> it harder to identify perpetrators, but also, ironically, your
> position ensures more people get erroneously labelled. 

Nope. I don't support any scanning of private content. 

>>>> Apple was wise to shelve this proposal. And I am happy to see that
>>>> they embraced more private features such as the Safe Communication
>>>> feature which is done without violating customers' privacy. 
>>> 
>>> It wasn't violating anyone's privacy. For the umpteenth time. It
>>> actually preserved people's privacy by design. 
>> 
>> While it went further than the rest to protect people's privacy,
>> there was still room for error and for innocent people to be harmed.
>> That's the truth you seem to want to disregard, and that's why Apple
>> shelved it.
>
> My truth is that the Apple method was a significant improvement.

I agree - still not good enough for me though.

> Plus if people didn't like it - unreasonably - they didn't have to use
> iCloud.

True. 

> Apple only shelved it for PR reasons, which is a real shame. 

You don't know all of Apple's motivations. What we know is Apple shelved
it after gathering feedback from industry experts. And many of those
experts were of the opinion that even with Apple's precautions, the risk
of violating people's privacy was too great.

> What you're disregarding is that the alternative hasn't been more
> privacy for people, it's been less privacy and more errors. A
> lose-lose situation. 
>
> You want to live in a world that doesn't exist. 

Nah. I simply choose not to engage with those alternatives for things I
consider private. 

-- 
E-mail sent to this address may be devoured by my ravenous SPAM filter.
I often ignore posts from Google. Use a real news client instead.

JR