Path: ...!news.mixmin.net!eternal-september.org!feeder3.eternal-september.org!news.eternal-september.org!.POSTED!not-for-mail
From: Alan <nuh-uh@nope.com>
Newsgroups: misc.phone.mobile.iphone
Subject: Re: Apple accused of underreporting suspected CSAM on its platforms
Date: Sat, 27 Jul 2024 11:52:41 -0700
Organization: A noiseless patient Spider
Lines: 105
Message-ID: <v83flq$3a095$5@dont-email.me>
References: <v7mup4$7vpf$1@solani.org> <lg8ea1Fa94U1@mid.individual.net>
 <xn0oonlp4azqw16000@reader443.eternal-september.org>
 <lga2k1F7uk8U1@mid.individual.net>
 <xn0oonrftb7hazk002@reader443.eternal-september.org>
 <v7olut$19iie$1@dont-email.me> <lga8vfF8qq0U3@mid.individual.net>
 <v7q9vj$1l9co$1@dont-email.me>
 <v7qn3b$2hg0$1@nnrp.usenet.blueworldhosting.com>
 <v7rclq$1r24r$2@dont-email.me> <lgdac6F3c6aU4@mid.individual.net>
 <v80bju$2s7ns$2@dont-email.me> <lgi05eFq6vhU2@mid.individual.net>
 <v80j93$2nqsm$5@dont-email.me> <lgildlFtal2U1@mid.individual.net>
 <v81a8d$31o1l$1@dont-email.me> <lgkn3fF87a5U1@mid.individual.net>
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 7bit
Injection-Date: Sat, 27 Jul 2024 20:52:42 +0200 (CEST)
Injection-Info: dont-email.me; posting-host="a727c11014ca40347625e8d70f473602";
	logging-data="3473701"; mail-complaints-to="abuse@eternal-september.org";	posting-account="U2FsdGVkX1+8Ol21wBNQXMwZQJ/27aBQ0z365XOT4ew="
User-Agent: Mozilla Thunderbird
Cancel-Lock: sha1:x78E8LjCq4CV06KCpIrFLdguwCk=
In-Reply-To: <lgkn3fF87a5U1@mid.individual.net>
Content-Language: en-CA
Bytes: 5601

On 2024-07-27 09:55, Jolly Roger wrote:
> On 2024-07-26, Alan <nuh-uh@nope.com> wrote:
>> On 2024-07-26 15:14, Jolly Roger wrote:
>>> On 2024-07-26, Alan <nuh-uh@nope.com> wrote:
>>>> On 2024-07-26 09:11, Jolly Roger wrote:
>>>>> On 2024-07-26, Chris <ithinkiam@gmail.com> wrote:
>>>>>> On 24/07/2024 22:35, Jolly Roger wrote:
>>>>>>> On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
>>>>>>>> Andrew <andrew@spam.net> wrote:
>>>>>>>>> Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :
>>>>>>>>>
>>>>>>>>>> The NSPCC should really be complaining at how ineffectual the
>>>>>>>>>> tech companies are rather than complain at Apple for not sending
>>>>>>>>>> millions of photos to already overwhelmed authorities.
>>>>>>>>>
>>>>>>>>> For all that is in the news stories, it could be ZERO convictions
>>>>>>>>> resulted.
>>>>>>>>>
>>>>>>>>> Think about that.
>>>>>>>>>
>>>>>>>>> Is it worth everyone's loss of privacy for maybe zero gain in
>>>>>>>>> child safety?
>>>>>>>>
>>>>>>>> Apple's solution wouldn't have resulted in any additional loss of
>>>>>>>> privacy
>>>>>>>
>>>>>>> Actually, Apple could not guarantee that, and there was a non-zero
>>>>>>> chance that false positive matches would result in privacy
>>>>>>> violations.
>>>>>>
>>>>>> True. The balance of risk was proportionate, however. Much moreso
>>>>>> than the current system.
>>>>>
>>>>> Absolutely. I'm just of the opinion if one innocent person is harmed,
>>>>> that's one too many. Would you want to be that unlucky innocent
>>>>> person who has to deal with charges, a potential criminal sexual
>>>>> violation on your record, and all that comes with it? I certainly
>>>>> wouldn't.
>>>>
>>>> Except that Apple's system wouldn't automatically trigger charges.
>>>>
>>>> An actual human would review the images in question...
>>>
>>> And at that point, someone's privacy may be violated. Do you want a
>>> stranger looking at photos of your sick child? What if that stranger
>>> came to the conclusion that those photos are somehow classifiable as
>>> sexual or abusive in some way? Would you want to have to argue your case
>>> in court because of it?
>>
>> Yes. At that point...
>>
>> ...if and only if the person is INNOCENT...
>>
>> ...someone's privacy is unnecessarily violated.
>>
>> And it's a stretch to imagine that:
>>
>> 1. Innocent pictures would be matched with KNOWN CSAM images, AND;
> 
> No, it's not. There was a margin of error in the proposed matching
> algorithms.

I'm not saying it's impossible. Just very unlikely.
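
For context, here is a minimal sketch of how perceptual-hash matching
works. This is not Apple's actual NeuralHash code; the hash width, the
distance threshold, and the hash values below are all illustrative
assumptions:

    # Sketch of perceptual-hash matching (illustrative only, not
    # Apple's implementation; every value here is made up).
    def hamming_distance(a: int, b: int) -> int:
        """Count the bits on which two hash values differ."""
        return bin(a ^ b).count("1")

    DISTANCE_THRESHOLD = 4                   # assumed duplicate cutoff
    known_hashes = {0x9F3A5C71, 0x0B42D9E6}  # stand-in hash database

    def is_match(photo_hash: int) -> bool:
        """Flag only near-duplicates of a KNOWN image's hash."""
        return any(hamming_distance(photo_hash, k) <= DISTANCE_THRESHOLD
                   for k in known_hashes)

    print(is_match(0x9F3A5C70))  # True: one bit from a known hash
    print(is_match(0x12345678))  # False: an unrelated image

An ordinary photo hashes to an unrelated value, so a false positive
requires an accidental near-collision with the hash of a known image.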

> 
>> (the logical AND)
>>
>> 2. A person reviewing those images after they've been flagged wouldn't
>> notice they don't actually match; AND
> 
> That decision is a human one, and humans make mistakes and have biased
> beliefs that can lead them to make faulty decisions.

I'm not saying it's impossible. Just very unlikely.

> 
>> 3. The owner of those images at that point would be charged when they
>> could then show that they were in fact innocent images.
> 
> Innocent people shouldn't have to prove anything to anyone.

We've already had two statistically very unlikely events that had to 
happen to get to this point.
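
To put rough numbers on that (purely illustrative figures, not Apple's
published rates): independent probabilities multiply, so the chance of
both failures happening together is far smaller than either alone. As
I recall, Apple's proposal also required on the order of 30 matches
before any human review occurred at all.

    # Back-of-envelope arithmetic; both rates are made-up
    # illustrations, not Apple's published figures.
    p_false_match = 1e-6    # assumed chance a photo falsely matches
    p_review_miss = 1e-2    # assumed chance the reviewer then errs
    p_both = p_false_match * p_review_miss
    print(f"{p_both:.0e}")  # prints 1e-08: both must happen together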

At this point, it's pretty much equivalent to a search warrant. 
Information that suggests a crime might have been committed has been 
communicated to the authorities. And while someone's pictures have been 
examined by a human, that person doesn't know they've been examined, so 
where is the damage?

> 
>>> Yes, but one is one too many in my book.
>>
>> And yet you are fine with innocent people's privacy being violated
>> when a search warrant is issued erroneously.
> 
> Search warrants require probable cause and are signed by a judge.
> Totally different scenario.

But innocent people do get searched...

...and you literally just said:

'Innocent people shouldn't have to prove anything to anyone.'