
Path: ...!eternal-september.org!feeder3.eternal-september.org!news.eternal-september.org!.POSTED!not-for-mail
From: Chris <ithinkiam@gmail.com>
Newsgroups: misc.phone.mobile.iphone
Subject: Re: Apple accused of underreporting suspected CSAM on its platforms
Date: Tue, 23 Jul 2024 11:31:06 -0000 (UTC)
Organization: A noiseless patient Spider
Lines: 47
Message-ID: <v7o49q$16cpi$1@dont-email.me>
References: <v7mup4$7vpf$1@solani.org>
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Injection-Date: Tue, 23 Jul 2024 13:31:06 +0200 (CEST)
Injection-Info: dont-email.me; posting-host="4b91fa0d98cbc270fad464fba826ca84";
	logging-data="1258290"; mail-complaints-to="abuse@eternal-september.org";	posting-account="U2FsdGVkX1/ens3YnoaNiOgTQNoI6GjjPx1vnjb0+WQ="
User-Agent: NewsTap/5.5 (iPhone/iPod Touch)
Cancel-Lock: sha1:J4INlXhx0xXzef/+YzcmLTlrbww=
	sha1:mqfddcEINAoINDz+Yhb5DkQcZCc=
Bytes: 3791

badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
> Apple has been accused of underreporting the prevalence of child sexual
> abuse material (CSAM) on its platforms. The National Society for the
> Prevention of Cruelty to Children (NSPCC), a child protection charity in
> the UK, says that Apple reported just 267 worldwide cases of suspected CSAM
> to the National Center for Missing & Exploited Children (NCMEC) last year.
> 
> That pales in comparison to the 1.47 million potential cases that Google
> reported and 30.6 million reports from Meta. Other platforms that reported
> more potential CSAM cases than Apple in 2023 include TikTok (590,376), X
> (597,087), Snapchat (713,055), Xbox (1,537) and PlayStation/Sony
> Interactive Entertainment (3,974). Every US-based tech company is required
> to pass along any possible CSAM cases detected on their platforms to NCMEC,
> which directs cases to relevant law enforcement agencies worldwide.
> 
> As The Guardian, which first reported on the NSPCC's claim, points out,
> Apple services such as iMessage, FaceTime and iCloud all have end-to-end
> encryption, which stops the company from viewing the contents of what users
> share on them. However, WhatsApp has E2EE as well, and that service
> reported nearly 1.4 million cases of suspected CSAM to NCMEC in 2023.
> 
> “There is a concerning discrepancy between the number of UK child abuse
> image crimes taking place on Apple’s services and the almost negligible
> number of global reports of abuse content they make to authorities,”
> Richard Collard, the NSPCC's head of child safety online policy, said.
> “Apple is clearly behind many of their peers in tackling child sexual abuse
> when all tech firms should be investing in safety and preparing for the
> roll out of the Online Safety Act in the UK.”
> 
> Apple declined to comment on the NSPCC's accusation, instead pointing The
> Guardian to a statement it made when it shelved the CSAM scanning plan.
> Apple said it opted for a different strategy that “prioritizes the security
> and privacy of [its] users.” The company told Wired in August 2022 that
> "children can be protected without companies combing through personal
> data."

Interesting. However, there's no mention of what the rate of *actual* CSAM
is. Given the millions of reports from Google et al, how many are actually
CSAM and not false positives? Are the tech companies simply swamping
authorities with data? How does anyone cope with 30m reports from Meta?

After being a bit sceptical of Apple's solution, I realised it was a pretty
good and pragmatic balance between respecting people's privacy and
protecting vulnerable people. I was disappointed that the angry "muh
freedom" brigade scuppered it.