Path: ...!news.mixmin.net!weretis.net!feeder8.news.weretis.net!reader5.news.weretis.net!news.solani.org!.POSTED!not-for-mail
From: badgolferman <REMOVETHISbadgolferman@gmail.com>
Newsgroups: misc.phone.mobile.iphone
Subject: Apple accused of underreporting suspected CSAM on its platforms
Date: Tue, 23 Jul 2024 00:50:44 -0000 (UTC)
Message-ID: <v7mup4$7vpf$1@solani.org>
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
Injection-Date: Tue, 23 Jul 2024 00:50:44 -0000 (UTC)
Injection-Info: solani.org;
	logging-data="261935"; mail-complaints-to="abuse@news.solani.org"
User-Agent: NewsTap/5.5 (iPhone/iPod Touch)
Cancel-Lock: sha1:Xm7vJ5k0LyJMiiIG00M2XiG2YUk= sha1:RNpltMjHO22GfhLkbHooQGVrCWY=
X-User-ID: eJwNx8kBwDAIA7CVuGxgnJSU/Udo9ROcyskgGFisUPKaWrdNnMu6jZkk5I0aOI45qyph+Wg4fKzjv3OVLh8cCxNr
Bytes: 3137
Lines: 37

Apple has been accused of underreporting the prevalence of child sexual
abuse material (CSAM) on its platforms. The National Society for the
Prevention of Cruelty to Children (NSPCC), a child protection charity in
the UK, says that Apple reported just 267 worldwide cases of suspected CSAM
to the National Center for Missing & Exploited Children (NCMEC) last year.

That pales in comparison to the 1.47 million potential cases that Google
reported and the 30.6 million that Meta reported. Other platforms that
reported more potential CSAM cases than Apple in 2023 include TikTok
(590,376), X (597,087), Snapchat (713,055), Xbox (1,537) and
PlayStation/Sony Interactive Entertainment (3,974). Every US-based tech
company is required to pass along any possible CSAM cases detected on its
platforms to NCMEC, which directs cases to relevant law enforcement
agencies worldwide.

As The Guardian, which first reported on the NSPCC's claim, points out,
Apple services such as iMessage, FaceTime and iCloud all have end-to-end
encryption, which stops the company from viewing the contents of what users
share on them. However, WhatsApp has E2EE as well, and that service
reported nearly 1.4 million cases of suspected CSAM to NCMEC in 2023.

“There is a concerning discrepancy between the number of UK child abuse
image crimes taking place on Apple’s services and the almost negligible
number of global reports of abuse content they make to authorities,”
Richard Collard, the NSPCC's head of child safety online policy, said.
“Apple is clearly behind many of their peers in tackling child sexual abuse
when all tech firms should be investing in safety and preparing for the
roll out of the Online Safety Act in the UK.”

Apple declined to comment on the NSPCC's accusation, instead pointing The
Guardian to the statement it made when it shelved its iCloud CSAM-scanning
plan, which it announced in 2021 and abandoned in late 2022. Apple said it
opted for a different strategy that “prioritizes the security and privacy
of [its] users.” The company told Wired in August 2022 that “children can
be protected without companies combing through personal data.”

https://www.engadget.com/apple-accused-of-underreporting-suspected-csam-on-its-platforms-153637726.html