Path: ...!2.eu.feeder.erje.net!feeder.erje.net!eternal-september.org!feeder3.eternal-september.org!news.eternal-september.org!.POSTED!not-for-mail
From: Chips Loral <loralandclinton@invalid.co>
Newsgroups: misc.phone.mobile.iphone,alt.privacy
Subject: Re: Apple accused of underreporting suspected CSAM on its platforms
Date: Mon, 29 Jul 2024 18:10:29 -0600
Organization: A noiseless patient Spider
Lines: 213
Message-ID: <v89b1n$mtmj$1@dont-email.me>
References: <v7mup4$7vpf$1@solani.org> <lg8ea1Fa94U1@mid.individual.net>
 <xn0oonlp4azqw16000@reader443.eternal-september.org>
 <lga2k1F7uk8U1@mid.individual.net>
 <xn0oonrftb7hazk002@reader443.eternal-september.org>
 <v7olut$19iie$1@dont-email.me> <lga8vfF8qq0U3@mid.individual.net>
 <v7q9vj$1l9co$1@dont-email.me>
 <v7qn3b$2hg0$1@nnrp.usenet.blueworldhosting.com>
 <lgcm26Fbd6U4@mid.individual.net>
 <v7rfko$18sp$1@nnrp.usenet.blueworldhosting.com>
 <lgda8oF3c6aU3@mid.individual.net>
 <v7tiln$2g2b$1@nnrp.usenet.blueworldhosting.com>
 <v7u2gu$2co3l$1@dont-email.me> <lgfmvgFee44U1@mid.individual.net>
 <v7uduv$11n1$1@nnrp.usenet.blueworldhosting.com>
 <v80bgk$2s7ns$1@dont-email.me>
 <v83f93$31ac$1@nnrp.usenet.blueworldhosting.com>
 <v860er$1kea$1@dont-email.me> <v86141$j$1@nnrp.usenet.blueworldhosting.com>
 <v87e4d$ckpm$1@dont-email.me>
 <v87u43$1kl6$1@nnrp.usenet.blueworldhosting.com>
 <v88npo$j33d$8@dont-email.me> <v8943a$lrfd$1@dont-email.me>
 <v89858$jup0$14@dont-email.me>
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Date: Tue, 30 Jul 2024 02:10:32 +0200 (CEST)
Injection-Info: dont-email.me; posting-host="a8be7c9698dc542441c2da566c864e82";
	logging-data="751315"; mail-complaints-to="abuse@eternal-september.org";	posting-account="U2FsdGVkX1+ADU5FOsFeJkFGjn9P0q/9tkUneC4DfjE="
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:91.0) Gecko/20100101
 Firefox/91.0 SeaMonkey/2.53.18.2
Cancel-Lock: sha1:rvg0LdtEzO2pZBbBadvWWMZfpBw=
In-Reply-To: <v89858$jup0$14@dont-email.me>
Bytes: 11135

Alan wrote:
> On 2024-07-29 15:11, Chips Loral wrote:
>> Alan wrote:
>>> On 2024-07-29 04:23, Andrew wrote:
>>>> Chris wrote on Mon, 29 Jul 2024 06:50:53 -0000 (UTC) :
>>>>
>>>>>> You not comprehending the difference between zero percent of Apple 
>>>>>> reports
>>>>>> versus zero total convictions is how I know you zealots own 
>>>>>> subnormal IQs.
>>>>>
>>>>> Not at all. My position hasn't changed. You, however, have had 
>>>>> about three
>>>>> different positions on this thread and keep getting confused which one
>>>>> you're arguing for. lol.
>>>>
>>>> Au contraire
>>>>
>>>> Because I only think logically, my rather sensible position has never
>>>> changed, Chris, and the fact you "think" it has changed is simply 
>>>> that you
>>>> don't know the difference between the percentage of convictions 
>>>> based on
>>>> the number of reports, and the total number of convictions.
>>>>
>>>> When you figure out that those two things are different, then (and only
>>>> then) will you realize I've maintained the same position throughout.
>>>>
>>>> Specifically....
>>>>
>>>> a. If the Apple reporting rate is low, and yet if their conviction
>>>>     rate is high (based on the number of reports), then they are NOT
>>>>     underreporting images.
>>>
>>> Apple's reporting rate is ZERO, because they're not doing scanning of 
>>> images of any kind.
>>
>> After getting caught.
>>
>> You can't seem to get ANYTHING right, Mac-troll:
>>
>> https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/ 
>>
>>
>> In August 2021, Apple announced a plan to scan photos that users 
>> stored in iCloud for child sexual abuse material (CSAM). The tool was 
>> meant to be privacy-preserving and allow the company to flag 
>> potentially problematic and abusive content without revealing anything 
>> else. But the initiative was controversial, and it soon drew 
>> widespread criticism from privacy and security researchers and digital 
>> rights groups who were concerned that the surveillance capability 
>> itself could be abused to undermine the privacy and security of iCloud 
>> users around the world. At the beginning of September 2021, Apple said 
>> it would pause the rollout of the feature to “collect input and make 
>> improvements before releasing these critically important child safety 
>> features.” In other words, a launch was still coming.
>>
>> Parents and caregivers can opt into the protections through family 
>> iCloud accounts. The features work in Siri, Apple’s Spotlight search, 
>> and Safari Search to warn if someone is looking at or searching for 
>> child sexual abuse materials and provide resources on the spot to 
>> report the content and seek help.
>>
>> https://sneak.berlin/20230115/macos-scans-your-local-files-now/
>>
>> Preface: I don’t use iCloud. I don’t use an Apple ID. I don’t use the 
>> Mac App Store. I don’t store photos in the macOS “Photos” application, 
>> even locally. I never opted in to Apple network services of any kind - 
>> I use macOS software on Apple hardware.
>>
>> Today, I was browsing some local images in a subfolder of my Documents 
>> folder, some HEIC files taken with an iPhone and copied to the Mac 
>> using the Image Capture program (used for dumping photos from an iOS 
>> device attached with a USB cable).
>>
>> I use a program called Little Snitch which alerts me to network 
>> traffic attempted by the programs I use. I have all network access 
>> denied for a lot of Apple OS-level apps because I’m not interested in 
>> transmitting any of my data whatsoever to Apple over the network - 
>> mostly because Apple turns over customer data on over 30,000 customers 
>> per year to US federal police without any search warrant per Apple’s 
>> own self-published transparency report. I’m good without any of that 
>> nonsense, thank you.
>>
>> Imagine my surprise when browsing these images in the Finder, Little 
>> Snitch told me that macOS is now connecting to Apple APIs via a 
>> program named mediaanalysisd (Media Analysis Daemon - a background 
>> process for analyzing media files).
>>
>> ...
>>
>>
>> Integrate this data and remember it: macOS now contains network-based 
>> spyware even with all Apple services disabled. It cannot be disabled 
>> via controls within the OS: you must use third-party network 
>> filtering software (or external devices) to prevent it.
>>
>> This was observed on the current version of macOS, macOS Ventura 13.1.
>>
> 
> 'A recent thread on Twitter raised concerns that the macOS process 
> mediaanalysisd, which scans local photos, was secretly sending the 
> results to an Apple server. This claim was made by a cybersecurity 
> researcher named Jeffrey Paul. However, after conducting a thorough 
> analysis of the process, it has been determined that this is not the case.'
> 


Bullshit.

https://www.majorgeeks.com/content/page/stop_apple_scanning_iphone_photos.html

Apple’s new iPhone photo-scanning feature is highly controversial. 
You might want to consider the only current option for stopping Apple 
from scanning your photos.

Apple's new photo-scanning feature will scan photos stored in iCloud to 
see whether they match known Child Sexual Abuse Material (CSAM). The 
problem, for many of us, is that we often have hundreds of photos of our 
children and grandchildren, and who knows how good or bad the new 
scanning technology is? Apple claims the odds of a false positive are 
one in one trillion, and there is an appeals process in place. That 
said, one mistake from this AI, just one, could have an innocent person 
sent to jail and their life destroyed.

Apple has many other features as part of these upgrades to protect 
children, and we like them all, but photo-scanning sounds like a problem 
waiting to happen.

Here are all of the "features" that come with anti-CSAM, expected to 
roll out with iOS 15 in the fall of 2021.

Messages: The Messages app will use on-device machine learning to warn 
children and parents about sensitive content.

iCloud Photos: Before an image is stored in iCloud Photos, an on-device 
matching process is performed for that image against the known CSAM 
hashes (a rough sketch of this kind of matching follows this list).

Siri and Search: Siri and Search will provide additional resources to 
help children and parents stay safe online and get help with unsafe 
situations.
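
To make the iCloud Photos item above concrete: the "on-device matching" 
is, at its core, a lookup of an image fingerprint against a database of 
known hashes before anything is uploaded. Below is a minimal Python 
sketch of that idea only; it is NOT Apple's actual implementation 
(Apple described a perceptual NeuralHash plus a private set 
intersection protocol), and the hash set and file handling here are 
purely hypothetical.

import hashlib
from pathlib import Path

# Hypothetical database of known-image hashes (hex digests), for illustration.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_hash(image_path: Path) -> bool:
    # Hash the file's bytes and check the digest against the known set.
    digest = hashlib.sha256(image_path.read_bytes()).hexdigest()
    return digest in KNOWN_HASHES

def check_before_upload(image_path: Path) -> None:
    # Stand-in for the "match on device, then decide" step.
    if matches_known_hash(image_path):
        print(f"{image_path}: matched a known hash, would be flagged")
    else:
        print(f"{image_path}: no match, would be uploaded normally")

Apple's published design differed in important ways (perceptual rather 
than cryptographic hashing, a blinded hash database, and a reporting 
threshold), which is exactly why the false-positive question above 
matters.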

Now that you understand how anti-CSAM works, the only way to avoid 
having your photos scanned by this system is to disable iCloud Photos. 
Your photos are scanned when they are automatically uploaded to the 
cloud, so the only current way to avoid having them scanned is not to 
upload them.

This creates an interesting problem. The majority of iPhone users use 
iCloud to back up their photos (and everything else). If you disable 
iCloud, you will need to back up your photos manually. If you have a PC 
or Mac, you can always copy them to your computer and back them up. You 
can also consider using another cloud service for backups.
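
As a rough illustration of the manual-copy route, here is a minimal 
Python sketch that copies photos from a local import folder (e.g. one 
filled by Image Capture) to a backup location. Both paths are 
hypothetical and would need to be adjusted to your own setup.

import shutil
from pathlib import Path

# Both paths are hypothetical; adjust them to your own setup.
SOURCE = Path.home() / "Pictures" / "iPhone-Import"
DEST = Path("/Volumes/Backup/iPhone-Photos")

def backup_photos(source: Path, dest: Path) -> None:
    # Copy any photos not already present in the backup folder.
    if not source.is_dir():
        print(f"source folder {source} not found")
        return
    dest.mkdir(parents=True, exist_ok=True)
    for photo in source.iterdir():
        if photo.suffix.lower() not in {".heic", ".jpg", ".jpeg", ".png"}:
            continue
        target = dest / photo.name
        if not target.exists():
            shutil.copy2(photo, target)
            print(f"backed up {photo.name}")

if __name__ == "__main__":
    backup_photos(SOURCE, DEST)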

Let's talk about disabling iCloud and also removing any photos you 
already have uploaded. You will have 30 days to recover your photos if 
you change your mind. Any photos that are on your iPhone when iOS 15 is 
released will be scanned.

You'll want to back up and disable iCloud, then verify that no photos 
========== REMAINDER OF ARTICLE TRUNCATED ==========