From: Alan <nuh-uh@nope.com>
Newsgroups: misc.phone.mobile.iphone,alt.privacy
Subject: Re: Apple accused of underreporting suspected CSAM on its platforms
Date: Mon, 29 Jul 2024 16:21:12 -0700
Organization: A noiseless patient Spider
Message-ID: <v89858$jup0$14@dont-email.me>
In-Reply-To: <v8943a$lrfd$1@dont-email.me>

On 2024-07-29 15:11, Chips Loral wrote:
> Alan wrote:
>> On 2024-07-29 04:23, Andrew wrote:
>>> Chris wrote on Mon, 29 Jul 2024 06:50:53 -0000 (UTC) :
>>>
>>>>> You not comprehending the difference between zero percent of Apple
>>>>> reports versus zero total convictions is how I know you zealots own
>>>>> subnormal IQs.
>>>>
>>>> Not at all. My position hasn't changed. You, however, have had about
>>>> three different positions on this thread and keep getting confused
>>>> which one you're arguing for. lol.
>>>
>>> Au contraire.
>>>
>>> Because I only think logically, my rather sensible position has never
>>> changed, Chris, and the fact you "think" it has changed simply shows
>>> you don't know the difference between the percentage of convictions
>>> based on the number of reports and the total number of convictions.
>>>
>>> When you figure out that those two things are different, then (and
>>> only then) will you realize I've maintained the same position
>>> throughout.
>>>
>>> Specifically....
>>>
>>> a. If the Apple reporting rate is low, and yet their conviction rate
>>> is high (based on the number of reports), then they are NOT
>>> underreporting images.
>>
>> Apple's reporting rate is ZERO, because they're not doing scanning of
>> images of any kind.
>
> After getting caught.
>
> You can't seem to get ANYTHING right, Mac-troll:
>
> https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/
>
> In August 2021, Apple announced a plan to scan photos that users
> stored in iCloud for child sexual abuse material (CSAM).
> The tool was meant to be privacy-preserving and allow the company to
> flag potentially problematic and abusive content without revealing
> anything else. But the initiative was controversial, and it soon drew
> widespread criticism from privacy and security researchers and digital
> rights groups who were concerned that the surveillance capability
> itself could be abused to undermine the privacy and security of iCloud
> users around the world. At the beginning of September 2021, Apple said
> it would pause the rollout of the feature to “collect input and make
> improvements before releasing these critically important child safety
> features.” In other words, a launch was still coming.
>
> Parents and caregivers can opt into the protections through family
> iCloud accounts. The features work in Siri, Apple’s Spotlight search,
> and Safari Search to warn if someone is looking at or searching for
> child sexual abuse materials, and they provide resources on the spot
> to report the content and seek help.
>
> https://sneak.berlin/20230115/macos-scans-your-local-files-now/
>
> Preface: I don’t use iCloud. I don’t use an Apple ID. I don’t use the
> Mac App Store. I don’t store photos in the macOS “Photos” application,
> even locally. I never opted in to Apple network services of any kind -
> I use macOS software on Apple hardware.
>
> Today, I was browsing some local images in a subfolder of my Documents
> folder: some HEIC files taken with an iPhone and copied to the Mac
> using the Image Capture program (used for dumping photos from an iOS
> device attached with a USB cable).
>
> I use a program called Little Snitch, which alerts me to network
> traffic attempted by the programs I use. I have all network access
> denied for a lot of Apple OS-level apps because I’m not interested in
> transmitting any of my data whatsoever to Apple over the network -
> mostly because Apple turns over customer data on over 30,000 customers
> per year to US federal police without any search warrant, per Apple’s
> own self-published transparency report. I’m good without any of that
> nonsense, thank you.
>
> Imagine my surprise when, browsing these images in the Finder, Little
> Snitch told me that macOS is now connecting to Apple APIs via a
> program named mediaanalysisd (Media Analysis Daemon - a background
> process for analyzing media files).
>
> ...
>
> To summarize: macOS now contains network-based spyware even with all
> Apple services disabled. It cannot be disabled via controls within the
> OS: you must use third-party network filtering software (or external
> devices) to prevent it.
>
> This was observed on the current version of macOS, macOS Ventura 13.1.

'A recent thread on Twitter raised concerns that the macOS process
mediaanalysisd, which scans local photos, was secretly sending the
results to an Apple server. This claim was made by a cybersecurity
researcher named Jeffrey Paul. However, after conducting a thorough
analysis of the process, it has been determined that this is not the
case.'

<https://pawisoon.medium.com/debunked-the-truth-about-mediaanalysisd-and-apples-access-to-your-local-photos-on-macos-a42215e713d1>

'The mediaanalysisd process is a background task that starts every time
an image file is previewed in Finder, and then calls an Apple service.
The process is designed to run machine learning algorithms to detect
objects in photos and make object-based search possible in the Photos
app. It also helps Finder to detect text and QR codes in photos.
Even if a user does not use the Photos app or have an iCloud account,
the process will still run.'

Apple is not scanning your photos for CSAM.
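
For anyone curious what that kind of purely local media analysis looks
like, here is a rough Python sketch of one of the tasks described above
(QR-code detection) using OpenCV. This is obviously not Apple's code -
the Vision internals behind mediaanalysisd are private - and the
filename is just a placeholder:

  # Analogue of one mediaanalysisd task: detect a QR code in a local
  # image. Not Apple's implementation, just the same *kind* of
  # on-device analysis. Requires OpenCV (pip install opencv-python).
  import cv2

  img = cv2.imread("photo.jpg")  # placeholder path; any local image
  detector = cv2.QRCodeDetector()
  text, points, _ = detector.detectAndDecode(img)
  if points is not None:
      print("QR code found, payload:", text or "(could not decode)")
  else:
      print("no QR code in this image")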
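
And if you don't want to take either blog's word for what
mediaanalysisd does on the network, watch it yourself. Little Snitch is
the proper tool, but even a quick Python loop around lsof (which ships
with macOS) will show any internet sockets the daemon opens while you
preview images in Finder. The one-second polling interval is arbitrary:

  # Poll for open internet sockets belonging to mediaanalysisd.
  # lsof flags: -i = internet sockets only, -n/-P = skip DNS and
  # port-name lookups, -c = match processes by command name.
  import subprocess
  import time

  while True:
      out = subprocess.run(
          ["lsof", "-i", "-n", "-P", "-c", "mediaanalysisd"],
          capture_output=True, text=True,
      ).stdout
      if out.strip():
          print(time.strftime("%H:%M:%S"), "- network activity:")
          print(out)
      time.sleep(1)  # arbitrary polling interval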
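
By contrast, actual CSAM detection of the kind Apple proposed (and
shelved) in 2021 is hash matching: derive a compact fingerprint from
each photo and compare it against a database of fingerprints of known
material. Apple's NeuralHash used a neural network plus cryptographic
private set intersection; the toy "average hash" below only illustrates
the basic fingerprint-and-compare shape. The database entry, threshold,
and filename are all made up:

  # Toy perceptual hash: downscale to 8x8 grayscale, then set one bit
  # per pixel by whether it is brighter than the mean. Real systems
  # (PhotoDNA, NeuralHash) are far more robust; this is only the shape
  # of the idea. Requires Pillow (pip install Pillow).
  from PIL import Image

  def average_hash(path, size=8):
      img = Image.open(path).convert("L").resize((size, size))
      pixels = list(img.getdata())
      mean = sum(pixels) / len(pixels)
      bits = 0
      for p in pixels:
          bits = (bits << 1) | (p > mean)
      return bits

  def hamming(a, b):
      return bin(a ^ b).count("1")

  known = {0x0123456789ABCDEF}       # placeholder, not a real database
  photo = average_hash("photo.jpg")  # placeholder path
  match = any(hamming(photo, h) <= 5 for h in known)  # made-up threshold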