From: Alan
Newsgroups: misc.phone.mobile.iphone
Subject: Re: Apple accused of underreporting suspected CSAM on its platforms
Date: Mon, 29 Jul 2024 13:40:14 -0700

On 2024-07-29 13:38, Chris wrote:
> On 29/07/2024 21:04, badgolferman wrote:
>> Chris wrote:
>>> Jolly Roger wrote:
>>>> On 2024-07-29, Chris wrote:
>>>>
>>>> Actually, a human being does review it with Google's system:
>>>
>>> I was unclear. I'm not saying a human doesn't review; I'm saying that
>>> given the dozens or hundreds of suspected abuse images they review a
>>> day, they won't have the ability to make informed decisions.
>>>
>>>> ---
>>>> A human content moderator for Google would have reviewed the photos
>>>> after they were flagged by the artificial intelligence to confirm
>>>> they met the federal definition of child sexual abuse material.
>>
>> What kind of a person would want this job?
>
> I read an article a couple of years ago on the Facebook content
> moderators. Many ended up traumatised and got no support. God, it was
> a grim read.

I think I recall that as well.