From: badgolferman <REMOVETHISbadgolferman@gmail.com>
Newsgroups: misc.phone.mobile.iphone
Subject: Re: Apple accused of underreporting suspected CSAM on its platforms
Date: Mon, 29 Jul 2024 20:04:13 -0000 (UTC)
Message-ID: <v88sjt$hbk9$1@solani.org>
User-Agent: NewsTap/5.5 (iPhone/iPod Touch)

Chris <ithinkiam@gmail.com> wrote:
> Jolly Roger <jollyroger@pobox.com> wrote:
>> On 2024-07-29, Chris <ithinkiam@gmail.com> wrote:
>>
>> Actually, a human being does review it with Google's system:
>
> I was unclear. I'm not saying a human doesn't review; I'm saying that,
> given the dozens or hundreds of suspected abuse images they review each
> day, they won't have the ability to make informed decisions.
>
>> ---
>> A human content moderator for Google would have reviewed the photos
>> after they were flagged by the artificial intelligence to confirm they
>> met the federal definition of child sexual abuse material.

What kind of a person would want this job?