From: tomyee3@gmail.com (ProkaryoticCaspaseHomolog)
Newsgroups: sci.physics.relativity
Subject: Re: XhatGPT; How many physicists accept GR?
Date: Sat, 16 Nov 2024 07:44:17 +0000
Organization: novaBBS

On Sat, 16 Nov 2024 4:54:33 +0000, Sylvia Else wrote:

> On 16-Nov-24 9:52 am, rhertz wrote:
>> ChatGPT entered in crisis here, after I asked HOW MANY (worldwide).
>>
>
> You realise that it's just a language model based on trawling the
> Internet?
>
> It's not intelligent. It doesn't know anything. It cannot reason. It
> just composes sentences based on word probabilities derived from the
> trawling.
>
> And guess what? The Internet contains a lot of garbage; garbage that's
> been fed into the language model.

...and an increasingly large proportion of the garbage being fed into
large language models is garbage GENERATED by large language models.

The "Mad Cow Disease" crisis of the 1980s is believed to have been
caused by the practice of feeding cattle meal that contained cattle and
sheep by-products. As LLM output becomes increasingly difficult to
distinguish from human output (which is often bad enough!), I predict
an outbreak of "Mad LLM Disease".
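
To make the feedback loop concrete, here is a toy numpy sketch (my own
illustration, nothing rigorous): fit a trivially simple "model" (just a
mean and standard deviation) to some data, sample the next generation's
training data entirely from that fit, and repeat. Over successive
generations the fitted distribution tends to drift and narrow, since
estimation errors compound and the original human data is never seen
again.

    import numpy as np

    rng = np.random.default_rng(0)

    # Generation 0: "human-written" data from the true distribution.
    data = rng.normal(loc=0.0, scale=1.0, size=200)

    for gen in range(1, 11):
        # "Train" an extremely simple model: estimate mean and std dev.
        mu, sigma = data.mean(), data.std()
        print(f"generation {gen:2d}: mean = {mu:+.3f}, std = {sigma:.3f}")
        # The next generation is trained only on output sampled from
        # this model, never on the original human data.
        data = rng.normal(loc=mu, scale=sigma, size=200)

Real LLMs are vastly more complicated than a pair of summary statistics,
of course, but the same error-compounding mechanism is why I expect
"Mad LLM Disease".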