Path: news.eternal-september.org!eternal-september.org!feeder3.eternal-september.org!border-3.nntp.ord.giganews.com!border-1.nntp.ord.giganews.com!nntp.giganews.com!news-out.netnews.com!postmaster.netnews.com!eu1.netnews.com!not-for-mail
X-Trace: DXC=^[kGQbCjD@Q@KB@9c:8AO_HWonT5<]0T]djI?Uho:Xe[Leo?9o4lXdQT1Ll`o^7GW][el]69E]KbZH6QbELj<9E_T6;GA05OJGSOXeKkS2`?jY
X-Complaints-To: support@blocknews.net
Newsgroups: misc.news.internet.discuss
From: Retrograde <fungus@amongus.com.invalid>
Subject: Re: Human data for AI exhausted
References: <67833ee2$10$17$882e4bbb@reader.netnews.com>
 <vm1n0h$1fnhf$1@dont-email.me> <87frllyrtl.fsf@sonera.fi>
 <87wmext2it.fsf@enoch.nodomain.nowhere>
Reply-To: fungus@amongus.com.invalid
X-Face: B,ckSl,FpK$Tw&Gx_oee5Tcj|RCK=sbQ=a&cJ9)e*A|.f}uctF}Rohq&$BI&OBVck/zSV
 DV s<~Tu)q"Z]^2KikYTfy^bh'9MsB'ObTszVRGI_#zXVB\_B4BE~|Ad
User-Agent: slrn/1.0.3 (Linux)
Date: 15 Jan 2025 02:27:03 GMT
Lines: 31
Message-ID: <67871cf6$9$16$882e4bbb@reader.netnews.com>
NNTP-Posting-Host: 127.0.0.1
X-Trace: 1736908023 reader.netnews.com 16 127.0.0.1:51705

On 2025-01-14, Mike Spencer <mds@bogus.nodomain.nowhere> wrote:
> In the same metaphorical way that a corporation, if seen or treated as
> a person, is legally mandated to be a psychopath, current AIs --
> "generative large language models" in the jargon of the trade -- are
> designed to construct apparently knowledgeable assertions from
> detecting patterns in a vast corpus of text and present it with
> confidence. Of course corporations don't have neurally generated
> personalities to suffer from "antisocial personality disorder" [1].
> Nor do GLLMs have a body of knowledge, expertise or wisdom from which
> their assertions emerge.  Neither do they have an internal *belief*
> that they *do* have a superior "body of knowledge, expertise or
> wisdom" that defines the Dunning-Kruger effect.  But their excellent
> grammar and extensive vocabulary readily influence the credulous to
> infer that nonexistent "knowledge, expertise or wisdom". [2]

Very, very well said, thank you.

I've been thinking recently that the term 'artificial INTELLIGENCE' is a
marketing trick, much like the trick of convincing the market that CLOUD
services are anything other than renting someone else's server.

There's no intelligence to them, but using (co-opting? employing?) the
term implies the presence of something that is not there.

The ruse may not have been willful, but it has been effective.

If they had called this new technology "advanced pattern matching
repetition", we would not be throwing gobs of money at it.

I look forward to the whole industry cratering, and the likes of Sam
Altman being run off with burning torches and pitchforks.