
Path: ...!Xl.tags.giganews.com!local-3.nntp.ord.giganews.com!nntp.earthlink.com!news.earthlink.com.POSTED!not-for-mail
NNTP-Posting-Date: Sat, 21 Dec 2024 08:00:49 +0000
Subject: Re: Remember "Bit-Slice" Chips ?
Newsgroups: comp.os.linux.misc
References: <o4ucnYo2YLqmZ876nZ2dnZfqn_adnZ2d@earthlink.com>
 <vjh5j0$3btea$13@dont-email.me>
 <6650dffd-d5e8-0d72-0a01-ba62815a6667@example.net>
 <vjk1ea$nd7$5@dont-email.me>
 <057f7ff0-2fda-0431-2ef8-e860a4772b69@example.net>
 <vjkknu$435q$2@dont-email.me>
 <3a439b82-71cc-6aff-65dd-630c0707ff3f@example.net>
 <ls6sfjFt2anU5@mid.individual.net>
 <0d5d463f-af08-46aa-97e3-ef251ba64cc4@example.net>
 <ls8m7fF7o5dU2@mid.individual.net>
 <a0ee6a97-3650-f78f-c9cc-fa4bac543655@example.net>
 <ls9mprFcabuU5@mid.individual.net>
 <451210c3-9b3d-91f1-be43-d06211f3b30f@example.net>
 <lsbhaoFn55gU1@mid.individual.net>
 <812b41ff-53e1-48d3-8088-d186fa65d90a@example.net>
 <lse8dpF5ikfU7@mid.individual.net>
 <fea6ae4f-5fe5-8120-2586-88e4b1d570be@example.net>
 <UQKdnUKJir8UNf76nZ2dnZfqnPednZ2d@earthlink.com>
 <b5592fc5-3197-31cf-ac59-8a44e7db1ea3@example.net>
 <SLicnU51cs7fkvj6nZ2dnZfqnPSdnZ2d@earthlink.com>
 <lsln9nFbe1iU1@mid.individual.net>
 <eb9e5eb7-e1df-4b40-e7fe-56a36e7fe27f@example.net>
From: "186282@ud0s4.net" <186283@ud0s4.net>
Organization: wokiesux
Date: Sat, 21 Dec 2024 03:00:48 -0500
User-Agent: Mozilla/5.0 (X11; Linux x86_64; rv:78.0) Gecko/20100101
 Thunderbird/78.13.0
MIME-Version: 1.0
In-Reply-To: <eb9e5eb7-e1df-4b40-e7fe-56a36e7fe27f@example.net>
Content-Type: text/plain; charset=utf-8; format=flowed
Content-Language: en-US
Content-Transfer-Encoding: 8bit
Message-ID: <18CdneEMb7os6Pv6nZ2dnZfqn_idnZ2d@earthlink.com>
Lines: 72
X-Usenet-Provider: http://www.giganews.com
NNTP-Posting-Host: 99.101.150.97
X-Trace: sv3-Ob6/89/Q6bUZuxnJU4PHgrVTzh4W1fsda/Hlc+iOahueGzsBEOvU4dhRFxooHgilE4TYBKTPcs0C5hd!iiCGQ8mEs3RDNBswHAQJhaOXarY8iLET7gWnETjP5Y2T6SpJegnRlg+JubAjjAOX6BjS6iHl+Mkc!AS+ZvOujs/CBSDiactZe
X-Abuse-and-DMCA-Info: Please be sure to forward a copy of ALL headers
X-Abuse-and-DMCA-Info: Otherwise we will be unable to process your complaint properly
X-Postfilter: 1.3.40
Bytes: 5271

On 12/20/24 4:21 PM, D wrote:
> 
> 
> On Fri, 20 Dec 2024, rbowman wrote:
> 
>> On Fri, 20 Dec 2024 01:31:30 -0500, 186282@ud0s4.net wrote:
>>
>>>    It's the 'hand-wave' thing that sank the first AI paradigm.
>>>    Marvin Minsky (who posted on Usenet for a while) and friends saw how
>>>    easily 'decisions' could be made with a transistor or two and assumed
>>>    it would thus be easy to build an AI. A. C. Clarke drew on the Minsky
>>>    optimism when fashioning the idea of "HAL".
>>
>> Minsky threw a wrench in the works with his 1969 'Perceptrons'. He had
>> tried to implement B. F. Skinner's operant conditioning with an analog
>> lashup that sort of worked if the vacuum tubes didn't burn out. Rosenblatt
>> had built a 'Perceptron', and Minsky pointed out that the original design
>> couldn't handle an XOR. That sent research down another rabbit hole.
>>
>> By the '80s the original perceptron had evolved into a multilayer network
>> trained by backpropagation. When I played around with it, 'Parallel
>> Distributed Processing' by Rumelhart and McClelland was THE book.
>>
>> https://direct.mit.edu/books/monograph/4424/Parallel-Distributed-Processing-Volume
>>
>> The ideas were fascinating but the computing power wasn't there. Most of
>> what I learned then is still relevant to TensorFlow and the other neural
>> network approaches except now there are the $30,000 Nvidia GPUs to do the
>> heavy lifting.
>>
>> The '80s neural networks weren't practical, so the focus shifted to expert
>> systems until they petered out. The boom-and-bust cycles led to the term
>> 'AI winter'.
>>
>> https://www.techtarget.com/searchenterpriseai/definition/AI-winter
>>
>> I think something worthwhile will come from this cycle but ultimately it
>> won't be the LLMs that are getting all the hype.
> 
> I wonder whether Facebook open-sourcing their LLM threw quite a wrench
> into the OpenAI machinery this time.
> 
> OpenAI's AI is stagnating, and I think the open-source models may well
> become good enough that OpenAI won't be able to recoup the massive
> amounts of money that have been invested in it.
> 
> Then another AI winter, and after that, our dear LLMs might be ready for
> prime time!
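
   Minsky's XOR point is easy to demonstrate. Here's a quick
   sketch (my own toy code, not from 'Perceptrons' or the PDP
   book) of Rosenblatt's learning rule on a single linear
   threshold unit -- it learns AND fine, but no set of weights
   can ever classify all four XOR points:

```python
# Single-layer perceptron (linear threshold unit) trained with
# Rosenblatt's learning rule. Epoch count and learning rate are
# arbitrary illustration choices.

def train_perceptron(samples, epochs=100, lr=0.1):
    """samples: list of ((x1, x2), target) pairs with 0/1 targets."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), t in samples:
            y = 1 if (w1 * x1 + w2 * x2 + b) > 0 else 0
            err = t - y  # update only on mistakes
            w1 += lr * err * x1
            w2 += lr * err * x2
            b  += lr * err
    # fraction of the four points classified correctly
    correct = sum(
        (1 if (w1 * x1 + w2 * x2 + b) > 0 else 0) == t
        for (x1, x2), t in samples
    )
    return correct / len(samples)

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

print(train_perceptron(AND))  # 1.0 -- linearly separable, converges
print(train_perceptron(XOR))  # < 1.0 -- XOR is not linearly separable
```

   AND converges because the classes can be split by one line;
   XOR can't be, so the rule cycles forever without ever getting
   all four points right.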
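
   And the '80s fix -- a hidden layer trained by backpropagation --
   fits in a page of pure Python. A toy 2-2-1 sigmoid network on
   XOR (sizes, seed, learning rate, and epoch count are all just
   illustration choices, not anything from Rumelhart and McClelland):

```python
import math
import random

random.seed(1)  # fixed seed so the run is repeatable

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# 2 inputs -> 2 hidden -> 1 output; each row is [w_x1, w_x2, bias]
wh = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
wo = [random.uniform(-1, 1) for _ in range(3)]

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def forward(x1, x2):
    h = [sigmoid(w[0] * x1 + w[1] * x2 + w[2]) for w in wh]
    o = sigmoid(wo[0] * h[0] + wo[1] * h[1] + wo[2])
    return h, o

def epoch_loss():
    return sum((t - forward(x1, x2)[1]) ** 2 for (x1, x2), t in XOR)

loss_before = epoch_loss()
lr = 0.5
for _ in range(20000):
    for (x1, x2), t in XOR:
        h, o = forward(x1, x2)
        # output delta, then propagate it back through the hidden layer
        do = (o - t) * o * (1 - o)
        dh = [do * wo[i] * h[i] * (1 - h[i]) for i in range(2)]
        for i in range(2):
            wo[i] -= lr * do * h[i]
            wh[i][0] -= lr * dh[i] * x1
            wh[i][1] -= lr * dh[i] * x2
            wh[i][2] -= lr * dh[i]
        wo[2] -= lr * do

loss_after = epoch_loss()
print(loss_before, loss_after)  # squared error drops sharply
```

   The hidden units carve the plane into two lines instead of
   one, which is exactly what the lone perceptron couldn't do.
   Same math TensorFlow does today, minus the $30,000 GPUs.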


   I think what LLMs do is a PART of 'intelligence/self',
   just not ALL of it. OTHER methods/layers maybe CAN
   be spliced in to fill the weak bits.

   Brains are an evolutionary hodgepodge - 'whatever was
   needed/worked'. 600+ million years of field testing.

   Kinda amazed they work at all. There's also a weird,
   almost 'holographic', nature to them - some of those
   kids blasted by hydrocephalus, with little grey
   matter left, still managed average or even a bit
   above average IQs. It's the same with 'cerebral
   palsy' cases. They STILL produce a 'person' in
   there. The System WANTS to work.

   Anyway, LLMs contaminated by some NN action and
   e-motions and maybe a few other odd bits ... and
   don't forget to include those hard-wired evolutionary
   sub-routines.

   TIGER !!!