Path: ...!eternal-september.org!feeder3.eternal-september.org!news.eternal-september.org!.POSTED!not-for-mail
From: -hh <recscuba_google@huntzinger.com>
Newsgroups: comp.os.linux.advocacy
Subject: Re: GNU/Linux is the Empowerment of the PC. So What Do You Do?
Date: Wed, 10 Apr 2024 12:12:57 -0400
Organization: A noiseless patient Spider
Lines: 42
Message-ID: <uv6dq9$122g1$1@dont-email.me>
References: <17c37c35720e7ddc$20295$3326957$802601b3@news.usenetexpress.com>
 <uurc4n$1vjvo$1@dont-email.me> <uusdii$2848h$1@dont-email.me>
 <rge51j1uctfqpeg318tbnb0oevs9ck3j6u@4ax.com>
 <17c40c6de0c879aa$405$1351842$802601b3@news.usenetexpress.com>
 <uuun8r$9j84$1@solani.org>
 <17c417d563439fed$44$3121036$802601b3@news.usenetexpress.com>
 <uuvhaq$921c$1@solani.org>
 <17c43fa6cd7d5d2d$59197$3716115$802601b3@news.usenetexpress.com>
 <uv15ob$au8o$1@solani.org>
 <17c46a7f9e9e2dcb$3880$111488$802601b3@news.usenetexpress.com>
 <l7jf72F9s43U2@mid.individual.net> <uv3clk$7ecf$2@dont-email.me>
 <l7l8bsFi2mtU3@mid.individual.net> <uv4bpk$fa07$3@dont-email.me>
 <l7m76dFmp2rU1@mid.individual.net>
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 7bit
Injection-Date: Wed, 10 Apr 2024 16:12:57 -0000 (UTC)
Injection-Info: dont-email.me; posting-host="48368c3df05f2ee6d2a9a90de7fb47e2";
	logging-data="1116673"; mail-complaints-to="abuse@eternal-september.org";	posting-account="U2FsdGVkX18oR7ou7tY+kntftaUWWnqYKB3CUrIC8bc="
User-Agent: Mozilla Thunderbird
Cancel-Lock: sha1:3ho2aQK40NxRsGfDZh2+3UUECvc=
Content-Language: en-US
In-Reply-To: <l7m76dFmp2rU1@mid.individual.net>
Bytes: 3796

On 4/9/24 9:09 PM, rbowman wrote:
> On Tue, 9 Apr 2024 17:26:12 -0400, Chris Ahlstrom wrote:
> 
>> I've seen some interesting books about neural logic circuits, but circa
>> 1980.
> 
> Rumelhart and McClelland's 'Parallel Distributed Processing' was the text
> used in a seminar I took that summed up the state of the art. The
> perceptron's roots went back to McCulloch and Pitts in 1943. It was a start but
> had problems some of which were pointed out by Minsky. Hopfield came up
> with the network named after him. Rumelhart, Hinton, and Williams threw in
> backpropagation based on gradient descent in an '86 paper.
> 
> Anyway, read the current literature and you'll see a lot of what was
> discussed in the '80s. The problem was hardware. Nvidia is making big
> bucks producing $40,000+ GPUs that can handle all the matrix operations
> involved. That sort of power wasn't available in the '80s outside of a few
> supercomputers and even they might have been breathing heavy.
> 
> The other problem was the inevitable hype that oversold NNs.
> 
> https://www.skynettoday.com/overviews/neural-net-history
> 
> It may be tl;dr but if you're interested that covers the history very
> well. It's telling that 'neural network' became a little toxic amid the
> stumbles over the years and became 'machine learning'. Having watched the
> technology waxing and waning for close to 40 years I don't know if the
> current boom will become a bust.

My personal opinion on it is that it's currently over-hyped.

I've had some ML projects, and my general conclusion is that it's 
essentially statistically-based rules. It's a good-enough solution most 
of the time, but that depends on the nature of the task, and it can be 
quite quickly limited by its training set...especially if there's a risk 
of over-training.  The ramification is that it's a good short/mid-term 
fit for some activities, but really complex stuff (side eye at Tesla) is 
probably still a decade away, which also means that in the shortsighted 
vision of the Stock Market, they're going to lose interest in 2-3 years.
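The over-training point can be sketched in a few lines of Python (my own 
illustrative toy, not anything from a real project): fit the same small, 
noisy training set with a sensible model and an over-flexible one, and 
watch the flexible one ace the training data while falling apart on 
held-out points.

```python
# Toy illustration of over-training: a degree-1 fit matches the real
# rule (y = 2x), while a degree-9 polynomial memorizes the noise in
# 10 training points and then does worse on points it never saw.
import numpy as np

rng = np.random.default_rng(0)

# Small, noisy training set drawn from y = 2x
x_train = np.linspace(0.0, 1.0, 10)
y_train = 2.0 * x_train + rng.normal(0.0, 0.1, size=10)

# Held-out points from the same underlying rule, interleaved with training x's
x_test = np.linspace(0.05, 0.95, 10)
y_test = 2.0 * x_test

def mse(coeffs, x, y):
    """Mean squared error of a polynomial fit on data (x, y)."""
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

simple = np.polyfit(x_train, y_train, 1)   # roughly recovers the real rule
overfit = np.polyfit(x_train, y_train, 9)  # interpolates the noisy points exactly

print("degree 1: train %.6f  test %.6f" %
      (mse(simple, x_train, y_train), mse(simple, x_test, y_test)))
print("degree 9: train %.6f  test %.6f" %
      (mse(overfit, x_train, y_train), mse(overfit, x_test, y_test)))
```

The degree-9 fit drives training error to essentially zero, which looks 
great right up until you score it on the held-out points.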

-hh