Path: news.eternal-september.org!eternal-september.org!feeder3.eternal-september.org!fu-berlin.de!uni-berlin.de!individual.net!not-for-mail
From: rbowman <bowman@montana.com>
Newsgroups: alt.comp.os.windows-11,comp.os.linux.advocacy
Subject: Re: Windows-on-ARM Laptop Is A “Frequently-Returned Item” On Amazon
Date: 26 Mar 2025 03:46:00 GMT
Lines: 36
Message-ID: <m4hbjnF5fa3U3@mid.individual.net>
References: <vrnbks$rkfk$1@dont-email.me> <vrnqf9$18esr$1@dont-email.me>
	<WYRDP.1210933$_N6e.547203@fx17.iad>
	<a9rvtjtpp62h8hihlc3b9mmlbbf03nm885@4ax.com> <vrpmac$315f4$1@dont-email.me>
	<vrq9e1$3klvh$1@dont-email.me> <m4cc06FbfhrU5@mid.individual.net>
	<vrrsuk$15shc$1@dont-email.me> <vrs5t9$1dqfo$1@dont-email.me>
	<m4eeltFktmlU8@mid.individual.net> <vrtpb7$2u0fo$1@dont-email.me>
Mime-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
X-Trace: individual.net yu+t6H9z/QzuKPvju/lQ7wCm8I0sHg1KDJpRDl4zxwJ9NPCOnf
Cancel-Lock: sha1:ptewtT8yKCAhLNrM/ykkvThTuCU= sha256:NsaKDBMwXq7olZ0huFugz8EDnethCN5Pnncm9qEi9fI=
User-Agent: Pan/0.160 (Toresk; )

On Tue, 25 Mar 2025 08:26:15 -0000 (UTC), Chris wrote:


> Yes, CUDA is the dominant interface, but not the only game in town.
> There are other NPUs that can give nVidia a run for its money.

I've seen talk of opening up the CUDA API but I expect to be snowshoeing 
in hell first. OpenCL isn't ready for prime time yet.

> Sure, but there's a whole spectrum of needs for deep learning methods
> that are far more modest and still very useful.

That's where my interests lie: edge ML applications, not the whole hyped-up 
LLM deal. 

> Machine learning has been around since the 1960s and has had real world
> uses for a lot of that time.

That's a rather fluid term, and if you count Hebb, it goes back to the '40s. 
I found the concepts interesting in the '60s in the context of neurophysiology 
and revisited them in the '80s when Rumelhart and McClelland's book came out and 
back propagation was introduced. The concepts were there but the computing 
power wasn't.

Neural networks were over-promised and became a career killer, and expert 
systems became the stars. That didn't work out as planned, so the field moved 
on to fuzzy logic and so forth. Then neural networks were reborn, but people 
didn't want to call them that.


> I created my first model in 2006/7 with no need for a GPU.

So did I, with very small datasets like MNIST. No need for one, unless you 
wanted to measure the epochs with something other than a wall clock.
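For the record, the kind of no-GPU training being talked about here really is 
trivial on a CPU. A minimal sketch, in plain NumPy with hand-rolled 
back-propagation on a toy XOR dataset (standing in for MNIST-scale work; the 
network size, learning rate, and epoch count are all illustrative choices, 
not anything from the thread):

```python
# Tiny MLP trained by back-propagation on XOR, CPU-only NumPy.
# Same idea as small MNIST models, just with four rows instead of 60,000.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 8 sigmoid units; weights drawn at random.
W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for epoch in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass for squared-error loss: chain rule through
    # the sigmoids, then gradient-descent weight updates.
    d_out = (out - y) * out * (1.0 - out)
    d_h = (d_out @ W2.T) * h * (1.0 - h)
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

# Trained predictions for the four XOR rows.
print(np.round(out.ravel(), 2))
```

A wall clock is indeed the wrong instrument here: the whole run finishes in 
well under a second on any modern CPU.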