

Path: ...!eternal-september.org!feeder3.eternal-september.org!news.eternal-september.org!eternal-september.org!.POSTED!not-for-mail
From: Paul <nospam@needed.invalid>
Newsgroups: alt.comp.os.windows-11,comp.os.linux.advocacy
Subject: Re: Windows-on-ARM Laptop Is A “Frequently-Returned Item” On Amazon
Date: Mon, 24 Mar 2025 11:15:30 -0400
Organization: A noiseless patient Spider
Lines: 32
Message-ID: <vrrsuk$15shc$1@dont-email.me>
References: <vrnbks$rkfk$1@dont-email.me> <vrnqf9$18esr$1@dont-email.me>
 <WYRDP.1210933$_N6e.547203@fx17.iad>
 <a9rvtjtpp62h8hihlc3b9mmlbbf03nm885@4ax.com> <vrpmac$315f4$1@dont-email.me>
 <vrq9e1$3klvh$1@dont-email.me> <m4cc06FbfhrU5@mid.individual.net>
MIME-Version: 1.0
Content-Type: text/plain; charset=utf-8
Content-Transfer-Encoding: 7bit
Injection-Date: Mon, 24 Mar 2025 16:15:37 +0100 (CET)
Injection-Info: dont-email.me; posting-host="f8f384255b80efb1876a8c5422bc608b";
	logging-data="1241644"; mail-complaints-to="abuse@eternal-september.org";	posting-account="U2FsdGVkX1/rasjdP/KILUf/BMAURGgMWBUrRPtCURE="
User-Agent: Ratcatcher/2.0.0.25 (Windows/20130802)
Cancel-Lock: sha1:S3n02EgkxjBIasCkmPBIkoBpIog=
Content-Language: en-US
In-Reply-To: <m4cc06FbfhrU5@mid.individual.net>
Bytes: 2458

On Mon, 3/24/2025 2:21 AM, rbowman wrote:
> On Mon, 24 Mar 2025 00:36:17 -0000 (UTC), Chris wrote:
> 
>> Paul <nospam@needed.invalid> wrote:
>>> On Sun, 3/23/2025 7:17 AM, Joel wrote:
>>>>
>>>> It's clear why Microsoft would use x86 emulation with ARM, countless
>>>> reasons, but who cares about their Copilot bullshit, put Linux for ARM
>>>> on that mother fucker.
>>>
>>> Some day, you'll be able to run an AI locally.
>>
>> You can. Have a look at Ollama. Totally local and open source. Works
>> well too!
> 
> Training and inference are two different things. Other than toy datasets I 
> doubt much training will happen locally. 

Realistically, I think it's going to be quite a while,
if ever, before we can put together a decent box for inference.

In this gold rush, all the excess profit is in "mules and shovels".
A mule I was looking at today is priced at around $8,500.

That kind of pricing is hardly encouraging.

It would be cheaper to build a wooden box, put a
midget inside the box, and have him answer
questions. Can anyone give me a price on a
PhD-class midget?
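(For anyone who wants to try the Ollama route Chris mentioned, the
local workflow is roughly the following -- a sketch assuming a Linux
box with the default install and the "llama3" model, which needs a
multi-gigabyte download; I haven't benchmarked it here.)

```shell
# Install the ollama runtime (Linux installer from ollama.com;
# inspect the script before piping it to a shell)
curl -fsSL https://ollama.com/install.sh | sh

# Download a model's weights to the local machine (several GB)
ollama pull llama3

# Chat with it interactively -- inference runs entirely locally,
# no network access needed after the pull completes
ollama run llama3
```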

   Paul