Path: ...!feeds.phibee-telecom.net!weretis.net!feeder8.news.weretis.net!reader5.news.weretis.net!news.solani.org!.POSTED!not-for-mail
From: Jan Panteltje <alien@comet.invalid>
Newsgroups: sci.electronics.design
Subject: no more matrix multiplication needed in LLMs?
Date: Wed, 26 Jun 2024 05:16:32 GMT
Message-ID: <v5g87h$25m9e$1@solani.org>
MIME-Version: 1.0
Content-Type: text/plain; charset=ISO-8859-15
Content-Transfer-Encoding: 8bit
Injection-Date: Wed, 26 Jun 2024 05:16:33 -0000 (UTC)
Injection-Info: solani.org;
	logging-data="2283822"; mail-complaints-to="abuse@news.solani.org"
User-Agent: NewsFleX-1.5.7.5 (Linux-5.15.32-v7l+)
Cancel-Lock: sha1:BDwKlPGaEqJ1GLdNQQ6WlwY0aH4=
X-User-ID: eJwNysEBADEEBMCWBIuUg9B/CXfzHogda1eDKRa7JbW3F50Jl0rMtHjQS/6H3CfL0GGqpiCfnOjDU+KkbPUBfx0WEQ==
X-Newsreader-location: NewsFleX-1.5.7.5 (c) 'LIGHTSPEED' off line news reader for the Linux platform
 NewsFleX homepage: http://www.panteltje.nl/panteltje/newsflex/ and ftp download ftp://sunsite.unc.edu/pub/linux/system/news/readers/ 
Bytes: 1434
Lines: 6

Researchers upend AI status quo by eliminating matrix multiplication in LLMs
Running AI models without matrix math means far less power consumption—and fewer GPUs?
 https://arstechnica.com/information-technology/2024/06/researchers-upend-ai-status-quo-by-eliminating-matrix-multiplication-in-llms/
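
As far as I can tell (the Ars piece points at the "Scalable MatMul-free
Language Modeling" paper), the trick is to restrict weights to the ternary
set {-1, 0, +1}, so every multiply in a matrix-vector product collapses to
an add, a subtract, or a skip. Rough sketch of the idea in Python/numpy --
names and shapes are mine, not the paper's:

import numpy as np

# Toy illustration (my own, not from the paper): with weights limited to
# {-1, 0, +1}, a matrix-vector product needs no multiplications at all --
# each output element is just a sum of added and subtracted inputs.

rng = np.random.default_rng(0)
x = rng.standard_normal(8).astype(np.float32)         # activations
W = rng.integers(-1, 2, size=(4, 8)).astype(np.int8)  # ternary weights

# Conventional matrix-vector product (what GPUs are built to do fast).
y_matmul = W.astype(np.float32) @ x

# Multiplication-free version: add where w == +1, subtract where w == -1.
y_addonly = np.array([x[row == 1].sum() - x[row == -1].sum() for row in W],
                     dtype=np.float32)

print(np.allclose(y_matmul, y_addonly))  # True

The hard part is training ternary weights without losing accuracy; the
arithmetic itself is trivially cheap, which is presumably where the power
saving comes from.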

So, low power and cheaper AI?
Bye bye Nvidia?