Path: eternal-september.org!news.eternal-september.org!feeder3.eternal-september.org!weretis.net!feeder8.news.weretis.net!reader5.news.weretis.net!news.solani.org!.POSTED!not-for-mail
From: Mild Shock <janburse@fastmail.fm>
Newsgroups: sci.logic
Subject: Microsoft is plagiarizing my Invention [LLMs under the hood]
Date: Tue, 8 Oct 2024 16:04:18 +0200
Message-ID: <ve3e50$5j97$3@solani.org>
References: <v67685$6fr5$1@solani.org> <vdlqr2$12s58$1@solani.org>
 <vdlsa4$12sv7$1@solani.org>
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 7bit
Injection-Date: Tue, 8 Oct 2024 14:04:16 -0000 (UTC)
Injection-Info: solani.org;
	logging-data="183591"; mail-complaints-to="abuse@news.solani.org"
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:91.0) Gecko/20100101
 Firefox/91.0 SeaMonkey/2.53.19
Cancel-Lock: sha1:RcN9e9hUNt6sr6ZqElR4OUzC8rQ=
In-Reply-To: <vdlsa4$12sv7$1@solani.org>
X-User-ID: eJwFwQkBwDAIA0BLpXxBDmHFv4TduYbEpIWH+fp290On41iZipxe8VEsb+vHaUhd55HUeKzLRHLAnSco4w9VzhW4

I will probably never get a Turing Award or something
for what I did 23 years ago. Why is its read
count on ResearchGate suddenly going up?

Knowledge, Planning and Language,
November 2001

I guess it is because of this: the same topic is tackled
by Microsoft's recent GRIN model. Shit. I really should
find some investors and pump up a start-up!

"Mixture-of-Experts (MoE) models scale more
effectively than dense models due to sparse
computation through expert routing, selectively
activating only a small subset of expert modules."
https://arxiv.org/pdf/2409.12136
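
To make the quoted routing idea concrete, here is a minimal
NumPy sketch (my own toy code, not GRIN's implementation; all
names and shapes are made up): a gate scores every expert, but
only the top-k experts actually run for a given token, which
is where the sparse compute comes from.

    import numpy as np

    def moe_layer(x, gate_w, expert_ws, k=2):
        # Gating scores, one per expert.
        logits = x @ gate_w
        # Keep only the k best-scoring experts (sparse activation).
        topk = np.argsort(logits)[-k:]
        w = np.exp(logits[topk] - logits[topk].max())
        w /= w.sum()  # softmax over the selected experts only
        # Only the chosen experts compute; the rest stay idle.
        return sum(wi * np.tanh(x @ expert_ws[i])
                   for wi, i in zip(w, topk))

    # Toy usage: 8 experts, hidden size 16, 2 active per token.
    rng = np.random.default_rng(0)
    d, n = 16, 8
    x = rng.normal(size=d)
    gate_w = rng.normal(size=(d, n))
    expert_ws = [rng.normal(size=(d, d)) for _ in range(n)]
    y = moe_layer(x, gate_w, expert_ws)

Dense models would pay for all n experts on every token; here
only k of them do any work, so capacity scales without the
matching compute cost.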

But somehow I am happy with my dolce vita as
it is now... Or maybe I am deceiving myself?

P.S.: From the GRIN paper, here you can see how the
expert domain modules relate to each other:

Figure 6 (b): MoE Routing distribution similarity
across MMLU 57 tasks for the control recipe.
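
(If you wonder what "routing distribution similarity" means
here, a hypothetical sketch, with made-up data: record the
fraction of tokens each task routes to each expert, then
compare tasks pairwise, e.g. by cosine similarity. Figure 6(b)
plots such a task-by-task matrix.)

    import numpy as np

    # Hypothetical data: row t = routing distribution of task t,
    # i.e. the fraction of its tokens sent to each of 16 experts.
    rng = np.random.default_rng(1)
    routing = rng.dirichlet(np.ones(16), size=57)   # 57 MMLU tasks
    unit = routing / np.linalg.norm(routing, axis=1, keepdims=True)
    sim = unit @ unit.T   # 57 x 57 cosine-similarity matrix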