From: Mild Shock <janburse@fastmail.fm>
Newsgroups: comp.lang.prolog
Subject: Missed the AI Boom because missed the Emojis (Was: Prolog totally missed the AI Boom)
Date: Sun, 29 Jun 2025 13:32:25 +0200
Message-ID: <103r887$1fl6q$1@solani.org>
References: <vpceij$is1s$1@solani.org>
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 8bit
In-Reply-To: <vpceij$is1s$1@solani.org>
> Those that use a large part pay a pretty
> high price in terms of memory and currently
> also time for code points > 0xffff
Emojis are typically above 0xffff. And from this
announcement it seems Emojis are a big part of
keeping up with the AI Boom:
> :rocket: Call for Papers: Integrating Logical
> Reasoning & Large Language Models (LLMs) :brain:
>
> https://swi-prolog.discourse.group/t/9065
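For instance the rocket emoji sits at U+1F680, well
above 0xffff. A quick check, shown here only as an
illustration (the same result in SWI-Prolog and
Scryer Prolog):

?- char_code('🚀', X).
X = 128640.

128640 is exactly 0x1F680, so any code point
representation of Emojis needs more than 16 bits.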
But it would cost you nothing to support this here in library(portray_text):
/* SWI-Prolog 9.3.24 */
?- X = [a,b,c].
X = `abc`
It is extremely trivial to implement; it's not really
rocket science. It doesn't need much brains, and
it also works for Emojis:
/* Scryer Prolog 0.9.4-411 */
?- X = [a,b,c].
X = "abc".
?- X = ['🚀', a, '🧠', b, c].
X = "🚀a🧠bc".
In Scryer Prolog it shows double quotes and not
back quotes, because of the different default settings
of the Prolog flags double_quotes and back_quotes.
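The defaults are easy to check. The answers below are
what I get on SWI-Prolog 9.x and Scryer Prolog 0.9.x,
so take them as illustrative:

/* SWI-Prolog */
?- current_prolog_flag(double_quotes, D),
   current_prolog_flag(back_quotes, B).
D = string, B = codes.

/* Scryer Prolog */
?- current_prolog_flag(double_quotes, D).
D = chars.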
Mild Shock wrote:
>
> Inductive logic programming at 30
> https://arxiv.org/abs/2102.10556
>
> The paper contains not a single reference to autoencoders!
> Still they show this example:
>
> Fig. 1 ILP systems struggle with structured examples that
> exhibit observational noise. All three examples clearly
> spell the word "ILP", with some alterations: 3 noisy pixels,
> shifted and elongated letters. If we would be to learn a
> program that simply draws "ILP" in the middle of the picture,
> without noisy pixels and elongated letters, that would
> be a correct program.
>
> I guess ILP is 30 years behind the AI boom. An early autoencoder
> turned into transformer was already reported here (*):
>
> SERIAL ORDER, Michael I. Jordan - May 1986
> https://cseweb.ucsd.edu/~gary/PAPER-SUGGESTIONS/Jordan-TR-8604-OCRed.pdf
>
> Well ILP might have its merits, maybe we should not ask
> for a marriage of LLM and Prolog, but Autoencoders and ILP.
> But it's tricky, I am still trying to decode the da Vinci code of
>
> things like stacked tensors, are they related to k-literal clauses?
> The paper I referenced is found in this excellent video:
>
> The Making of ChatGPT (35 Year History)
> https://www.youtube.com/watch?v=OFS90-FX6pg
>