Path: news.eternal-september.org!eternal-september.org!feeder3.eternal-september.org!news.swapon.de!weretis.net!feeder8.news.weretis.net!reader5.news.weretis.net!news.solani.org!.POSTED!not-for-mail
From: Mild Shock <janburse@fastmail.fm>
Newsgroups: comp.lang.prolog
Subject: No Coders completely Brain Dead (Was: Prolog totally missed the AI
 Boom)
Date: Mon, 23 Jun 2025 16:37:54 +0200
Message-ID: <103bos1$164mt$1@solani.org>
References: <vpceij$is1s$1@solani.org>
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 8bit
Injection-Date: Mon, 23 Jun 2025 14:37:53 -0000 (UTC)
Injection-Info: solani.org;
	logging-data="1250013"; mail-complaints-to="abuse@news.solani.org"
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:128.0) Gecko/20100101
 Firefox/128.0 SeaMonkey/2.53.21
Cancel-Lock: sha1:l33c1AVSrOiRgv/8FuqyXlk1hjQ=
In-Reply-To: <vpceij$is1s$1@solani.org>
X-User-ID: eJwNysEBwCAIA8CVQJNgxwGV/Uew9z5OuXZAFNhstMmKPh1ajbx5ToHhF5Gi+Yidiv23Op23lwZhUVVcn48HSmoVVg==

Concerning library(portray_text), which is in limbo:

 > Libraries are (often) written for either
 > and thus the libraries make the choice.

But who writes these libraries? The SWI-Prolog
community. And who, instead of improving these
libraries, floods the web with workaround tips?
The SWI-Prolog community.

Conclusion: the SWI-Prolog community has trapped
itself in an ancient status quo, creating an island.
It cannot improve its own tooling and is not willing
to support code from elsewhere that uses chars.
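
To make the chars/codes split concrete, a minimal
toplevel sketch (SWI-Prolog; the backquote output
assumes library(portray_text) with default settings,
and that chars lists are not portrayed, which is
exactly the limbo complained about above):

?- set_prolog_flag(double_quotes, codes).
true.

?- X = "abc".
X = [97, 98, 99].

?- set_prolog_flag(double_quotes, chars).
true.

?- X = "abc".
X = [a, b, c].

?- use_module(library(portray_text)), portray_text(true).
true.

?- set_prolog_flag(double_quotes, codes).
true.

?- X = "abc".
X = `abc`.       % the codes list gets portrayed as text

?- set_prolog_flag(double_quotes, chars).
true.

?- X = "abc".
X = [a, b, c].   % the chars list does not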

The same goes for the missed AI boom.

(*) Code from elsewhere is dangerous: people
might use Prolog systems other than SWI-Prolog,
for example Trealla Prolog and Scryer Prolog.

(**) Keeping the status quo is comfy. No need to
think in terms of program code. It's like biology
teachers versus pathology staff: biology teachers
do not see opened corpses every day.


Mild Shock wrote:
> 
> Inductive logic programming at 30
> https://arxiv.org/abs/2102.10556
> 
> The paper contains not a single reference to autoencoders!
> Still, they show this example:
> 
> Fig. 1 ILP systems struggle with structured examples that
> exhibit observational noise. All three examples clearly
> spell the word "ILP", with some alterations: 3 noisy pixels,
> shifted and elongated letters. If we were to learn a
> program that simply draws "ILP" in the middle of the picture,
> without noisy pixels and elongated letters, that would
> be a correct program.
> 
> I guess ILP is 30 years behind the AI boom. An early autoencoder
> turned into a transformer was already reported here (*):
> 
> SERIAL ORDER, Michael I. Jordan - May 1986
> https://cseweb.ucsd.edu/~gary/PAPER-SUGGESTIONS/Jordan-TR-8604-OCRed.pdf
> 
> Well, ILP might have its merits; maybe we should not ask
> for a marriage of LLMs and Prolog, but of autoencoders and ILP.
> But it's tricky, I am still trying to decode the da Vinci code of
> things like stacked tensors: are they related to k-literal clauses?
> The paper I referenced turns up in this excellent video:
> 
> The Making of ChatGPT (35 Year History)
> https://www.youtube.com/watch?v=OFS90-FX6pg
>