
Path: ...!2.eu.feeder.erje.net!feeder.erje.net!weretis.net!feeder8.news.weretis.net!reader5.news.weretis.net!news.solani.org!.POSTED!not-for-mail
From: Mild Shock <janburse@fastmail.fm>
Newsgroups: sci.math
Subject: Prolog totally missed the AI Boom
Date: Sat, 22 Feb 2025 13:06:33 +0100
Message-ID: <vpcek7$is1s$2@solani.org>
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8; format=flowed
Content-Transfer-Encoding: 7bit
Injection-Date: Sat, 22 Feb 2025 12:06:31 -0000 (UTC)
Injection-Info: solani.org;
	logging-data="618556"; mail-complaints-to="abuse@news.solani.org"
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:128.0) Gecko/20100101
 Firefox/128.0 SeaMonkey/2.53.20
Cancel-Lock: sha1:7bMAS8GYiVpW6q4q/jNVK0YpdmM=
X-Mozilla-News-Host: news://news.solani.org:119
X-User-ID: eJwFwQkBwDAIA0BL5QurnADFv4TdhUHQ6Qh4bCwohI31BoYqt7M4tXrfM8+7OpqHjiPGkuK531k2oSW++wNqxxY0
Bytes: 2141
Lines: 29


Inductive logic programming at 30
https://arxiv.org/abs/2102.10556

The paper contains not a single reference to autoencoders!
Still, they show this example:

Fig. 1 ILP systems struggle with structured examples that
exhibit observational noise. All three examples clearly
spell the word "ILP", with some alterations: 3 noisy pixels,
shifted and elongated letters. If we were to learn a
program that simply draws "ILP" in the middle of the picture,
without noisy pixels and elongated letters, that would
be a correct program.
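The point about observational noise can be made concrete with a toy
sketch (my own illustration, not the paper's figure or method): given
several noisy copies of the same binary image, a pixel-wise majority
vote across the copies recovers the clean pattern. This is the kind of
cleanup a learned denoiser such as an autoencoder could supply before
handing the examples to an ILP system.

```python
import numpy as np

# A tiny placeholder pattern standing in for the "ILP" image.
clean = np.array([
    [1, 0, 1, 1, 0],
    [0, 1, 0, 0, 1],
    [1, 1, 1, 0, 0],
], dtype=int)

# Three copies, each with one flipped pixel ("observational noise"),
# flipped at distinct positions so each pixel is wrong at most once.
examples = []
for r, c in [(0, 0), (1, 2), (2, 4)]:
    noisy = clean.copy()
    noisy[r, c] ^= 1
    examples.append(noisy)

# Majority vote: a pixel is on iff at least 2 of the 3 copies agree.
denoised = (np.sum(examples, axis=0) >= 2).astype(int)
print(np.array_equal(denoised, clean))  # True
```

A real autoencoder would learn this denoising from data rather than
vote over copies, but the division of labour is the same: strip the
noise first, then induce the program from clean examples.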

I guess ILP is 30 years behind the AI boom. An early autoencoder
turned into a transformer was already reported here:

SERIAL ORDER, Michael I. Jordan - May 1986
https://cseweb.ucsd.edu/~gary/PAPER-SUGGESTIONS/Jordan-TR-8604-OCRed.pdf

Well, ILP might have its merits; maybe we should not ask
for a marriage of LLMs and Prolog, but of autoencoders and ILP.
But it's tricky: I am still trying to decode the da Vinci code of
things like stacked tensors. Are they related to k-literal clauses?
The paper I referenced is featured in this excellent video:

The Making of ChatGPT (35 Year History)
https://www.youtube.com/watch?v=OFS90-FX6pg