zlacker

Can logic programming be liberated from predicates and backtracking? [pdf]

submitted by matt_d+(OP) on 2024-10-12 04:58:30 | 197 points 97 comments
[view article] [source] [go to bottom]

NOTE: showing posts with links only
2. xelxeb+6i2[view] [source] 2024-10-13 05:17:50
>>matt_d+(OP)
Man, lately, I feel like this stuff has been following me around. I'd really like to deep-dive into logic programming and related paradigms. Just recently came across Answer Set Programming[0] (via Potassco's clingo[1]), and it has made me realize just how ignorant I am of the design space that's being explored here.

More personally, I recently spent enough time with first Scheme and then APL that the paradigms clicked for me, and the effect on my entire outlook on work was dramatic. For whatever reason, I feel like breaking down my ingrained technical paradigms has allowed me to integrate and strengthen my soft skills.

Plus, mind-expanding experiences are just plain fun. Looking for more of that juice!

[0]:https://en.wikipedia.org/wiki/Answer_set_programming

[1]:https://potassco.org/

3. tempod+Nj2[view] [source] 2024-10-13 05:38:42
>>matt_d+(OP)
The Curry language (https://www.curry-language.org) does look interesting. Does anybody have practical experience with it?
11. amboo7+3v2[view] [source] 2024-10-13 08:23:32
>>matt_d+(OP)
https://icfp24.sigplan.org/home/minikanren-2024#event-overvi... is related. There has been work on turning slow bidirectional code into faster functional code (in 2023, reachable from this url).
15. jkbyc+mC2[view] [source] 2024-10-13 09:59:02
>>matt_d+(OP)
There's an insightful critique of the paper on Reddit: https://www.reddit.com/r/ProgrammingLanguages/comments/1g1su... ...agree that it's weird the paper doesn't mention constraint logic programming, but it's perhaps pointing at it implicitly by saying "Replacing backtracking by complete search strategies"
17. rebane+0I2[view] [source] 2024-10-13 11:09:06
>>matt_d+(OP)
Datalog does not use backtracking, and it is becoming increasingly popular.

See:

- The fastest non-incremental embedded Datalog engine: https://github.com/s-arash/ascent
- The state-of-the-art non-embedded, non-incremental Datalog engine: https://github.com/knowsys/nemo
- A Python library that contains an embedded incremental Datalog engine: https://github.com/brurucy/pydbsp
- A Rust library that provides an embedded incremental Datalog engine over property graphs: https://github.com/brurucy/materialized-view
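For context, the Datalog those engines implement is the function-free fragment of Horn clauses, evaluated bottom-up to a fixpoint, which is why no backtracking is involved. A minimal sketch (the parent/ancestor relations are generic textbook examples, not taken from any of the engines above):

```prolog
% Datalog: function-free Horn clauses. Bottom-up evaluation keeps
% deriving new facts until a fixpoint is reached, so no backtracking
% is needed and termination is guaranteed.
parent(alice, bob).
parent(bob, carol).

ancestor(X, Y) :- parent(X, Y).
ancestor(X, Y) :- parent(X, Z), ancestor(Z, Y).
```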

19. ristos+VP2[view] [source] 2024-10-13 12:37:31
>>matt_d+(OP)
There's already a pretty major effort in the Prolog community to build everything as much as possible around pure, monotonic Prolog, and to provide a means of supporting multiple search strategies depending on the best fit for the problem. CLP libraries are also pretty common and are the go-to for representing algebraic expressions relationally and declaratively.

I wouldn't say that the logic or relational way of describing effects is a bad thing either. By design it allows for multiple return values (foo/1, foo/2, ...), so you can build higher-level predicates that return multiple resources, which is pretty common in many programs. It makes concatenative (compositional) style programming really straightforward, especially for more complex interweaving, which also ends up being quite common. Many Prolog implementations also support shift/reset, so that you can easily build things like conditions and restarts, algebraic effects, and/or debugging facilities on top. Prolog is also homoiconic in a unique way compared to Lisp, and it's quite nice because the pattern matching is so powerful. Prolog really is one of the best languages I ever learned, and I wish it were more popular. I think Prolog implementations need better C FFI interop and a nicer library ecosystem. Trealla has a good C FFI.

I think logic programming is the future, and a lot of these problems with Prolog are fixable. If it's not going to be Prolog, it'll probably be something like miniKanren and Datalog within a Lisp like Scheme or Clojure(Script).

This is a great resource for getting a good feel of prolog: https://www.youtube.com/@ThePowerOfProlog/videos

◧◩
30. YeGobl+T63[view] [source] [discussion] 2024-10-13 14:50:24
>>ristos+VP2
SWI-Prolog has a robust and very mature foreign interface to C++:

https://www.swi-prolog.org/pldoc/doc_for?object=section(%27p...

◧◩◪
31. YeGobl+a73[view] [source] [discussion] 2024-10-13 14:52:16
>>bradle+Il2
>> What would be interesting, would be to replace depth-first search while remaining in the world of predicates and Horn clauses.

For that you want tabled Prolog, or in other words Prolog executed by SLG-Resolution. The paradigmatic implementation is XSB Prolog:

https://xsb.com/xsb-prolog/

SWI-Prolog also supports tabling but I think the XSB implementation is more mature.
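To make the difference concrete, here is the classic case where SLG resolution helps: left-recursive transitive closure over a cyclic graph. This is a minimal sketch assuming a tabling-capable system such as XSB or SWI-Prolog; the edge/2 facts are invented for illustration:

```prolog
:- table path/2.

% Transitive closure over a cyclic graph. Under plain SLD resolution
% with depth-first search this loops forever; under SLG (tabling) the
% query ?- path(a, X). terminates with all answers.
path(X, Y) :- edge(X, Y).
path(X, Z) :- path(X, Y), edge(Y, Z).

edge(a, b).
edge(b, c).
edge(c, a).
```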

◧◩
43. mdanie+2m3[view] [source] [discussion] 2024-10-13 16:42:56
>>tempod+Nj2
while I was trying to track down the license for it[1], I found https://git.ps.informatik.uni-kiel.de/curry/curry2go (also BSD-3) which says "A compiler and run-time system to compile and run Curry programs as Go programs"

1: maybe this is it? it does include a lot of curry named submodules; anyway, BSD-3-Clause https://git.ps.informatik.uni-kiel.de/curry/pakcs/-/blob/v3....

◧◩◪◨⬒⬓⬔
45. usgrou+xy3[view] [source] [discussion] 2024-10-13 18:18:09
>>Legion+Ce3
Countable infinity does not work like that: two countable infinities are not more than one countable infinity. I think it falls into the "not even wrong" category of statements.

The Wikipedia article is fairly useful: https://en.wikipedia.org/wiki/Countable_set

◧◩◪◨⬒⬓⬔⧯
51. Legion+4E3[view] [source] [discussion] 2024-10-13 18:56:48
>>usgrou+xy3
Yes, if you put two (or three, or countably many) countable sets together, you obtain a set that is also countable. The problem is, we want to explicitly describe a bijection between the combined set and the natural numbers, so that each element is visited at some time. Constructing such a bijection between the natural numbers and a countably-infinite tree is perfectly possible, but it's less trivial than just DFS or BFS.

If we're throwing around Wikipedia articles, I'd suggest a look at https://en.wikipedia.org/wiki/Order_type. Even if your set is countable, it's possible to iterate through its elements so that some are never reached, not after any length of time.

For instance, suppose I say, "I'm going to search through all positive odd numbers in order, then I'm going to search through all positive even numbers in order." (This has order type ω⋅2.) Then I'll never ever reach the number 2, since I'll be counting through odd numbers forever.

That's why it's important to order the elements in your search strategy so that each one is reached in a finite time. (This corresponds to having order type ω, the order type of the natural numbers.)
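In Prolog terms, a dovetailing sketch that achieves order type ω over pairs of naturals might look like this (nat/1 and pair/1 are made-up names, not built-ins):

```prolog
% nat(N): enumerate 0, 1, 2, ... on backtracking.
nat(0).
nat(N) :- nat(M), N is M + 1.

% pair(X-Y): enumerate pairs by their sum S first, so every pair is
% reached after finitely many answers (order type ω), unlike
% "all odds, then all evens" (order type ω⋅2).
pair(X-Y) :- nat(S), between(0, S, X), Y is S - X.
```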

◧◩◪◨⬒
53. YeGobl+RO3[view] [source] [discussion] 2024-10-13 20:20:14
>>upghos+Se3
>> Right now this seems to have all the downsides of programming exclusively with "magic strings", and I haven't been able to find any cure for it or even seen this problem discussed elsewhere.

The cure is to not try to program with "magic strings". You don't need to, and if you really want to, then you should try to understand what exactly it is that you're doing, and do it right.

Specifically, what you call "magic strings" are what we call "functions" in First Order Logic, and what Prolog calls compound terms, like carring_rock(robot(R)), which is in fact two compound terms, nested. Prolog lets you nest compound terms arbitrarily deep, and if you really want to hurt yourself, one great way to do it is to nest terms everywhere.

The alternative is to understand that when you use compound terms as arguments to predicates (or other terms) you are really _typing_ those arguments. Above, "carring_rock(_)" is the type of the second argument of result/3, and "robot(_)" is the type of the single argument of carring_rock/1. The catch is, of course, that Prolog is not a typed language and it doesn't care if you want to hurt yourself. So if you need to have types like that, then you should write some code to explicitly handle types and do some type checking. For instance, right off the top of my head:

  result_type(T):-
      T =.. [carring_rock,R]
      ,robot_type(R)
      ,!.
  result_type(T):-
      throw('Unknown result type':T).

  robot_type(T):-
      T =.. [robot,R]
      % ... further typing of R
      ,!.
  robot_type(T):-
      throw('Unknown robot type':T).
Note that this is not the right way to throw errors, but I can't go into that now.

A simpler thing I'd advise, though that's going into style territory, is to keep variable names as short as possible. One reason it's hard to spot the mistake in the code above is that the source code is cluttered with long variable names, and the nesting of compound terms makes it even more cluttered. An IDE with good syntax highlighting can also help. Perversely, I find that there are many people who code in Prolog either without syntax highlighting at all or in IDEs that are not aware of common results of typos, like singleton variables. The SWI-Prolog IDE is good for this and Sweep for Emacs has a good reputation also (although I haven't tried it):

https://eshelyaron.com/sweep.html

Edit: it just occurred to me that the project I'm working on currently, in my post-doc, involves quite a bit of typing using compound terms as arguments, like you do above. I've opted for a program synthesis approach where the predicates I need are automatically generated from a specification in which I define the types of each predicate's arguments. Doing the same thing by hand is probably my number two recommendation of how not to code in Prolog. Number one is "the dynamic database is evil", but does anyone listen to me? Never.

Edit 2: Covington et al's Coding Guidelines for Prolog make the same point:

  5.13 Develop your own ad hoc run-time type and mode checking system.

  Many problems during development (especially if the program is large and/or there
  are several developers involved) are caused by passing incorrect arguments. Even
  if the documentation is there to explain, for each predicate, which arguments are
  expected on entry and on successful exit, they can be, and all too often they are,
  overlooked or ignored. Moreover, when a “wrong” argument is passed, erratic be-
  havior can manifest itself far from where the mistake was made (and of course,
  following Murphy’s laws, at the most inconvenient time).

  In order to significantly mitigate such problems, do take the time to write your
  own predicates for checking the legality of arguments on entry to and on exit from
  your procedures. In the production version, the goals you added for these checks
  can be compiled away using goal_expansion/2.
https://arxiv.org/abs/0911.2899
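As a sketch of the goal_expansion/2 idea from the quote, under the assumption that a build-time flag (here called `production`, an invented name) controls it, and reusing the hypothetical result_type/1 and robot_type/1 checks from earlier:

```prolog
% Hypothetical build-time flag; assert it when building for production.
:- dynamic production/0.

% When the flag is set, rewrite type-check goals to true at compile
% time, so the checks cost nothing in the production build.
goal_expansion(result_type(_), true) :- production.
goal_expansion(robot_type(_), true) :- production.
```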

>> I could linearize the arguments and include them in the body but this destroys the indexing and can put me into "defaulty representation" territory.

Sorry, I don't know what either of those are: "linearize the arguments" and "defaulty representation".

Prolog only has first-argument indexing normally, although some implementations let you change that and use your own indexing scheme. Is that what you did?

◧◩◪◨⬒⬓
55. YeGobl+IP3[view] [source] [discussion] 2024-10-13 20:26:57
>>foobie+iP3
Then don't use Prolog. It's not mandatory.

For the record I never have problems like that and I'm sure I'm not special. Well, not in that way. This all comes under the heading of "learn what works". You have to do that with any language.

Edit: as a slightly less flippant answer (sorry) Prolog doesn't "do a lot of manual stuff that the language should do for you" because the Prolog community doesn't think the language should do those things for you and I agree. Take for instance inheritance and composition, like in object orientation. There's no reason Prolog should do that for you. If it did, it would shoehorn you into a certain programming paradigm that can feel like a straitjacket when you don't need it, but you absolutely need to use it because that's what the "language does for you". Prolog instead gives you the tools to do all the things you need, when you need them. It's more work, for sure, but it's also more freedom. I spent most of my career in the industry working with very rigidly OOP languages and that's one reason why I escaped into academia, where I can program in Prolog (see what I did there?) all day without having to declare a class property if I don't want to. And if I really wanted to, I could go all the way and write something like Logtalk:

https://logtalk.org/

Another example of something that Prolog doesn't do for you, and that you don't always need, but can always do yourself if you need it, is typing; like I point out in the sibling comment. Why should Prolog do that for you? Are we going to have the whole argument about strongly typed vs. weakly typed languages all over again? I think not. If you want types in Prolog, you can roll your own. Now try to write, say, C# code without types, when you really don't need the bother.

◧◩◪◨⬒⬓
56. YeGobl+oR3[view] [source] [discussion] 2024-10-13 20:38:59
>>agumon+yA3
I use the SWI-Prolog IDE:

https://www.swi-prolog.org/PceEmacs.html

I suppose it is a bit spartan, but it has a ton of functionality that I find indispensable [1]. For example, when I place my cursor on a variable in the editor it highlights all the variables that unify with it in the same clause, and it will highlight singleton variables in a different colour so you can catch errors caused by typos easily. It is also aware of things like imported predicates, undefined or dynamic predicates, multi-file predicates etc, and will highlight them accordingly. It has some (limited) auto-complete and code-expansion etc. and a status line that gives you very good information about the kind of term you have your cursor on - where it's defined if it's a predicate, its name and arity and how many clauses etc, basically much of the information you can get from querying the program database with various program inspection built-ins. Some of my colleagues use VS Code to write Prolog instead and I keep shaking my head and grumbling about the errors they keep making that they wouldn't if they used the SWI-Prolog IDE instead. Kids, these days. In my youth, we wrote all our Prolog on Notepad! And compiled on Dos!

(nope. never)

SWI also has a graphical debugger, which however I never use:

https://www.swi-prolog.org/pldoc/doc/_SWI_/xpce/prolog/lib/g...

I know some people swear by it but I prefer the textual debugger. Again, this one looks a little spartan :)

More goodies in SWI: cross-referencer:

https://www.swi-prolog.org/gxref.md

And profiler:

  1 ?- profile(between(1,10,_)).
  =====================================================================
  Total time: 0.000 seconds
  =====================================================================
  Predicate                       Box Entries =    Calls+Redos     Time
  =====================================================================
  between/3                                 1 =        1+0         0.0%
  true.

And lots and tons of facilities for debugging and error handling, unit testing, all sorts of libraries, a package manager etc.

________________________

[1] You can customise the IDE colours too. There's a dark theme:

https://www.swi-prolog.org/pldoc/man?section=theme

There's some screenshots here:

https://swi-prolog.discourse.group/t/questions-about-ide-the...

◧◩◪◨⬒⬓
64. upghos+8F4[view] [source] [discussion] 2024-10-14 05:29:23
>>YeGobl+RO3
First of all let me say a big thank you for this extremely thoughtful and well crafted response, I genuinely appreciate it.

On mobile and I want to dig into this some more tomorrow but let me start by addressing the last two points.

"Defaulty representation" is described here by Markus Triska (towards the bottom of the page): https://www.metalevel.at/prolog/data

The terminology of linearizing the arguments I lifted from the SICStus IDE features page: https://sicstus.sics.se/spider/index.html

Thanks again and I'm reviewing your thoughtful comments, thank you again.

◧◩◪◨⬒⬓⬔⧯▣
67. YeGobl+Oa5[view] [source] [discussion] 2024-10-14 11:19:40
>>agumon+fa4
That sounds like iterative deepening without a lower bound then. I guess that's possible. Maybe if you had a link to Markus' page I could have a look.

There are techniques to constrain the search space for _programs_ rather than proofs, which I know from Inductive Logic Programming (ILP), like Bottom Clause construction in Inverse Entailment, or the total ordering of the Herbrand Base in Meta-Interpretive Learning (MIL). It would be interesting to consider applying them to constrain the space of proofs in ordinary logic programming.

Refs for the above techniques are here but they're a bit difficult to read if you don't have a good background in ILP:

http://wp.doc.ic.ac.uk/arusso/wp-content/uploads/sites/47/20...

https://link.springer.com/content/pdf/10.1007/s10994-014-547...

◧◩◪◨⬒⬓⬔⧯
69. YeGobl+jc5[view] [source] [discussion] 2024-10-14 11:33:36
>>nextos+q04
I moved to academia, after six years of working in the industry mainly with C# and SQL. It was a deliberate attempt to find a way to work with Prolog. I guess that's a bit immature of me but I fell in love with Prolog in the second year of my CS degree and I couldn't get over it so here I am.

I did an MSc in data science first, then started a PhD to study Inductive Logic Programming (ILP), which is basically machine learning × Prolog (although there's also ASP ILP these days). I got my PhD last summer and I'm now doing a post-doc on a robotics project with Meta-Interpretive Learning (MIL), a recent form of ILP. Here's my latest publication:

https://github.com/stassa/ijclr_2024_experiments

Which is still a bit proof-of-concept. We're still at the very early stages of practical applications of MIL and so there's a lot of foundation work to do. Bliss :)

◧◩◪◨⬒⬓⬔⧯
70. YeGobl+Jc5[view] [source] [discussion] 2024-10-14 11:36:50
>>agumon+Rg4
Right, it's an emacs clone. There's also Sweep, an emacs mode for Prolog.

https://eshelyaron.com/sweep.html

When I grow up, I'll give it a try :)

◧◩◪◨⬒⬓⬔⧯
75. upghos+6p5[view] [source] [discussion] 2024-10-14 13:23:16
>>YeGobl+ze5
Apologies, it was on the tips and tricks page. I'll repost here:

  Linearize Arguments
  Ensures that each argument position is a variable that does not occur elsewhere in the clause head. As an example,
  
  foo(a, p(X1), X) :-
        body(a, X2, X2).

  would become:

  foo(A, A1, X) :-
        A = a,
        A1 = p(X1),
        body(a, X2, X2).
https://sicstus.sics.se/spider/tips.html

I'm really enjoying the Covington et al read, cool to see that O'Keefe is an author too!

◧◩◪◨⬒⬓⬔⧯▣
77. YeGobl+aC5[view] [source] [discussion] 2024-10-14 14:56:40
>>upghos+6p5
Ah, thanks. There's nothing wrong with that except that it makes code more verbose. An alternative is flattening, where you remove the compound arguments from the heads of predicates by defining them as new predicates.

Let me see if I can apply flattening to the example above:

  % Flattened version:
  foo(a, A1, X) :-
        q(A1)
        ,body(a, X2, X2).

  q(p(X1)):-
        % ... code that binds X1
        true.
With flattening only compound terms are replaced, not constants, so "a" remains unchanged. It's similar to an ad-hoc type system again, but the cool thing is that it can be done automatically. There's an algorithm for it known to the Inductive Logic Programming (ILP) community. Let me see if I can find that paper... Yep:

Flattening and Saturation: Two Representation Changes for Generalization, Céline Rouveirol, 1994.

https://link.springer.com/content/pdf/10.1023/A:102267821728...

The example is a bit unusual though because it only checks that A1 = p(X1) and doesn't do anything with X1, it even leaves it dangling as a singleton; same with X and X2. That's very unlikely for real-world code where you want to use unification as a mechanism to pass the state of the computation around your program. With flattening you want to have an extra argument in the new predicate that "returns" a value that you want to process further. I think maybe they meant to write it like this:

  foo(a, p(X1), X) :-
        body(a, X1, X).

  foo(A, A1, X) :-
        A = a,
        A1 = p(X1),
        body(a, X1, X).
In which case the flattened version would be like:

  foo(a, A1, X) :-
        q(A1,X1),
        body(a, X1, X).

  q(p(X1),X2):-
        % ... code that binds X2
        true.

So, I guess, we see that this is a known gotcha and the logic programming community has different ways to work around it. In the ILP community there's a different motivation (to make a language finite, and therefore decidable, by removing "functions").

>> I'm really enjoying the convington et al read, cool to see that O'Keefe is an author too!

Yeah. I haven't heard much from O'Keefe lately and I miss his advice. He was a regular contributor to the SWI-Prolog mailing list when I was doing my MSc in 2014. I think there was a bit of a problem with the SWI-Prolog community moving to Google and he hasn't returned since, even though the community is now on Discourse.

◧◩◪◨⬒⬓⬔⧯▣▦
84. jodrel+cd7[view] [source] [discussion] 2024-10-15 02:13:47
>>YeGobl+Oa5
> "Maybe if you had a link to Markus' page I could have a look."

e.g. here: https://www.metalevel.at/tist/ solving the Water Jugs problem (search on the page for "We use iterative deepening to find a shortest solution") finding a list of moves emptying and filling jugs, and using `length(Ms, _)` to find shorter list of moves first.

or here: https://www.metalevel.at/prolog/puzzles under "Wolf and Goat" he writes "You can use Prolog's built-in search strategy to search for a sequence of admissible state transitions that let you reach the desired target state. Use iterative deepening to find a shortest solution. In Prolog, you can easily obtain iterative deepening via length/2, which creates lists of increasing length on backtracking."
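The length/2 trick from those pages can be sketched generically like this; initial/1, final/1 and step/3 are placeholders for whatever puzzle is being solved, not real predicates:

```prolog
% Iterative deepening via length/2: length(Ms, _) yields lists of
% length 0, 1, 2, ... on backtracking, so shorter move sequences are
% tried first and a shortest solution is found first.
solution(Ms) :-
    initial(S0),
    length(Ms, _),
    moves(S0, Ms).

% moves(S, Ms): Ms is an admissible move sequence from state S to a
% final state. step(S0, M, S) applies one move.
moves(S, []) :- final(S).
moves(S0, [M|Ms]) :- step(S0, M, S), moves(S, Ms).
```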

◧◩◪◨⬒⬓⬔
86. YeGobl+7w8[view] [source] [discussion] 2024-10-15 14:57:01
>>upghos+a06
I'm glad it was of help.

>> the dang thing about Prolog is the more you look into it, the more powerful you realize it is, but I'll be damned if it's not impossible to find these things out without someone telling you about them. Most languages seem to be possible to self-teach, but if I were to self-teach Prolog I never would've learned about the concept of monotonicity and I'd be using cut operators willy-nilly etc, lucky that I stumbled on Markus's blog/videos and lucky that I bumped into you!

I hear you. When I first learned Prolog, in the second year of my CS degree course, the first thing I did was to go to my university's library and gather up all the textbooks I could find about it. I literally came home bent over from the sheer weight of books. It still took me a very long time to learn the ins and outs of it. It doesn't help that Prolog has a small and probably shrinking community and many of the people with long backgrounds and deep knowledge of its practice are starting to retire from public fora; like O'Keefe.

There are only a few resources on the internet that can help you understand Prolog. Markus Triska's as you found out is one of the best (although I have a very different view of many things than he does, there's no doubt he is an expert in the language). When I was learning I got a lot of mileage out of a tutorial on the Amzi Prolog website:

https://amzi.com/AdventureInProlog/index.php

The best thing you can do of course is to keep working with it as much as you can. Things eventually make sense and you learn what works and what doesn't with experience. That's not always easy to do of course. As I say in another comment I had to leave (my lucrative career in) the industry and enter academia (which pays peanuts) to find a place where I could really work as much as I wanted with Prolog.

>> I am actually really shocked to hear that the advice is to roll your own type system, and it's really interesting. I wonder if that is also the case for other expressive languages such as Forth/PostScript, etc. Most of the languages I work with more often (Python/JS/TypeScript/Clojure/C# ... but PARTICULARLY Python) tend to view runtime assertions and type-checking as an anti-pattern.

I don't want to sound even more like a zealot than I already do, but the thing is you can do all that in Prolog more easily than in other languages. In general the way I program in Prolog is that I create a language to describe my problem, and then solve my problem in that language. I don't mean that I necessarily design a new programming language and write a parser for it, although you can do that very easily in Prolog with Definite Clause Grammars (you get a decent parser and generator that way though not optimal). I mean that I make up a language to talk about the domain, and use it to describe my problems and their solutions. You do that too, in your solution/3 predicate above, where as far as I can tell you describe resources and actions of a robot etc. Again Prolog lets you do that easily: you can define the abstractions you need. That goes all the way up to the abstractions used in other programming paradigms, like functions or objects etc, and types.
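A tiny DCG sketch of what "making up a language to talk about the domain" can look like; the toy robot-command grammar here is invented purely for illustration:

```prolog
% A toy command language for a robot domain, written as a DCG.
command(pick(Obj)) --> [pick, up], object(Obj).
command(move(Dir)) --> [move], direction(Dir).

object(rock)     --> [the, rock].
direction(left)  --> [left].
direction(right) --> [right].

% ?- phrase(command(C), [pick, up, the, rock]).
% C = pick(rock).
```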

>> I am particularly grateful because most of the books as you know were written a long time ago and tend to focus more on the beauty of the language and tend to gloss over the software engineering or composing large reliable problems.

Yeah, I tend to ignore those things. I love Prolog but I recognise it's not to everyone's taste. It's quite obvious. It's nice to hear you love it too :)

◧◩◪◨⬒⬓⬔⧯▣▦▧
90. crypto+iO8[view] [source] [discussion] 2024-10-15 16:41:10
>>YeGobl+Yy8
I hope you enjoy it! I sure did. If you enjoy it, you might also enjoy jq (https://github.com/jqlang/jq), which is also a sort of logic programming language (in that it has pervasive generators and backtracking). Icon has an Algol family syntax, while jq is more... Haskell-ish? in a way if you squint hard? Whereas Prolog is a pile of rules. These differences are very interesting. Verse is another language in this vein, and it seems very interesting as well.

Oh, and yesterday's HN thread about Rama is very interesting as well: >>41833629

◧◩
92. klausn+1Cf[view] [source] [discussion] 2024-10-18 05:23:32
>>ristos+VP2
Out of my depth here, but I do have 5 Prolog books I can't seem to let go of. I used to dabble. A lot. Out of all my numerous prog-lang infatuations, only 3 have stuck: Erlang, Prolog and Picolisp. At most, 2 degrees of separation between those 3. No one mentioned Picolisp yet, so I have to, because it has a Prolog embedded in it. This seems like an appropriate place to do it.

https://software-lab.de/doc/ref.html#pilog

◧◩◪◨⬒⬓
93. thesz+N0l[view] [source] [discussion] 2024-10-20 18:14:17
>>YeGobl+RO3

  > Develop your own ad hoc run-time type and mode checking system.
"Any sufficiently complicated Lisp or Prolog program contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of Haskell's type inference and checking system."

Modelled after Greenspun's Tenth Rule [1].

[1] https://en.wikipedia.org/wiki/Greenspun%27s_tenth_rule

◧◩◪◨⬒⬓
94. thesz+uol[view] [source] [discussion] 2024-10-20 21:52:55
>>YeGobl+7d5
Have you tried Haskell?

For example, it is possible to embed backtracking logic programming [1] into Haskell with not a big effort.

[1] https://hackage.haskell.org/package/logict

[go to top]