zlacker

[parent] [thread] 20 comments
1. moomin+(OP)[view] [source] 2024-10-14 10:17:59
I feel like CPS is one of those tar pits smart developers fall into. It’s just a fundamentally unfriendly API, like mutexes. We saw this with node as well: eventually the language designers just sighed and added promises.

You’re better off with an asynchronous result stream, which is equivalent in power but much easier to reason about. C#’s got IAsyncEnumerable, and I know that Rust is working on designing something similar. Even then, it can be hard to analyse the behaviour of multiple levels of asynchronous streams, and passing pieces of information from the top level to the bottom level, like a tag, is a pain in the neck.
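Roughly the contrast I mean, as a minimal TypeScript sketch (readLines and its contents are made up for illustration):

    // CPS-style producer: the caller hands over control and only gets it
    // back inside callbacks.
    function readLinesCps(
      onLine: (line: string) => void,
      onDone: (err?: Error) => void,
    ): void {
      onLine("first");
      onLine("second");
      onDone();
    }

    // The same thing as an async result stream: the consumer stays in
    // control and the code reads top to bottom.
    async function* readLines(): AsyncGenerator<string> {
      yield "first";
      yield "second";
    }

    async function main() {
      for await (const line of readLines()) {
        console.log(line);
      }
    }

    main();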

replies(8): >>greene+s >>oerste+j3 >>mschae+S3 >>packet+Oo >>andrew+Os >>neonsu+JJ >>crypto+il2 >>crypto+Kl2
2. greene+s[view] [source] 2024-10-14 10:23:39
>>moomin+(OP)
CPS might be the underlying structure, but that doesn't mean that CPS is the interface.
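For example (a rough TypeScript sketch; readFileCb stands in for any callback-based primitive): the continuation passing stays in the plumbing, and callers only ever see a promise:

    // Callback-based primitive: CPS is the underlying structure.
    function readFileCb(
      path: string,
      cb: (err: Error | null, data?: string) => void,
    ): void {
      setTimeout(() => cb(null, `contents of ${path}`), 0); // fake async read
    }

    // Promise wrapper: the interface callers program against has no
    // visible continuations.
    function readFile(path: string): Promise<string> {
      return new Promise((resolve, reject) => {
        readFileCb(path, (err, data) => (err ? reject(err) : resolve(data!)));
      });
    }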
3. oerste+j3[view] [source] 2024-10-14 10:48:48
>>moomin+(OP)
I agree. I'm sure that CPS has much more robust theoretical roots and that it's more general and powerful, but in practice it doesn't often look much different from classic callback-hell.

Generally, I prefer the coroutine/generator style: it is more explicit and straightforward syntax-wise. More importantly, it decouples operation execution from chaining. A function that emits multiple values, whether sync or async, shouldn't be responsible for running the next function in the pipeline directly. It's better when the user of the interface has direct control over what function is run over which values and when, particularly for parallelizing pipelines.

I do understand that Rama builds such a syntax on top of CPS, and a compiler that implements generators has a similar execution model (perhaps an explicit state-machine rather than leveraging the function stack to do the same thing implicitly).
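A rough TypeScript sketch of the decoupling I mean (the pipeline is invented): the producer only yields; the caller decides which function runs over which values, and when:

    // Producer: emits values but never calls the next stage itself.
    async function* numbers(limit: number): AsyncGenerator<number> {
      for (let i = 0; i < limit; i++) {
        yield i;
      }
    }

    // The consumer owns the chaining: it can run the next stage on each
    // value as it arrives, or collect a batch and parallelize it.
    async function main() {
      const batch: number[] = [];
      for await (const n of numbers(5)) batch.push(n);
      const results = await Promise.all(batch.map(async (n) => n * n));
      console.log(results);
    }

    main();

    // CPS version for contrast: the producer drives the pipeline directly.
    function numbersCps(limit: number, next: (n: number) => void): void {
      for (let i = 0; i < limit; i++) next(i);
    }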

replies(2): >>pyrale+wg >>gpdere+9L2
4. mschae+S3[view] [source] 2024-10-14 10:54:55
>>moomin+(OP)
> I feel like CPS is one of those tar pits smart developers fall into. ... eventually the language designers just sighed and added promises.

Bear with me, but raising kids taught me a lot about this kind of thing.

Even at two or three years old, I could say things to my children that relied on them understanding sequence, selection, and iteration - the fundamentals of imperative programming. This early grasp of these basic concepts is why you can teach simple imperative programming to children in grade school.

This puts the more advanced techniques (CPS, FP, etc.) at a disadvantage. A programmer graduating college and entering the workforce has had a lifetime of understanding and working with sequencing, etc., and comparatively very little exposure to the more advanced techniques.

This is not to say it's not possible to learn and become skillful with these techniques, just that it's later in life, slower to arrive, and for many, mastery doesn't get there at all.

replies(2): >>pyrale+y9 >>moomin+Fe
5. pyrale+y9[view] [source] [discussion] 2024-10-14 11:52:23
>>mschae+S3
I feel like these explanations based on cognitive development always end up with unprovable assertions which inevitably support their author's views. The same arguments exist about natural language, and they're always (unconvincingly) used to rationalize why language A is better than language B.

In my experience, when you ask people to tell you what "basic" operations they do for e.g. multi-digit number additions or multiplications, you get many different answers, and it is not obvious that one is better than another. I don't see why it would be different for languages, and any attempt to prove something would have a high bar to pass.

replies(1): >>mschae+NZ
6. moomin+Fe[view] [source] [discussion] 2024-10-14 12:41:00
>>mschae+S3
I take your point about mastery. Especially with FP, where it's very clear that mastery is extremely powerful. On the other hand, there are some techniques, like our regular synchronization primitives, where not even mastery will save you. Even experienced developers will make mistakes and find them harder to deal with than other higher-level abstractions. Where CPS fits on this curve, I don't know. I feel pretty confident about where FP and mutexes sit. But I have yet to see something where I feel I'd rather use CPS than an async result stream.
replies(1): >>mschae+801
7. pyrale+wg[view] [source] [discussion] 2024-10-14 12:55:54
>>oerste+j3
That's because CPS is the callback hell.

Promises were devised to separate the composition mechanism from the function itself, much like shell pipes exist to separate the control flow from the called function.

In this article, they implement a pipe-like mechanism that avoids having to do "traditional" CPS. That is why they say the continuation is implicit. That said, the mechanism goes further than that, and looks very much like Haskell's do-notation, which enables programmers to use functional languages in an imperative style without knowing too much about the underlying implementation.
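Something like this, in TypeScript terms (an analogy with made-up steps, not Rama's actual API): the steps just compute, and a separate pipe-like combinator does the chaining:

    // Each step is an ordinary async function; it knows nothing about
    // what runs after it.
    const fetchUser = async (id: number) => ({ id, name: "Ada" });
    const toGreeting = async (user: { name: string }) => `hello, ${user.name}`;

    // The composition lives in a separate, pipe-like combinator;
    // .then is the continuation hook, hidden from the steps themselves.
    const pipeAsync =
      <A, B, C>(f: (a: A) => Promise<B>, g: (b: B) => Promise<C>) =>
      (a: A): Promise<C> =>
        f(a).then(g);

    const greet = pipeAsync(fetchUser, toGreeting);
    greet(1).then(console.log); // "hello, Ada"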

replies(1): >>nathan+Hv
8. packet+Oo[view] [source] 2024-10-14 13:57:51
>>moomin+(OP)
CPS is one of the most obnoxious ways to write code.

Unless you're Gleam, in which case it feels natural and looks pleasant.

9. andrew+Os[view] [source] 2024-10-14 14:26:19
>>moomin+(OP)
Yeah, CPS is best used as an implementation technique, which is how it's used here. I think they even use it to build a stream-like API for their "operations".
10. nathan+Hv[view] [source] [discussion] 2024-10-14 14:44:46
>>pyrale+wg
The Cont monad in Haskell is only for single continuation targets and can't do branching/unification like Rama. That kind of behavior doesn't seem like it would express naturally or efficiently with just "do".
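To sketch what I mean in plain TypeScript (just the general shape, not Rama's actual semantics): a CPS-style emitter can invoke its continuation any number of times, while a promise settles exactly once:

    // CPS: the continuation may run zero, one, or many times ("branching").
    function eachWord(line: string, k: (word: string) => void): void {
      for (const w of line.split(/\s+/)) k(w);
    }

    // A promise resolves exactly once, so it only models the single-shot case.
    function firstWord(line: string): Promise<string> {
      return Promise.resolve(line.split(/\s+/)[0]);
    }

    eachWord("to be or not", console.log); // four continuation calls
    firstWord("to be or not").then(console.log); // one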
replies(2): >>pyrale+qZ >>tome+0f1
11. neonsu+JJ[view] [source] 2024-10-14 16:10:55
>>moomin+(OP)
I found myself liking the F# way of consuming/producing IAsyncEnumerable's with taskseq. It's very terse and looks nice: https://github.com/fsprojects/FSharp.Control.TaskSeq?tab=rea...
replies(1): >>moomin+IO
12. moomin+IO[view] [source] [discussion] 2024-10-14 16:40:32
>>neonsu+JJ
Apparently Microsoft hold a patent on “yield!”, which makes it all the more frustrating that they haven’t included it in C#.
replies(1): >>neonsu+qV
13. neonsu+qV[view] [source] [discussion] 2024-10-14 17:18:25
>>moomin+IO
In F#, yield! is a computation expression construct; C#'s yield within methods that return IAsyncEnumerable<T> works more or less the same way.
14. pyrale+qZ[view] [source] [discussion] 2024-10-14 17:43:12
>>nathan+Hv
Yes, Rama probably isn’t semantically comparable to one single monad.

I was talking about do-notation as a way to sugar CPS monadic operations into a flat, imperative syntax. This is exactly what Rama is doing.

If you look at a tutorial on what Haskell do-notation desugars into, you’ll find the same CPS stuff described in this article.
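If it helps, the same shape in TypeScript terms (an analogy only, not Haskell's actual desugaring rules): flat async/await is sugar over an explicit continuation chain:

    // Flat, imperative-looking syntax (the do-notation analogue)...
    async function total(): Promise<number> {
      const a = await Promise.resolve(2);
      const b = await Promise.resolve(3);
      return a + b;
    }

    // ...stands for explicit continuation passing via .then
    // (roughly m >>= \a -> ... in Haskell terms).
    function totalDesugared(): Promise<number> {
      return Promise.resolve(2).then((a) =>
        Promise.resolve(3).then((b) => a + b),
      );
    }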

15. mschae+NZ[view] [source] [discussion] 2024-10-14 17:44:59
>>pyrale+y9
> I feel like these explanations based on cognitive development...they're always (unconvincingly) used to rationalize why language A is better than language B.

I'm not arguing that one language is _better_ than another... just that people are exposed to some programming concepts sooner than others. That gives these ideas an incumbency advantage that can be hard to overcome.

> any attempt to prove something would have a high bar to pass.

Honestly, the best way to (dis)prove what I'm saying would be to put together a counterexample and get the ideas in broader use. That would get FP in the hands of more people that could really use it.

16. mschae+801[view] [source] [discussion] 2024-10-14 17:47:02
>>moomin+Fe
> Especially FP, where it's very clear that mastery of it is extremely powerful. On the other hand, there are some like our regular synchronization primitives where not even mastery will save you.

This alludes to my biggest frustration with FP... it solves problems and should be more widely used. But by the time people are exposed to it, they've been doing imperative programming since grade school. It's harder for FP to develop critical mass in that setting.

At least, this is my theory of the case. I'd love counterexamples or suggestions to make the situation better.

17. tome+0f1[view] [source] [discussion] 2024-10-14 19:12:10
>>nathan+Hv
Could you say more about what branching/unification is in this context, and how Rama supports/uses it?
replies(1): >>nathan+Jf1
18. nathan+Jf1[view] [source] [discussion] 2024-10-14 19:17:15
>>tome+0f1
Those are explained in the post starting here: https://blog.redplanetlabs.com/2024/10/10/rama-on-clojures-t...
19. crypto+il2[view] [source] 2024-10-15 04:53:32
>>moomin+(OP)
IMO TFA makes much of CPS because that's how they chose to implement generators, but the main thing about the language is that it has generators.

Now having generators is nothing new, but I don't want to take too much away from TFA, as there are some interesting things there. I'll limit myself to pointing out that the Icon programming language had generators and pervasive backtracking using CPS in the Icon-to-C compiler, and that other languages with generators and pervasive backtracking have been implemented with CPS as well as with bytecode VMs that don't have any explicit continuations internally. Examples include Prolog, Icon, and jq, to name just three, and now of course, Rama.
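For anyone curious, here's roughly what that CPS-with-backtracking style looks like, as a tiny TypeScript sketch (the search is invented; the success/failure-continuation shape is the point):

    // Success continuation gets a value plus a "resume" thunk for
    // backtracking; the failure continuation asks for the next alternative.
    type Fail = () => void;
    type Succeed<T> = (value: T, resume: Fail) => void;

    // A generator in CPS form: offer each element in turn, then fail.
    function choose<T>(xs: T[], sk: Succeed<T>, fk: Fail): void {
      const go = (i: number): void =>
        i < xs.length ? sk(xs[i], () => go(i + 1)) : fk();
      go(0);
    }

    // Backtrack through both choices to find every pair summing to 7.
    choose(
      [1, 3, 5],
      (a, moreA) =>
        choose(
          [2, 4, 6],
          (b, moreB) => {
            if (a + b === 7) console.log(a, b); // 1 6, 3 4, 5 2
            moreB(); // keep backtracking for further solutions
          },
          moreA, // inner choices exhausted: backtrack the outer choice
        ),
      () => console.log("done"),
    );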

20. crypto+Kl2[view] [source] 2024-10-15 04:59:40
>>moomin+(OP)
I've actually used hand-coded CPS in an evented program for C10K. Hand-coding CPS is a real pain, but CPS is usually used as an implementation detail, and as such it's not an API. In some cases you can get at an implicit continuation to then use it explicitly (call/cc comes to mind), and then it's an API, sure, but typically one does not have to use it.
21. gpdere+9L2[view] [source] [discussion] 2024-10-15 09:25:41
>>oerste+j3
As far as I know, CPS is supposed to be an intermediate form for compilers and a theoretical framework for understanding control flow and concurrency constructs. The second part is, I think, a very good reason why most programmers should learn about CPS; but I don't think CPS was ever meant to be something you routinely program in, in the same way you don't routinely program in assembler.