You’re better off with an asynchronous result stream, which is equivalent in power but much easier to reason about. C# has IAsyncEnumerable, and I know Rust is designing something similar. Even then, it can be hard to analyse the behaviour of multiple levels of asynchronous streams, and passing a piece of information, such as a tag, from the top level down to the bottom is a pain in the neck.
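For concreteness, here is a minimal TypeScript sketch of such a stream using an async generator (the rough analogue of IAsyncEnumerable); the function name and URL are made up for illustration:

```typescript
// A minimal sketch of an asynchronous result stream, the rough
// analogue of C#'s IAsyncEnumerable<T>. fetchPages and the URL
// are hypothetical.
async function* fetchPages(urls: string[]): AsyncGenerator<string> {
  for (const url of urls) {
    const res = await fetch(url); // one async operation per element
    yield await res.text();       // emit each result as it arrives
  }
}

// The consumer reads the stream with ordinary sequential syntax.
async function main() {
  for await (const page of fetchPages(["https://example.com"])) {
    console.log(page.length);
  }
}

main();
```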
Generally, I prefer the coroutine/generator style: it is more explicit and more straightforward syntactically. More importantly, it decouples operation execution from chaining. A function that emits multiple values, synchronously or asynchronously, shouldn't be responsible for directly running the next function in the pipeline. It's better when the user of the interface has direct control over which function is run over which values, and when, particularly for parallelizing pipelines.
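A minimal TypeScript sketch of that decoupling (the names are hypothetical): the producer only yields values, and the caller decides what runs over them and when:

```typescript
// The producer knows nothing about the next stage of the pipeline.
function* numbers(limit: number): Generator<number> {
  for (let i = 0; i < limit; i++) yield i;
}

// Chaining lives outside the producer, in a combinator the caller applies.
function* map<T, U>(src: Iterable<T>, f: (x: T) => U): Generator<U> {
  for (const x of src) yield f(x);
}

// The caller wires the pipeline together and drives it.
for (const square of map(numbers(5), (n) => n * n)) {
  console.log(square); // 0 1 4 9 16
}
```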
I do understand that Rama builds such a syntax on top of CPS, and a compiler that implements generators has a similar execution model (perhaps as an explicit state machine rather than by leveraging the call stack to do the same thing implicitly).
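To illustrate, here is roughly what lowering a two-value generator into an explicit state machine can look like; this is an illustrative sketch, not any particular compiler's actual output:

```typescript
// Hand-written equivalent of `function* () { yield 1; yield 2; }`:
// the "paused" position lives in an explicit state variable instead
// of an implicitly suspended stack frame.
function makeGen(): Iterator<number> {
  let state = 0;
  return {
    next(): IteratorResult<number> {
      switch (state) {
        case 0: state = 1; return { value: 1, done: false };
        case 1: state = 2; return { value: 2, done: false };
        default: return { value: undefined, done: true };
      }
    },
  };
}

const g = makeGen();
console.log(g.next().value, g.next().value, g.next().done); // 1 2 true
```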
Bear with me, but raising kids taught me a lot about this kind of thing.
Even at two or three years old, I could say things to my children that relied on them understanding sequence, selection, and iteration - the fundamentals of imperative programming. This early grasp of these basic concepts is why you can teach simple imperative programming to children in grade school.
This puts the more advanced techniques (CPS, FP, etc.) at a disadvantage. By the time a programmer graduates college and enters the workforce, they've had a lifetime of understanding and working with sequencing and the like, and comparatively little exposure to the more advanced techniques.
This is not to say it's impossible to learn and become skillful with these techniques, just that it happens later in life, arrives more slowly, and for many, mastery never arrives at all.
In my experience, when you ask people to describe the "basic" operations they use for, e.g., multi-digit addition or multiplication, you get many different answers, and it is not obvious that any one is better than another. I don't see why languages would be any different, and any attempt to prove something here would have a high bar to clear.
Promises are a mechanism devised to separate the composition machinery from the function itself, much as shell pipes separate control flow from the program being called.
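A small TypeScript sketch of that separation (parse/validate/save are hypothetical stand-ins): each function just produces a value, and the composition lives entirely in the .then chain, like | in a shell pipeline:

```typescript
// Each stage only maps an input to a promised output; none of them
// knows what runs next. parse/validate/save are hypothetical.
const parse = (raw: string) => Promise.resolve(JSON.parse(raw));
const validate = (obj: { id?: number }) =>
  obj.id ? Promise.resolve(obj) : Promise.reject(new Error("no id"));
const save = (obj: { id?: number }) => Promise.resolve(`saved ${obj.id}`);

// Composition is expressed separately, by whoever assembles the pipeline.
parse('{"id": 42}')
  .then(validate)
  .then(save)
  .then(console.log, console.error); // "saved 42"
```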
In this article, they implement a pipe-like mechanism that avoids having to write "traditional" CPS; that is why they say the continuation is implicit. That said, the mechanism goes further and looks very much like Haskell's do-notation, which lets programmers use functional languages in an imperative style without knowing much about the underlying implementation.
Unless you're Gleam, in which case it feels natural and looks pleasant.
I was talking about do-notation as a way to sugar CPS-style monadic operations into a flat, imperative syntax. This is exactly what Rama is doing.
If you look at a tutorial on what Haskell's do-notation desugars into, you’ll find the same CPS machinery described in this article.
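The same shape can be shown in TypeScript as an analogy: async/await plays the role of do-notation, and the .then chain underneath is the explicit continuation passing (the role >>= plays in Haskell). getUser/getOrders are made-up stand-ins:

```typescript
// Hypothetical async building blocks.
const getUser = (id: number) => Promise.resolve({ name: `user${id}` });
const getOrders = (name: string) => Promise.resolve([`${name}-order-1`]);

// "do-notation" style: flat and imperative-looking.
async function report(id: number): Promise<number> {
  const user = await getUser(id);
  const orders = await getOrders(user.name);
  return orders.length;
}

// What it effectively desugars into: each step hands an explicit
// continuation to the next, i.e. CPS threaded through .then
// (Haskell's >>= plays the same role).
function reportDesugared(id: number): Promise<number> {
  return getUser(id).then((user) =>
    getOrders(user.name).then((orders) => orders.length)
  );
}
```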
I'm not arguing that one language is _better_ than another... just that people are exposed to some programming concepts sooner than others. That gives these ideas an incumbency advantage that can be hard to overcome.
> any attempt to prove something would have a high bar to pass.
Honestly, the best way to (dis)prove what I'm saying would be to put together a counterexample and get these ideas into broader use. That would put FP in the hands of more people who could really use it.
This alludes to my biggest frustration with FP: it solves real problems and should be more widely used. But by the time people are exposed to it, they've been doing imperative programming since grade school, and it's harder for FP to build critical mass in that setting.
At least, this is my theory of the case. I'd love counterexamples or suggestions to make the situation better.
Now, having generators is nothing new, but I don't want to take too much away from TFA, as there are some interesting things there. I'll limit myself to pointing out that the Icon programming language had generators and pervasive backtracking implemented with CPS in the Icon-to-C compiler, and that other languages with generators and pervasive backtracking have been implemented both with CPS and with bytecode VMs that have no explicit continuations internally. Examples include Prolog, Icon, and jq, to name just three, and now, of course, Rama.