zlacker

[return to "Notation as a Tool of Thought"]
1. azhenl+m 2020-11-30 00:41:48
>>mafaa+(OP)
This is Kenneth Iverson's 1979 Turing Award lecture.
2. blulul+Fa 2020-11-30 02:36:59
>>azhenl+m
Yes. There are some deep insights in this exposition. The irony, in my opinion, is that APL is the worst of the array/matrix-based programming languages. In fairness it was also the first, but compared to Matlab or Julia it is less expressive and feels much harder to use.
3. patrec+PA 2020-11-30 08:35:16
>>blulul+Fa
If you think APL is less expressive than Matlab, you probably haven't really grasped it, IMO. Having said that, Matlab is optimized for manipulating matrices and replicating the notation of ordinary linear algebra, and it has excellent implementations of basically any numerical algorithm that frequently comes up in that domain. So writing something like chol(X'*X+diag(eig(X))) in an APL will look uglier, quite possibly run slower and less accurately, and, depending on the numerical routines you need, require extra implementation work on your part.

But that overhead is a constant factor: more or less anything you can express well in Matlab can be expressed straightforwardly in APL, too, if you have the right numerical routines. That's not true in the other direction, though: there's a lot of stuff in APL you cannot express adequately in Matlab at all. For example, J (and these days Dyalog as well, IIRC) has an operation called "under" which basically does this: u(f, g) = x => g^-1(f(g(x))). So you can write geometric_mean = u(mean, log).
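A minimal Python sketch of this combinator (the names `under`, `log_each`, and `g_inv` are mine, and unlike J, the inverse must be passed in explicitly since Python keeps no table of function inverses):

```python
import math

def under(f, g, g_inv):
    # u = g_inv . f . g  -- apply g, then f, then undo g.
    # J derives g's inverse automatically; here the caller supplies it.
    return lambda x: g_inv(f(g(x)))

mean = lambda xs: sum(xs) / len(xs)
log_each = lambda xs: [math.log(v) for v in xs]

# geometric mean = exp(mean(log(xs)))
geometric_mean = under(mean, log_each, math.exp)
# geometric_mean([1, 2, 4]) → 2.0
```

The same pattern covers any "transform, operate, transform back" computation, e.g. arithmetic in a log domain or working under a Fourier transform.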

It is completely impossible to implement something like "under" in Matlab. Admittedly, the J implementation, at least, of deriving a generalized inverse for an arbitrary function f is a somewhat ill-defined hack, but it is still something that is both conceptually and practically quite powerful. Also, whilst Matlab is really clunky for anything that is not a 2D array and hardcodes matrix multiplication as the one inner product, APL has more powerful abstractions for manipulating arrays of arbitrary rank and a more general concept of inner product.

Also, APL has some really dumb but cherished-by-the-community ideas that make the language less expressive and much more awkward to learn, e.g. taking the terrible defect of normal mathematical notation where "-" is overloaded for both negation and subtraction, and replicating it across every other function.

4. henrik+AP 2020-11-30 11:34:52
>>patrec+PA
> It is completely impossible to implement something like "under" in matlab.

I’m a little curious about this. Does J have a notion of the relationship between certain functions and their inverse? What is it that enables “under” in J which makes it impossible in Matlab?

5. kliber+GV 2020-11-30 12:38:36
>>henrik+AP
> Does J have a notion of the relationship between certain functions and their inverse?

Yes. Many built-in words have inverses assigned, and you can attach inverse functions to your own words with :. https://code.jsoftware.com/wiki/Vocabulary/codot

EDIT: and here's a table with predefined inverses: https://code.jsoftware.com/wiki/Vocabulary/Inverses
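A rough Python analogue of that mechanism, purely for illustration (the registry and the names `set_inverse` / `inverse` are mine; J stores the obverse on the verb itself rather than in a global table):

```python
import math

# Toy analogue of J's obverse table: a mapping from a function to its
# registered inverse, in both directions.
_inverses = {}

def set_inverse(f, f_inv):
    # Like J's  f :. f_inv  -- attach a user-defined inverse to f.
    _inverses[f] = f_inv
    _inverses[f_inv] = f
    return f

def inverse(f):
    # Look up the registered inverse; KeyError if none is known,
    # roughly like J's domain error for a verb with no obverse.
    return _inverses[f]

set_inverse(math.log, math.exp)
# inverse(math.log)(1.0) → math.e
```

With such a table, an "under" combinator only needs the forward transform: it can look up the inverse itself.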

6. henrik+K21 2020-11-30 13:44:03
>>kliber+GV
That can definitely be implemented in Matlab with the symbolic math toolbox.

But interesting nonetheless.

7. patrec+V31 2020-11-30 13:53:25
>>henrik+K21
I have used Matlab quite a bit in the past, but not the Symbolic Math Toolbox: can you make a "normal" function definition symbolic retroactively? I thought you needed to define the function as symbolic from the start, or am I wrong?