zlacker

[parent] [thread] 19 comments
1. azhenl+(OP)[view] [source] 2020-11-30 00:41:48
This is Kenneth Iverson's 1979 Turing Award lecture.
replies(1): >>blulul+ja
2. blulul+ja[view] [source] 2020-11-30 02:36:59
>>azhenl+(OP)
Yes. There are some deep insights in this exposition. The irony is that, in my opinion, APL is the worst array/matrix-based programming language. In fairness, it was also the first, but compared to Matlab or Julia it is not as expressive and feels much harder to use.
replies(3): >>soline+kb >>patrec+tA >>yiyus+XX
◧◩
3. soline+kb[view] [source] [discussion] 2020-11-30 02:48:52
>>blulul+ja
There's a certain mathematical elegance to APL, I think. When the language is terse enough, it helps you visualize and work with the language as a tool of thought. Matlab attempts to map actual mathematics to ASCII, which is not that successful for me at least: it lands in a middle ground where it's too difficult for me to think quickly purely in Matlab, yet too high level to be useful as a practical language.

Engineers love it for prototyping, though, so maybe I just haven't worked with Matlab enough.

replies(2): >>blulul+Zi >>Someon+EL
◧◩◪
4. blulul+Zi[view] [source] [discussion] 2020-11-30 04:33:11
>>soline+kb
Fair point - personally I found APL to be a little too terse to be readable in ASCII. I think there is a big difference in the affordances of a chalkboard or paper and a monospaced text editor, and to me APL is too close to paper-based notation, where it is easy to read and write a larger set of symbols. Matlab has some dedicated notation around matrices but uses more text-heavy descriptions beyond that, which feels better suited to a command line. Julia takes an even more text-based approach and supports notations like list comprehensions, which feel easier to learn, read and use than a set of glyphs.
◧◩
5. patrec+tA[view] [source] [discussion] 2020-11-30 08:35:16
>>blulul+ja
If you think APL is less expressive than Matlab, you probably haven't really grasped it, IMO. Having said that, Matlab is optimized for manipulating matrices and replicating the notation of ordinary linear algebra, and has excellent implementations of basically any numerical algorithm that frequently comes up in that area. So writing something like chol(X'*X+diag(eig(X))) in an APL will look uglier, quite possibly be slower and less accurate, and, depending on the numerical routines you need, require extra implementation work on your part.

But that overhead is a constant factor; more or less anything you can express well in Matlab can be expressed straightforwardly in APL, too, if you have the right numerical routines. That's not true in the other direction, though: there's a lot of stuff in APL you cannot express adequately in Matlab at all. For example, J (and these days Dyalog as well, IIRC) has an operation called under which basically does this: u(f,g) = x => f^-1(g(f(x))). So you can write geometric_mean = u(mean, log).
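
To make this concrete, here is roughly how that geometric-mean example is spelled in actual J (a sketch; &.: is J's Under conjunction, ^. is natural log, and +/ % # is the mean):

       NB. geometric mean as "mean under log"
       gm =: (+/ % #)&.:^.
       gm 2 4 8
    4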

It is completely impossible to implement something like "under" in Matlab. Admittedly, the J implementation, at least, of deriving a generalized inverse for an arbitrary function f is a somewhat ill-defined hack, but this is still something that is both conceptually and practically quite powerful. Also, whilst Matlab is really clunky for anything that is not a 2D array and hardcodes matrix multiplication as the one inner product, APL has more powerful abstractions for manipulating arbitrary-rank arrays and a more general concept of inner products.

Also, APL has some really dumb but cherished-by-the-community ideas that make the language less expressive and much more awkward to learn, e.g. the idea of taking the terrible defect of normal mathematical notation where - is overloaded for negation and subtraction, and replicating it for every other function.

replies(2): >>moonch+uE >>henrik+eP
◧◩◪
6. moonch+uE[view] [source] [discussion] 2020-11-30 09:20:36
>>patrec+tA
> Admittedly the J implementation at least of deriving a generalized inverse for an arbitrary function f is a somewhat ill-defined hack

Have you seen the version used by dzaima/apl[1]? The equivalent of '(-&.:{:) i.5' works and results in 0 1 2 3 _4.

> APL has some really dumb but cherished-by-the-community ideas that make the language less expressive and much more awkward to learn, e.g. the idea of replicating the terrible defect of normal mathematical notation where - is overloaded for negation and subtraction to every other function

Klong[2] is a partial attempt to resolve this. I won't repeat the arguments in favour of ambivalent functions, as I guess you've heard them a dozen times before.

> u(f,g) = x => f^-1(g(f(x)).

Other way round; it's g^-1(f(g(x)))

1. https://github.com/dzaima/apl

2. https://t3x.org/klong/

replies(1): >>patrec+la2
◧◩◪
7. Someon+EL[view] [source] [discussion] 2020-11-30 10:54:44
>>soline+kb
“Engineers love it for prototyping, though”

Makes perfect sense. Matlab is for engineers, not for mathematicians, who use computer algebra systems, proof assistants, etc. The difference is that engineers (and physicists) want answers and don't care about how they are obtained, while it's the reverse for mathematicians.

I think APL, although it too is a language for computing numbers, is spiritually a bit closer to mathematics than Matlab.

replies(1): >>soline+xz7
◧◩◪
8. henrik+eP[view] [source] [discussion] 2020-11-30 11:34:52
>>patrec+tA
> It is completely impossible to implement something like "under" in matlab.

I’m a little curious about this. Does J have a notion of the relationship between certain functions and their inverse? What is it that enables “under” in J which makes it impossible in Matlab?

replies(2): >>kliber+kV >>patrec+q21
◧◩◪◨
9. kliber+kV[view] [source] [discussion] 2020-11-30 12:38:36
>>henrik+eP
> Does J have a notion of the relationship between certain functions and their inverse?

Yes. Many built-in words have inverses assigned, and you can assign inverse functions to your own words with the :. conjunction: https://code.jsoftware.com/wiki/Vocabulary/codot

EDIT: and here's a table with predefined inverses: https://code.jsoftware.com/wiki/Vocabulary/Inverses
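
A minimal sketch of what that looks like in practice (made-up verb name; :. attaches the inverse, ^:_1 invokes it, and &.: uses it for Under):

       double =: (2&*) :. -:          NB. declare -: (halve) as the inverse of double
       double^:_1 ] 10
    5
       (+/ % #)&.:double 1 2 3        NB. Under uses the declared inverse: halve the mean of 2 4 6
    2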

replies(1): >>henrik+o21
◧◩
10. yiyus+XX[view] [source] [discussion] 2020-11-30 13:05:05
>>blulul+ja
How do you define expressiveness to arrive at that conclusion?
◧◩◪◨⬒
11. henrik+o21[view] [source] [discussion] 2020-11-30 13:44:03
>>kliber+kV
That can definitely be implemented in Matlab with the symbolic math toolbox.

But interesting nonetheless.

replies(1): >>patrec+z31
◧◩◪◨
12. patrec+q21[view] [source] [discussion] 2020-11-30 13:44:13
>>henrik+eP
Here is a toy example: I define a function f(x) := 1 + 2x. J will automatically work out the inverse for me, as you can see below:

       f=:(1+*&2)
       f 1 2 3 4
    3 5 7 9
       (f^:_1)f 1 2 3 4
    1 2 3 4
       (f^:_1) 1 2 3 4
    0 0.5 1 1.5
Now obviously not every function is bijective, or, even if bijective, trivial to invert -- and J doesn't (or at least didn't) have a super well-specified way of computing those generalized inverses. But still: "under" is actually pretty cool; even just conceptually I find it quite valuable.
◧◩◪◨⬒⬓
13. patrec+z31[view] [source] [discussion] 2020-11-30 13:53:25
>>henrik+o21
I have used Matlab quite a bit in the past, but not the symbolic math toolbox: can you make a "normal" function definition symbolic retroactively? I thought you needed to define the function explicitly as symbolic to start with, or am I wrong?
replies(1): >>henrik+9n1
◧◩◪◨⬒⬓⬔
14. henrik+9n1[view] [source] [discussion] 2020-11-30 15:40:54
>>patrec+z31
Yes, symbolic functions can be made "retroactively": the various conventional functions are overloaded to accept symbolic input.

    >> syms x
    >> f = @(x) log(sqrt(x)).^2

    f = function_handle with value:

        @(x)log(sqrt(x)).^2

    >> f(x)
 
    ans = log(x^(1/2))^2
 
    >> finverse(f(x))
 
    ans = exp(2*x^(1/2))
And to implement under:

    function u = under(f, g)
        % u(x) = g^-1(f(g(x))): apply f "under" g
        syms x
        g_inv = matlabFunction(finverse(g(x)));  % symbolic inverse of g, back to a function handle
        u = @(x) g_inv(f(g(x)));
    end
replies(1): >>patrec+Zq1
◧◩◪◨⬒⬓⬔⧯
15. patrec+Zq1[view] [source] [discussion] 2020-11-30 16:00:26
>>henrik+9n1
So if you have odddouble.m with

    function y=odddouble(a,b)
       y=2*x+1
    endfunction
you can do

    >>> h = under(@(x) 1/x, odddouble)
    >>> h(3)
? If so, yeah, I agree you can implement under in matlab (as long as you have the symbolic toolbox as well); in which case it's probably one of very few non-CAS systems where you can define it.
replies(1): >>henrik+7y3
◧◩◪◨
16. patrec+la2[view] [source] [discussion] 2020-11-30 19:34:56
>>moonch+uE
Is there an argument for ambivalent function definitions other than "keyword" recycling (possibly in a mnemonic fashion)?

I hadn't seen Dzaima's APL, thanks! I like that he made a Processing binding; APL always seemed like such an obvious choice for dweet-style graphics code golfing that I wondered why no one seemed to be doing it. A web-based APL would be a better choice, though.

replies(2): >>moonch+Nc2 >>dzaima+NM3
◧◩◪◨⬒
17. moonch+Nc2[view] [source] [discussion] 2020-11-30 19:47:52
>>patrec+la2
> web-based

In that case you'll be wanting ngn/apl[1], which runs in a browser and compiles to js.

> ambivalent

The arguments are mostly linguistic. Natural language is also context-sensitive, so we are well-equipped to parse such formations; and they allow us to reuse information. The monadic and dyadic forms of '-' are related, so it's less cognitive overhead to recognize its meaning.
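
For anyone following along, the same glyph doing double duty looks like this in J (a trivial sketch):

       - 5             NB. monadic: negation
    _5
       12 - 5          NB. dyadic: subtraction
    7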

1. https://gitlab.com/n9n/apl

◧◩◪◨⬒⬓⬔⧯▣
18. henrik+7y3[view] [source] [discussion] 2020-12-01 07:38:33
>>patrec+Zq1
Just tested it and it worked fine. Note that your function takes too many parameters in its definition and that functions can't be passed as parameters by name.

It is not as elegant as the built-in facility in J, but it is definitely doable and usable in Matlab. In fact, I think any language with flexible enough function overloading should be able to implement such a feature.
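
For reference, the corrected version of the upthread toy example would look roughly like this (a sketch, assuming the under helper defined above): odddouble.m becomes

    function y = odddouble(x)
        y = 2*x + 1;     % one input, and 'end' rather than Octave's 'endfunction'
    end

and at the prompt the function is passed as a handle rather than by name:

    >> h = under(@(x) 1./x, @odddouble);
    >> h(3)              % gives -3/7, i.e. the inverse of odddouble applied to 1/odddouble(3)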

◧◩◪◨⬒
19. dzaima+NM3[view] [source] [discussion] 2020-12-01 10:51:38
>>patrec+la2
Keyboard space is another somewhat important factor. My layout for dzaima/APL already uses all altgr keys, so I could definitely not afford multiplying the number of needed characters by 2. Not having ambivalently callable operators would also mean needing 2 versions of most of them.

dzaima/APL being written in Java means getting it to run in a browser would be a bit hard, and ngn has given up on ngn/apl, but BQN[0] could definitely get a web canvas based graphics interface.

Somewhat interesting to add to the conversation about Under is that, in my impl, calling a function, calling its inverse, or doing something under it (i.e. structural under) are all equally valid ways to "use" a function, it's just a "coincidence" that there's direct syntax for invoking only one. (Dyalog does not yet have under, but it definitely is planned.)

0. https://mlochbaum.github.io/BQN/

◧◩◪◨
20. soline+xz7[view] [source] [discussion] 2020-12-02 17:20:58
>>Someon+EL
I'm a bit of both, so I guess I take the radical approach--straight from mathematics to C++/ASM/FPGA/ASIC. Ultimately programming languages are just an alternative notational system for mathematics--formal language theory actually formalizes and generalizes this; it's what we computer scientists generally specialize in.

Since the computer is just a glorified calculator with memory (sorry Apple), we can fit the whole thing into a formal mathematical framework.

[go to top]