I did quite a bit of APL programming when I was younger.
When describing APL, people talk about the strange symbols, the mathematics, and so on, but I have never seen anyone describe something I only realized after some time: APL makes you approach problems quite differently once you are familiar with it.
I stopped thinking in 'steps' applied to individual data points. Instead I solved the problem in my head (writing the line along the way) by aggregating the data points into larger data objects, then letting those objects expand in a many-dimensional universe, always larger and larger... and then I simply looked at the resulting mega-thing from a different angle and started crushing it back along different dimensions until I finally got my answer (and my line was complete). The resulting one-liner was very hard to read... but it gave me the correct result.
Inflation, Change of view-point, Big Crush. That is the core of APL.
Yeah I know... sounds crazy. But that was how APL programming felt to me, and I bet I am not alone. No other language I worked with ever triggered in me that kind of mental problem-solving process.
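For what it's worth, the inflate-then-crush shape of that process can be mimicked in numpy. This is my own toy example, not from the original comment: expand the data into a higher-dimensional object with an outer operation, then collapse it back along an axis.

```python
import numpy as np

# Hypothetical task: for each n in xs, count how many elements of xs divide it.
xs = np.array([2, 3, 6])

# "Inflation": expand into a 2-D universe of all pairwise remainders.
pairs = xs[:, None] % xs[None, :]        # shape (3, 3), pairs[i, j] = xs[i] % xs[j]

# "Change of view-point" + "Big Crush": look along axis 1 and collapse it.
divisor_counts = (pairs == 0).sum(axis=1)
print(divisor_counts.tolist())           # [1, 1, 3]
```

No loop over individual data points appears anywhere; the whole problem is solved by growing an array and then reducing it.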
So what I'd like to know is how that "quite different" approach to problems differs from the standard mathematician approach?
I've been playing with J lately. I've also been a longtime numpy user, going back to the days when it was still numarray. Maybe I'm just writing numpy in J, but I find that my approach in both languages is more or less identical: set up a vector, do some matrix operations, maybe some statistical aggregates, write down the answer.
Can you provide an example for which the APL approach is significantly different from what one would normally do? It might help me understand what insights I'm supposed to gain.
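To make the contrast concrete, here is a tiny sketch of my own (not from the thread) of the two mindsets on one toy task: finding the longest run of heads in a coin-flip record.

```python
from itertools import groupby

flips = [1, 1, 0, 1, 1, 1, 0]

# Step-by-step mindset: walk the sequence one element at a time, carrying state.
best = cur = 0
for f in flips:
    cur = cur + 1 if f else 0
    best = max(best, cur)

# Whole-sequence mindset: aggregate the data into runs, then crush to one number.
runs = [len(list(g)) for key, g in groupby(flips) if key]
best_runs = max(runs, default=0)

print(best, best_runs)   # 3 3
```

The second version never inspects an individual flip in isolation; it reshapes the whole sequence and then reduces it, which is closer to the APL habit described above.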
It's quite similar to the standard mathematician approach. It's extremely different from the standard (imperative-trained) programmer approach.
(Note your use of "write down the answer": this is a giveaway that you understand it so well that you're not even aware of your understanding, and might therefore find it hard to explain)
Loosely speaking, this is how I program in SQL. Each table is a plane floating in a multidimensional space, each relation gets pinned from one plane to another, joins are spiky balls, subqueries are recursive non-Euclidean spaces, etc. I'd say SQL is more readable than J, but the mental visualization you describe is nearly identical.
I agree. "Grokking SQL" (or rather the relational and set concepts behind it) makes it possible to mentally map how you want the query to behave.
Although SQL is way, way more verbose than J/APL, it is still extremely readable, even when the query is massive. Untrained SQL users often point to big queries as some sort of code smell, when in fact most queries are logically partitioned by virtue of how they work.
Guy Steele actually mentioned this in some talk or other. The money quote (loosely from memory) was "APL makes you a better Common Lisp programmer. I was doing a matrix-tensor multiplication routine, and I was thinking about the nested loops, and then I realized it was mapcar of mapcar of apply, done." Sadly, I can't find the video.
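The idea in that (loosely remembered) quote can be rendered in Python: matrix multiplication collapses from three nested index-juggling loops into a map over rows of a map over columns of a dot product. This is my own illustration, not Steele's code.

```python
# Matrix multiplication with no explicit index bookkeeping:
# a comprehension over rows, a comprehension over columns, a sum of products.
def matmul(A, B):
    cols = list(zip(*B))                       # transpose B to iterate its columns
    return [[sum(a * b for a, b in zip(row, col)) for col in cols]
            for row in A]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul(A, B))   # [[19, 22], [43, 50]]
```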
I've dabbled in J, and it's left me with a permanent sense of slight disillusionment with regards to every other numerical programming language I've used. It seems mind-boggling that NumPy, MATLAB, and even Julia lack the versatile broadcasting rules of APL family languages. In J, if you write a simple function that composes several built-in operators, your function is fully vectorized and can act on lists and arrays properly, and even arrays of arrays. In MATLAB, the same function stands a pretty good chance of only accepting scalar inputs unless you put extra effort into making it vectorized, and almost certainly won't do the right thing when given higher-dimensional arrays. Julia seems to likewise default to being mostly scalar-oriented, and only makes it less painful by having efficient JIT, but still lacks the expressiveness advantage.
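A hypothetical numpy example of the rank bookkeeping being complained about: in J, a verb defined on vectors extends to higher-rank arrays via the rank conjunction (e.g. f"1), while in numpy you manage the axes yourself.

```python
import numpy as np

# A vector operation (normalize so each row sums to 1). In numpy, making it
# apply "per rank-1 cell" of any array means hand-picking axes and keepdims.
def normalize_rows(x):
    # keepdims keeps the summed axis so the result broadcasts back against x
    return x / x.sum(axis=-1, keepdims=True)

a = np.array([[1.0, 3.0], [2.0, 2.0]])
print(normalize_rows(a).tolist())   # [[0.25, 0.75], [0.5, 0.5]]

b = np.ones((2, 3, 4))              # works one rank higher, but only because
print(normalize_rows(b).shape)      # the axis arithmetic was chosen carefully
```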
I am still optimistic about Julia in this regard. That phantom limb pain after having used J is there, but the nice thing is that Julia is very extensible; for example, infrastructure to cast a 2x4x4 array into a length-4 vector of 2x4 arrays without copying memory will hopefully be implemented soon, which, if you think about it, is an important step toward mimicking APL-style programming.
And this broadcasting mechanism seems very amenable to fusion and targeting of GPU resources. Perhaps Julia could add broadcast, kinda like a list comprehension.
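The no-copy reshaping described above has a rough numpy analogue today (my own sketch, not the Julia feature itself): basic slicing produces views that share the parent array's memory.

```python
import numpy as np

# View a 2x4x4 array as a length-4 list of 2x4 slices without copying:
# each slice along the middle axis is a view into the same buffer.
a = np.arange(2 * 4 * 4).reshape(2, 4, 4)
slices = [a[:, i, :] for i in range(4)]   # four 2x4 views, no copies

print(all(s.shape == (2, 4) for s in slices))        # True
print(all(np.shares_memory(a, s) for s in slices))   # True
```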
"To believe that “plain language” programming would be more readable is Utopian, even intellectually dishonest. For if I say, “a linear function of a variable is equal to the sum of a constant and of the product of a variable and a second constant”, it is incontestably English but completely obscure, even incomprehensible!"
This is absolutely true. People try to make a big deal out of natural language for both programming and simply interacting with computers. Rarely would a natural-language description be easier, clearer, or faster than a more precise interface (e.g. text for programming, mouse/keyboard for interacting with computers).
Of course you wouldn't write it all out like that in any programming language. But instead of writing code like this:
y = a * x + b
You could write code like this:
height = slope * run + ground
It may not be helpful to write out simple math using colloquial language, but it can be helpful to future maintainers if your variables have meaningful names. Calling it "a variable" instead of "a" is not useful, but calling it "slope" can be.
If you are talking about math, sure. But then your average person would be unable to articulate much of that anyway, in English or otherwise, while professionals trained in this area are willing to invest in artificial languages to express it concisely.
If you are talking about automating everyday tasks through "code," then natural language makes perfect sense; the future there is going to be based on increasingly sophisticated dialogue systems in forms like Siri, Now, and Cortana.
Depends on the context, no? Natural language works great when you accept that it is far from context-free. Consider "how do I get to the nearest store?" Try expressing that in a programming language in a way that is functionally better than what a person can provide. Suddenly you'll find that you have to start adding all sorts of additional information that is otherwise just assumed.
And, it is not like this is unique to language. Even in math, one often finds a certain lack of precision that is accepted and more cumbersome to work with than without.
Or, mayhap I just misunderstand more than I understand. :) Very likely.
Edit: There is also the issue that this is a terrible example. If I were to explain a linear function to someone, I would rather say that it is a function whose output changes in constant proportion to changes in the input.
But only if you have the digital flexibility of a monkey.
That said, this is always worth watching as a demo of the language's power: https://www.youtube.com/watch?v=a9xAKttWgP4. I'm not afraid to admit that I wish my language could do that.
I have an issue with APL. My grandfather, who is a researcher in forest science (sorry, English is not my first language), made his own software in APL to manage his own forest. Now he is getting old, and he is trying to get me to take over the software and understand it for when he passes.
The thing is, I'm a C++ dev professionally, but whenever I see his APL code, I cringe. I can't make him understand that building what is basically a dynamic spreadsheet in APL is kind of complicated for me, coming from the OO side of programming. Also, it's the only language he knows.
I have a hard time telling him that all the work he did, and still does, in APL (I guess there are thousands of lines by now) will just go to the trash, and my uncles will just use Excel when they take over the forest business.
Edit: I have another issue, with Dyalog APL. When my grandfather sends me a workspace, I can't open it, because Dyalog APL is not very backward compatible! If we don't have the exact same version, I just can't open his workspace. It's 2014, damn!
+1 for sharing it on GitHub (although git sucks ;)
have you actually read the glimpse of heaven article?
the tone of your comment made me suspicious... :/
i have basic, z80/x86/pic asm, forth, pascal, c, rebol, bash, awk, ruby, and js experience. i saw the game of life live coding demo a few years ago and also played with j while it was still closed source. to put it into perspective, i'm only 38 years old.
after reading this glimpse of heaven apl article, its vibe hit me, so i'm circling back to apl for the 3rd day already. i find it fun and elegant. the keyboard layout is not an issue either; ⍳⍴×←=⍺⍵⌈⌊ all felt very natural after a few minutes.
i would say you would benefit greatly from learning apl.
it's definitely not the past.
cool shit is coming up, and you will be left out if the only thing you're comfortable with is c++...
also, it sounds quite heartbreaking how you discount your grandfather's work...
kinda disrespectful... and you are even wasting the time he has left on earth by recommending he learn python? just spend a few hours with apl and it should be clear to you too why he said python sucks. maybe try to implement a bit of his code in python or in a spreadsheet. that should be quite informative (and probably transformative) too.
Well, I don't want to... but I have neither the time nor any real interest in trying to understand what he did in APL. My only issue is that he is not interested in hearing from me that nobody will take over what he did. He won't listen ;( He thinks everything else sucks. I tried to get him to read some stuff about Python, for example. He said that sucked too! lol
Just because some tech is old doesn't make it bad. The people solving problems and writing programs in the 60s and 70s were just as smart as we are; work to extract their wisdom.
It can actually be fun learning radically different programming languages. I find it improves my programming in my primary language. And you're in the lucky position of having someone who can help you learn.
5) It looks like you could easily implement APL as a DSL within an existing language with an extensible parser. Racket, for instance.
I'm working on something along these lines, but not planning on keeping the APL/J-like syntax (the many meanings of juxtaposition make parsing a line depend on the run-time values attached to names). You can get more of the core idea of APL out of prop:procedure than out of extensible parsing.
Are dyads only done by putting the other operand in the string of J code? Or is there a nice way to apply a dyad to two Ruby arrays? (I wasn't sure from the README whether this is already there or is your second point under "future directions"; I'm pretty new to Ruby.)
Hehe, that made me chuckle. I'm sorry you got downvoted; that must be by people who either didn't read the article or have no clue what APL looks like in practice.
If you like APL or are in general interested in material like this then you should definitely check out the J language (which is sort of the successor to APL).
No, I don't do that. I didn't find it very hard to learn the full APL keyboard. That said, it should be easy to make a Quail-based input method for it. If you could provide me with a list of the necessary combinations to generate the APL symbols, I could help you out in making it.
He is missing the middle generation - he jumps straight from the 70s (mainframe) to the 90s (Internet) while missing the people like me who grew up with 8-bit systems in the 80s.