My latest blog post, Array programming for Clojurians, grew out of a discussion on the Clojurians Zulip about lessons APL and related languages have for Clojure. It was fun to take a brief tour of their world and find some of the analogies between the two.
Fantastic write-up. The historical context adds such clarity. I wish more programming concepts were taught this way.
The exploration of array languages (as opposed to Common Lisp or a stack-based language like Forth) made me think of this quote:
I will, in fact, claim that the difference between a bad programmer and a good one is whether he considers his code or his data structures more important. Bad programmers worry about the code. Good programmers worry about data structures and their relationships.
~ Linus Torvalds, https://lwn.net/Articles/193245/
I found it striking that Blazeski reached for macros and you avoided them completely (as a Clojure programmer often does). It made me wonder if this cultural difference is born, in part, from the fact that Clojure’s collection offerings (vector, list, map, set) are both robust and syntactically clear out of the box.
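To illustrate the point about syntactic clarity, here is a minimal sketch (mine, not from either article) of the literal syntax Clojure gives each of those collections out of the box:

```clojure
;; Each core collection has its own literal syntax, so the chosen
;; abstraction is visible at a glance:
[1 2 3]       ;; vector: indexed, efficient append at the end
'(1 2 3)      ;; list: sequential, efficient prepend at the front
{:a 1 :b 2}   ;; map: keyed lookup
#{1 2 3}      ;; set: membership
```

Because the reader distinguishes them, no macro layer is needed just to make the choice of data structure legible.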
From Blazeski’s own conclusion:
Learning anything of APL, J or q makes programmers aware of opportunities opened by thinking at the higher level of container abstraction.
It has been many years since I’ve spent significant time in CL, but Clojure’s vectors just seem so clearly differentiated. It makes it easy for me to think about what abstraction to reach for.
Furthermore, you use polymorphism where Blazeski uses a macro. Could it be the weight of CLOS that again inspires CL’s use of macros over a polymorphic solution?
If we presume that matching the data structure to the data itself is of preeminent importance, having clearly defined data abstractions with idiosyncratic operators can lead to clear thinking and clear code. The decision between developing a domain-specific language vs. using a Lisp’s core features is debatable. But trying different data abstractions and operators (as array-based languages do) can lead to new ways of thinking. Pretty cool to see Blazeski lean towards a DSL while you work with core language features.
I actually expected to need macros on first skim of the article. But then for each task it quickly became clear that there just was no need.
I’d like to see a Common Lisper chime in on a potential CLOS approach.
I think you’re absolutely right about Clojure’s collection API making macros less necessary. You can see this fairly clearly in this 2002 Common Lisp discussion, where someone suggests precisely that approach: abstract data types with underlying implementations that switch according to the size of the data. Another contributing factor appears to be that CL devs were simply operating at a different level of abstraction in those days: they wanted to implement that layer themselves, just the way they talk about implementing their own hash functions, or how they just love doing everything from scratch with cons cells. Most developers today don’t do that in their day-to-day.
So far as I can see, the original article only calls out the argument-hoisting macro f as a macro. I imagine he thought to do it that way because he didn’t want the arguments to be evaluated – but who can know the heart of another programmer?
As for the rest, the Clojure apple has not fallen far from the Common Lisp tree when it comes to generic function dispatch:
(defgeneric j+ (x y)
  (:method (x y) (error "Cannot add those types!")))

(defmethod j+ ((x number) (y number))
  (+ x y))

(defmethod j+ ((x vector) (y vector))
  (map 'vector #'+ x y))

(j+ 1 1)               ;; => 2
(j+ #(1 2 3) #(2 3 4)) ;; => #(3 5 7)
(Note also that CL’s vectors are distinct from lists and have their own reader form.)
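For comparison, the same double dispatch can be sketched in Clojure with a multimethod (my own sketch, not from either article). `isa?` understands Java class hierarchies, so `Long` matches `Number` and `PersistentVector` matches the `IPersistentVector` interface:

```clojure
;; Dispatch on the classes of both arguments.
(defmulti j+ (fn [x y] [(class x) (class y)]))

(defmethod j+ :default [x y]
  (throw (ex-info "Cannot add those types!" {:x x :y y})))

(defmethod j+ [Number Number] [x y]
  (+ x y))

(defmethod j+ [clojure.lang.IPersistentVector clojure.lang.IPersistentVector]
  [x y]
  (mapv + x y))

(j+ 1 1)             ;; => 2
(j+ [1 2 3] [2 3 4]) ;; => [3 5 7]
```

In practice a protocol (single dispatch on the first argument) is the more idiomatic Clojure choice when double dispatch isn’t needed, but the multimethod form mirrors CLOS most directly.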
Calling Gerry Sussman into this conversation to reinforce your argument:
The Structure and Interpretation of Computer Programs (SICP) curriculum no longer prepared engineers for what engineering is like today. Sussman said that in the 80s and 90s, engineers built complex systems by combining simple and well-understood parts.
Today, this is no longer the case. Sussman pointed out that engineers now routinely write code for complicated hardware that they don’t fully understand (and often can’t understand because of trade secrecy). The same is true at the software level, since programming environments consist of gigantic libraries with enormous functionality. According to Sussman, his students spend most of their time reading manuals for these libraries to figure out how to stitch them together to get a job done. He said that programming today is “More like science. You grab this piece of library and you poke at it. You write programs that poke it and see what it does. And you say, ‘Can I tweak it to do the thing I want?’” The “analysis-by-synthesis” view of SICP — where you build a larger system out of smaller, simple parts — became irrelevant. Nowadays, we do programming by poking.
The example of generic function dispatch in CL is appreciated. So similar!
I was particularly interested in Blazeski’s speculation a little later:
Writing a macro that understands forks and hooks is an exercise for a beginner – if the operators covered are strictly monadic and/or dyadic as are those in J. However in Lisp all those optional auxiliary and keyword arguments immensely complicate matters. In order to make it operational within a month or so of effort a tacit macro could be implemented to work just on a subset of Lisp operators.
My only point of comparison is the macros in core.matrix. I haven’t programmed in J, but core.matrix ends up feeling very Clojure-y. It also appears to reserve macros for edge cases.
Blazeski’s article ends by speculating on what it would take to build something that felt more like J in CL.
Much has been written about how language shapes our thinking. Without opening an argument on the power of Common Lisp vs. Clojure (the value of reader macros, lisp-1 vs. lisp-2, etc.), it is striking that the thinking is so different across two such similar languages.
Clojure comes out of a post-Windows world that thinks about APIs. Common Lisp comes from a pre-Windows world that thinks about DSLs.
Ah, so far as I can tell he’s gesturing toward combinators here (possibly without realizing it).
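For the curious, J’s monadic fork and hook can be sketched as plain higher-order functions in Clojure, no macro required. This is my own sketch, restricted (as Blazeski suggests) to single-argument operators:

```clojure
;; J's monadic fork: ((fork f g h) x) computes (g (f x) (h x)),
;; i.e. the middle verb g combines the results of the outer two.
(defn fork [f g h]
  (fn [x] (g (f x) (h x))))

;; J's monadic hook: ((hook f g) x) computes (f x (g x)).
(defn hook [f g]
  (fn [x] (f x (g x))))

;; The classic J mean fork (+/ % #): sum divided by count.
(def mean (fork #(reduce + %) / count))

(mean [1 2 3 4]) ;; => 5/2
```

The complications Blazeski mentions (optional, auxiliary, and keyword arguments) don’t arise here precisely because everything is pinned to one arity.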
I’m having trouble figuring out what you mean. One of the first APIs I learned pre-dates Windows by a fair few years, and it seems to me that all Lisps are interested in both APIs and DSLs…?
CL is essentially silly putty–you can make anything with it in any shape you like. Another CLer might well have written approximately the same thing that appears in the Clojure version from the blog post. (That one can produce anything from GOTO-laden spaghetti to functional pearls in CL is both a strength and weakness of the language.)
I’ve always had mixed feelings about that snippet. Of course Sussman knows what he’s talking about, and as you say it’s even part of my point. But though it’s definitely less common to drop into the internals of things nowadays, it’s still absolutely necessary sometimes. And of course even when it’s not necessary to develop something from raw materials, it’s still usually helpful to understand. Programming by poking only goes so far; some number of developers need/want an understanding (in at least one domain) perhaps not from electromagnetics, but at least from hardware all the way up to product.
Fair enough. I’ve always considered the Clojure programmer’s perspective on writing macros a departure from the Lisp tradition. Perhaps this is unfair.
As a point of clarification, I was suggesting that language may not have so much to do with how approaches to programming have changed.
Microsoft built a closed-source software API that drove 90% of all hardware for over a decade - an unprecedented event. I believe this had a tremendous impact on how a generation of programmers approach the craft of writing software. As much as any advance in computer science or language specification.
I’d love to write a book on the impact of the Windows API on both computer science and computer programming. Sort of in the spirit of MIT’s Platform Studies. But that’s for another lifetime.
Is it just me, or does this feel like a backhanded criticism of the entire field?
I had a coworker tell me the other day that, at our company, we build robust systems out of twigs and haystacks.
It’s somewhat true, but it was also very much a complaint of his: that we’re just totally dropping the ball, and that there doesn’t seem to be any motivation for careful consideration in building truly robust systems. There was also some hopelessness in his tone, a kind of resignation, as if it were all out of his control and inevitable.
I’m feeling a similar tone in that Sussman quote. Am I the only one? Or is this me projecting my own feelings onto it?
It certainly reads like a criticism to me.
But I don’t suppose there are many code-slinging veterans who would suggest there was a golden era where engineering quality was prized in the face of various market pressures.
It’s important that the original quote includes Sussman’s view on intellectual property. How can you do anything but poke at something that you’re not even allowed to understand?
Every programmer is eventually forced to work with systems they don’t understand. That hasn’t changed. But do you think the techniques for engaging the unknown have improved? Gotten worse? About the same?