What do deprecations in Google Closure mean to ClojureScript?

Following the Closure mailing list, I see that Google Closure, in full or in part, is being deprecated by Google. What do these deprecations mean for ClojureScript and its future? Does anything in ClojureScript fail in a world without Google Closure? I know Advanced Compilation would suffer.

ClojureScript itself is not affected. It uses very little of the Closure Library, and what it does use could easily be replaced if needed.

I asked this question too and felt kind of “jumped on” by some of the folks answering…

I’ve long felt the dependence on the Google Closure Compiler was bad for ClojureScript long-term, because for it to work, the generated JS has to be structured in a particular way – and a lot of JS libraries out there are not structured that way (because the Closure Compiler isn’t widely used, relative to the vast JS landscape).

What’s being deprecated, as I understand it, is the Google Closure Library, not the Compiler, and ClojureScript doesn’t depend on that very heavily – although some cljs programs rely on parts of it. Even so, it’s “only” being deprecated and will stay around as-is for the foreseeable future (so Google says).
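To make “some cljs programs rely on parts of it” concrete: goog.string and goog.object are typical examples – they ship with every ClojureScript build, so a sketch like the following (names purely illustrative) needs no extra dependency:

```clojure
(ns example.core
  (:require [goog.string :as gstring]
            [goog.string.format]   ;; format lives in its own namespace
            [goog.object :as gobj]))

(gstring/format "%05d" 42)   ;; => "00042"
(gobj/get #js {:a 1} "a")    ;; => 1
```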

The Compiler has not been deprecated (yet) and so the core ClojureScript folks don’t think it’s a problem to continue to rely on it for optimization.

@thheller has written some interesting stuff on this topic: in particular that shadow-cljs is set up to optimize only user-space JS (generated from cljs), not anything coming from npm, and that it infers externs to make that interface as smooth as possible.
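For anyone who hasn’t seen it, a minimal sketch of that npm side from the user’s perspective (assuming react is installed in node_modules):

```clojure
(ns example.ui
  ;; string requires are resolved against node_modules; shadow-cljs keeps
  ;; this npm code out of :advanced optimization and infers externs for
  ;; the interop calls in your own code
  (:require ["react" :as react]))

(def title (react/createElement "h1" nil "Hello"))
```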

I’ve always felt that’s produced kind of a weird “split” in the cljs world: you either use shadow-cljs with its modified cljs compiler, with language extensions etc., so you can leverage the broader JS ecosystem more easily – or you use the official cljs compiler, without language extensions, and deal with JS library interop via externs yourself. And shadow-cljs is the de facto standard these days (80% usage in the last “State of…” survey, compared to less than 10% using just cljs.main, although Figwheel still holds about 25%).

I think it’s amazing that cljs has been able to maintain so much stability in such a ridiculously churning ecosystem (JS), but the more I work with our large JS codebase at work – React.js, Redux, Immutable.js – the more I wish for a smaller gap between cljs and JS that would allow me to easily write new parts of our code in cljs… and yet the fewer benefits I feel cljs would bring these days, given how the JS world has moved on. Even tho’ all of my backend work is Clojure (and has been for many years now), I’m feeling more and more comfortable with “just JS”, at least in the context of how our codebase is structured.


The Closure Library we use is published under the org.clojure maven group, so it is entirely irrelevant what Google decides to do. The sources we need will stay exactly as they are. There is no concern of that ever going away.
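Concretely, the repackaged library is an ordinary Maven artifact (org.clojure/google-closure-library) that ClojureScript pulls in transitively, so a plain deps.edn is all you need (the version below is just illustrative):

```clojure
;; deps.edn – org.clojure/google-closure-library comes in transitively
{:deps {org.clojure/clojurescript {:mvn/version "1.11.60"}}}
```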

At this time I’m somewhat uncertain what is going on with the Closure Compiler though. Some rather critical things have been pending for a long time and not much seems to be happening there. I have no insight into Google, but I do hope they continue working on it, as it is still by far the best optimizing compiler there is. Others aren’t even remotely close.

While this doesn’t affect CLJS, there is an increasing number of npm packages that can no longer be processed by the Closure Compiler (and therefore shadow-cljs). The only option remaining for those is actually using JS tools, which is fine but … unfortunate.

I tried my best to avoid this split, and have been very careful not to introduce any special shadow-cljs-only things that can’t be replicated in regular cljs.main or figwheel. That however doesn’t stop me from making things easier for myself and other shadow-cljs users where I can. So, that may give the impression that some things are shadow-cljs only, when in fact they are not. cljs.main and figwheel have had access to npm for a very long time too, just with webpack (or equivalent) required in the mix.
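For reference, this is roughly the shape of that route with plain cljs.main and its built-in :bundle target – a sketch only; file names and the webpack flags depend on your setup and webpack version:

```clojure
;; build.edn – compiled with: clj -M -m cljs.main -co build.edn -c
{:main       example.core
 :output-to  "out/index.js"
 :output-dir "out"
 :target     :bundle
 ;; after compiling, cljs.main shells out to your bundler of choice
 :bundle-cmd {:default ["npx" "webpack" "./out/index.js"]}}
```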

Most of the inference code lives in cljs.analyzer itself, so it is available there as well. Yes, shadow-cljs does a bit more inference because it can, but the functionality itself is still the same. shadow-cljs just sees more code.
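A tiny sketch of what that inference acts on – the names are hypothetical, but the mechanism is the same whether shadow-cljs or plain ClojureScript (with :infer-externs true) is doing the work:

```clojure
;; ^js marks `chart` as a foreign JS object, so an extern is inferred for
;; .update and :advanced compilation won't rename it
(defn refresh! [^js chart data]
  (.update chart (clj->js data)))
```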

Serious Question: Have you ever tried shadow-cljs? That is exactly what it provides. I know you have been very dismissive of it due to your dislike of the node/npm “feel”. I assure you node/npm is entirely optional, but since you specifically asked about integrating more closely with JS, that feels like a lost cause.

As you said, the JS world is churning along rapidly, so it is impossible for me to make an example of CLJS<->JS integration for every possible JS setup out there. I probably don’t even know 99% of them myself by now. Yet, I am confident that shadow-cljs is able to integrate with 100% of them on a basic “this works” level, since there are several options for shadow-cljs to generate basic standard code that every JS tool can process. You might lose some features by doing this, but that is often due to a limitation on the JS tool side rather than CLJS.
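One of those options, sketched here with an assumed build id and paths: an :esm build emits plain ES modules that any JS bundler (or a bare <script type="module">) can consume:

```clojure
;; shadow-cljs.edn
{:source-paths ["src"]
 :builds
 {:lib {:target     :esm
        :output-dir "dist"
        ;; exposes example.lib/greet as a named export of dist/main.js
        :modules    {:main {:exports {greet example.lib/greet}}}}}}
```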

“Just JS” is a perfectly valid route to go. CLJS is very different from most JS setups, so there are always going to be mismatches. This is also true for CLJ though, and integrating that into a Scala/Kotlin/Java/whatever JVM environment is not without issues either.

I don’t see a need for CLJS to change, since I much prefer CLJ(S) and becoming more like X doesn’t feel like the right direction in either case. Just like CLJ doesn’t need to adopt every latest JVM feature. Sure, there are very many useful things we could do, but each of them will likely bring breakage and churn.


Is there a reason the Google Closure Compiler isn’t also maintained and published as part of the org.clojure stuff? It seems even more critical to the cljs stack than the library…

I thought about forking the Closure Compiler before, but realistically that would be a gigantic amount of work I don’t have time for. I also believe there is no need to do that. The Closure Compiler is much more widely used than the Closure Library, inside of Google as well as outside. I’m confident that even if Google were to abandon it, it would continue to live on elsewhere.

In terms of the Closure Library, we just copy some JS files into a JAR and publish it. For the Compiler there is much more work to do before a JAR can be published. I don’t think existing packages will ever be removed from Maven, so until we make actual changes to the Compiler there is no need to do this.

IMHO there is absolutely no need to be worried about what Google does. I want to emphasize again: the lack of recent feature support in the Closure Compiler does not affect CLJS directly, as we don’t generate these things (and don’t need to). The only reason this is a problem is that the npm/JS world likes to jump on every new feature train that comes along immediately, often even before the spec is finalized or widely implemented in browsers. Unfortunately, JS also keeps adding breaking changes to its syntax, and that’s the biggest problem, as parsers/transpilers need to be updated to support them.
