Generating ES Modules (Browser, Deno, ...)

You really don’t want to do that with CLJS, and :esm is not meant for it. The whole point of Google Closure :advanced optimizations is to optimize your whole program so it can throw away everything that isn’t used. So at the very least you must compile all CLJS together. Never include multiple different CLJS outputs in the same page. They won’t be compatible with each other, and each will include its own version of cljs.core, making everything bigger than it needs to be.

Taking react/react-dom or other JS deps out of the equation so they can be shared with JS is possible but not yet implemented cleanly. Basically you can (:require ["esm:react" :as react]), which would leave it as import * as ... from "react"; in the actual output and make no attempt at bundling it. With snowpack that would be (:require ["esm:/web_modules/react.js" :as react]).
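As a rough sketch of the idea (the namespace name is made up, and only the "esm:" prefix behavior is taken from the description above — treat this as illustration, not verified configuration):

```clojure
;; Illustrative sketch only -- "my.app.main" is a placeholder namespace.
(ns my.app.main
  (:require ["esm:react" :as react]))

;; In the :esm output this require would be left as a real import,
;; roughly: import * as react from "react";
;; shadow-cljs would make no attempt to bundle it.
```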

It reminds me of the difference between ES6 modules and the Closure Compiler. Now that major platforms have added support for ES6 import/export, we have a chance to use the browser’s native module system, which opens up the possibility of “no bundling” deployment. And with the help of HTTP/2 it’s no longer a huge issue to have tens of small files. There are still tradeoffs, but at least it’s a viable option for the JavaScript world today.

The difference in ClojureScript is that we do need to bundle code and then optimize. It’s just a different approach. However, it brings confusion when someone attempts to combine the two, and I don’t know if that would be a good idea.

And speaking of snowpack, I’ve heard more about vite since the author shared quite a few ideas on Twitter and in our WeChat group. Webpack is heavy. Vite provides a really fast environment for people who want to build lightweight web apps (especially Vue apps). No cold-start compilation overhead and instant code replacement make it really quick.

I did follow vite and it is nice to see other JS bundlers besides webpack trying new ideas.

I do however not agree with the entire premise of its design. Sure, fast startup is nice, but doing everything to optimize for that is pointless. If you work on something for more than 15 minutes, it doesn’t matter whether it starts in 129ms or 3 seconds. Loading thousands of files separately on page load is too slow to be practical. ESM makes that a bit better, but it’s still not great. That’s why I added :loader-mode :eval in shadow-cljs. There is a balance to be had here, and the more files you have the less useful it is to keep them separate. The slow “bundling” step can be improved with better caching, which shadow-cljs already does and webpack will do with v5 too.

I’m also absolutely horrified by the now-“common” JS idiom and code layout of one file per function. This is ridiculous in my opinion and not how I want to write code. If you write more code in one file, you do want Closure-style DCE to remove everything that wasn’t used. Of course this is all completely subjective and some people may prefer that style.

Ultimately you always want a final bundle step and ESM doesn’t change that.

What I like about snowpack is snowpack install. I couldn’t care less about the other stuff it now does. It is interesting because you bundle everything once, just like npm install, and after that you can even keep the files in version control. No need to ever run it again until you change dependencies. shadow-cljs could even consume those files instead of npm directly and either load them via ESM or bundle normally.

I see snowpack as “take random madness from npm and turn it into somewhat sane ESM”. Of course that becomes less useful once more packages actually ship as pure ESM, but it’s a good step in that direction.

BTW, eval loader mode made hot-reload work while I was developing a plug-in for vscode. It’s the only way to make hot reload work in the webview, so thanks for that :smile:

Let me see if I got it. What you are telling me is: if I want to export web components, each one distributed as its own npm package, then even if all of them depend on cljs.core and react, those dependencies would be too heavy (because they wouldn’t go through the compiler’s advanced mode), and the components wouldn’t work if I put two of them on the same page (maybe I didn’t understand the “compatible with each other” part very well). Is that right?

But I could create something like a “library of components” declaring the common dependencies in a shared module, so they would go through the compiler’s advanced mode, right?

I’m not sure I understand your question but as a general rule of thumb the JS users should only ever include output from a single CLJS build in their projects. You may use :modules to split that build into multiple files and allow the user to include single or multiple modules.
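For illustration, a sketch of what splitting a single build with :modules might look like in shadow-cljs.edn (the build id, module names, and namespaces are all placeholders I made up; check the shadow-cljs docs for the exact options):

```clojure
;; Hypothetical example -- all names are placeholders.
{:builds
 {:lib
  {:target :browser
   :output-dir "dist/js"
   :modules
   {:shared {:entries []}                ;; common code, cljs.core, react glue
    :comp-a {:entries [my.lib.comp-a]
             :depends-on #{:shared}}
    :comp-b {:entries [my.lib.comp-b]
             :depends-on #{:shared}}}}}}
```

JS consumers would then include shared.js plus whichever component modules they need, all produced by that one build, so cljs.core is only shipped once.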

As soon as the JS users include output from multiple CLJS builds they will have duplicated code and things become too large to be practical quickly.

Hey Thomas! I was wondering what the current state of the :esm target is with shadow-cljs but maybe also generally with the ClojureScript compiler?

Snowpack seems quite interesting and the ESM target appears to be a nice way to bridge the divide between ClojureScript and existing JS tooling. (Or at least it looks like that from my superficial level of understanding. :slight_smile:)

I’m waiting for some things in the Closure Compiler to land before making any other changes.

As far as shadow-cljs is concerned the support is mostly done but I haven’t used it in a real project so there might be issues I’m not aware of. As far as CLJS proper is concerned I don’t know. I’m not aware of anyone working on it.

Eventually I’d like to write a CLJS variant that actually just outputs “modern” JS (ES2018+ or so) instead of the old and “dead” Closure JS. That would however be a completely new compiler since we really can’t retrofit that into the current one. There are drawbacks to doing this though so this is very low priority for me.

Why do you say “dead”? Still seems alive and well maintained by Google. GitHub - google/closure-compiler: A JavaScript checker and optimizer. I see a commit just 1 hour ago from this writing.

The Closure Compiler is alive and well, that is not what I meant by Closure JS.

Closure JS is the “old” module format that used goog.provide and goog.require to do namespacing-related things. That style is dead, and the Closure Library is slowly migrating to the somewhat newer goog.module style or ESM directly. It is still well supported and not going anywhere, but the Closure Compiler is at a point where the newer formats are preferred. The CLJS compiler currently emits that “old” style, which is pretty much the reason we sometimes have issues integrating with other JS tools: most of them (besides the Closure Compiler) don’t understand that format properly.

Oh I see, I didn’t know Closure had itself slowly migrated to ESM.

Interesting, are you aware of any further discussions/tickets weighing the pros and cons of emitting goog.module or ESM? Briefly searched Jira but didn’t find much.

It would be ESM only if anything. goog.module doesn’t buy us anything over what we currently have.

Not aware of any “archived” discussions about the topic; it came up briefly in #cljs-dev some time ago. If we had pure ESM output, basically any JS tool out there would work out of the box. Heck, you could even load the output directly in the browser without bundling. Everything else is pretty much downside, since REPL and hot-reload get much harder, but it might be worth it for some interesting JS-based tools so we don’t have to re-invent everything all the time (eg. storybook).

Closure Compiler is still the best option because of :advanced, so I’d never want to give that up, but for some tools that’s just not needed.

Hi @thheller,

I see the two Google Closure PRs you mentioned above have been merged and you have recently pushed things to shadow (at least https://github.com/thheller/shadow-cljs/commit/8b7f3964e732bc13a9981ba390908dd90f70f2e0 and https://github.com/thheller/shadow-cljs/commit/7575a7d8533be396cd6a630e6aa36848abebbf0d) related to ESM.

Would you recommend giving this a spin, or are you aware of blocking issues still?

Thank you for your great work as always.

Doing what exactly? :browser is still the best option for everything browser based.

The changes that landed mostly don’t affect what shadow-cljs does, just maybe make things a little less hacky. Overall I would still only recommend ESM for targets where no better option exists (eg. Deno). For the browser or node the other targets are still way better.

Doing what exactly? :browser is still the best option for everything browser based.

I’d be interested in the following:

  1. using snowpack.dev for local caching. We already load deps like plotly or vega on demand using d3-require, but they currently don’t work when offline.
  2. also advanced-compiling modern JS (TypeScript) deps like GitHub - codemirror/view: DOM view component for the CodeMirror code editor, assuming that the emitted JS is Closure compatible.
  3. enabling easy consumption of our ClojureScript code from JS
  4. the isolation provided by ESM, allowing us to run different versions of the same code alongside each other

After writing the above, it’s beginning to dawn on me that only the last two points are really related to switching our ClojureScript build to ESM modules. Is that correct?

Sort of yes.

  1. You can also do this with :target :browser. The tricky part starts when the dependencies you are importing this way (eg. via dynamic import()) also start importing other dependencies. Since shadow doesn’t know about these, you may end up with many different versions of certain things. Not a big deal for your project, but potentially a blocker for others.

  2. Not related to :target :esm at all. :js-provider controls this. In theory you could put all the JS on the classpath, but the odds of that working successfully are not that great. All ESM code on the classpath will then go through :advanced. Technically there could be a variant of :js-provider :shadow that tries to put some node_modules code through :advanced as well, but I’d say it is way too early for that, and the overall messiness of ESM on npm needs to clear up a little first. Most of it isn’t actually “standard” and relies on some new idioms the JS world invented (in particular webpack) that aren’t standardized or even documented fully. My hope for ESM “fixing” things in the npm ecosystem has disappeared. Figuring all this stuff out is painful and I still don’t have a clue where this is going. Definitely waiting for things to settle before investing more energy here.

  3. That would likely improve with ESM yes.

  4. Only really possible with release builds, as the watch/compile builds fake ESM and still actually live in the global scope, so having two of those would break things again.

:target :esm currently makes the most sense when shadow-cljs is not bundling any JS dependencies at all and those are instead only loaded via true ESM. Currently Deno is the only runtime where this is a practical possibility, since everything there is ESM/TS. Snowpack might be an option too; I haven’t tried it in a long time, but that is about the same area as using webpack today. It might work though.

Thanks for all those clarifications @thheller!

Think we’ll give snowpack.dev a try, together with skypack.dev (oh, JavaScript!) and report back. From https://www.snowpack.dev/posts/2021-01-13-snowpack-3-0 you can see that it enables you to write this:

// you do this:
import * as React from 'react';
// but get behavior like this:
import * as React from 'https://cdn.skypack.dev/react@17.0.1';

So as I understand it, you get local caching and version pinning from snowpack, and browser-compatible ESM modules served from a CDN via skypack.

Although my thread is no longer about ClojureScript… calcit-js has explored combining persistent data structures with ES module syntax, that is, emitting code with import / export. It’s still a very young project, but in case anyone wants to try:

I read this post earlier. It’s mostly about Rails and their approach to JS but it also mentions a lot of the benefits in using ESM as release artifact.

With HTTP2, you no longer pay a large penalty for sending many small files instead of one big file. A single connection can multiplex all the responses you need. No more managing multiple connections, paying for multiple SSL handshakes. This means that bundling all your JavaScript into a single file loses many of its performance benefits (yes, yes, tree-shaking is still one).

When you bundle all your JavaScript modules in a single file, any change to any module will expire the entire bundle. […]

[…] you no longer need bundling for performance, you can get rid of the bundler entirely! Simply serve each module directly, as their own file, directly to the browser.

Some things I’m wondering:

  • The above sounds really nice in a lot of ways, so why isn’t it done?
    • One downside I can think of: gzip compression would be much less effective across many small files.
  • Output files from different CLJS builds are not compatible. Does this effectively make the ESM approach unusable?

Superficially the ESM approach seems similar to how ClojureScript dynamically loads files during development.

I guess whether this is interesting completely depends on how effective this new delivery approach is compared to the bundle(s) strategy. The caching of rarely changing dependencies seems like a pretty great benefit though.