Migrating from clj-time to java-time

Hi everyone,

I’ve started the process of migrating our system from clj-time to java-time. I’ve been doing it in small stages since introducing Postgres into the system. All the recent namespaces and data have been handled by java-time with great success. I’d like to standardize on java-time in the long run.

At a higher level I’ve also started formalizing what these date values mean to our domain, and this has been a very fruitful exercise. Now that I’ve hit parts of the system where java.time, inst, and joda.* values start flowing through the same functions, it has gotten interesting. So far I’ve managed to use a protocol with “domain specific functions”, extending the various types to do the required dances, and it has worked out quite well.

We’re very lucky in our domain that so far we’ve only cared about “local dates”, so working with postgres date columns and Datomic :db.type/instant attributes has been smooth sailing.

My question is about getting java.time.LocalDate values in and out of Datomic, and making sure I don’t accidentally shift a day. I understand Datomic only uses instant (java.util.Date) values under the hood.

Reading existing values, I know I can pass them to java-time/local-date, providing UTC as the time zone, and I’ll get the date I want. I accept this because everything I’ve read about java.util.Date and instants says the underlying value is in UTC, regardless of what toString() produces in the REPL.
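For what it’s worth, the same read path can be expressed directly against the Java API. This is only a sketch; the helper name `fromDatomicInst` is mine, not anything from Datomic or java-time:

```java
import java.time.LocalDate;
import java.time.ZoneOffset;
import java.util.Date;

public class ReadLocalDate {
    // Interpret a java.util.Date coming out of Datomic as a LocalDate in UTC.
    // Date stores epoch millis, so going via Instant loses nothing.
    static LocalDate fromDatomicInst(Date inst) {
        return inst.toInstant().atZone(ZoneOffset.UTC).toLocalDate();
    }

    public static void main(String[] args) {
        // Epoch millis for exactly midnight UTC on 2024-03-01.
        Date inst = new Date(1709251200000L);
        System.out.println(fromDatomicInst(inst)); // 2024-03-01
    }
}
```

As long as the stored instant really is midnight UTC, the extracted date can’t shift, whatever the JVM’s default time zone is.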

For casting to an instant I’ve found an example using java.util.Date.from(), passing in the java.time.LocalDate object, and I still need to play with that a bit more to ensure it does what it says on the tin.
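One detail worth knowing here: java.util.Date.from() accepts an Instant, not a LocalDate, so the LocalDate has to be anchored to a time and zone first. A minimal sketch of the write direction, with a round trip to check for day shifts (`toDatomicInst` is a hypothetical name of my choosing):

```java
import java.time.LocalDate;
import java.time.ZoneOffset;
import java.util.Date;

public class WriteLocalDate {
    // LocalDate -> java.util.Date: Date.from only takes an Instant,
    // so pin the date to midnight UTC first.
    static Date toDatomicInst(LocalDate date) {
        return Date.from(date.atStartOfDay(ZoneOffset.UTC).toInstant());
    }

    public static void main(String[] args) {
        LocalDate d = LocalDate.of(2024, 3, 1);
        Date inst = toDatomicInst(d);
        // Reading back through UTC recovers the same date: no day shift.
        LocalDate back = inst.toInstant().atZone(ZoneOffset.UTC).toLocalDate();
        System.out.println(d.equals(back)); // true
    }
}
```

Using ZoneOffset.UTC on both the write and read side is what makes the round trip safe; mixing UTC on one side with the system default zone on the other is exactly how the off-by-one-day bugs appear.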

Is there something obvious I’m missing in this context?

I’m well aware of the obvious insanity of switching date/time handling approaches :wink: I also know we’re very fortunate that we don’t need to deal with time, or time zones, only local dates, and this significantly de-risks the entire endeavor.

Even if you have experiences to share about the differences between the two libraries, and the gotchas in their respective ways of thinking about dates, we could build up some knowledge here for others to find in the future.

I feel your pain… When I started with Clojure (2011 for production work, I’d been playing with it for a year before that), we were on Java 7 and using java.util.Date, primarily with date-clj for date arithmetic. Since we had every system on UTC, we started using clj-time for UTC dates and arithmetic – but then explicitly coerced to/from java.util.Date around database access. Since then we’ve moved to Java 8 and started using Java Time natively as well (because it’s a better API) – and more recently the clojure.java-time library.

With clojure.java-time, it’s pretty easy to go back and forth between Java Time values and either instants or SQL date/time stuff but I agree that it’s very painful to work in a code base that has both clj-time and Java Time in it (even wrapped in clojure.java-time).
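The SQL bridge Sean mentions is backed by conversion methods that Java 8 added to the java.sql classes themselves, so the plumbing is the same whichever wrapper library you use. A small sketch (class and method names here are mine):

```java
import java.time.Instant;
import java.time.LocalDate;

public class SqlBridge {
    // java.sql.Date <-> LocalDate (both methods were added in Java 8).
    static java.sql.Date toSqlDate(LocalDate d) {
        return java.sql.Date.valueOf(d);
    }

    static LocalDate fromSqlDate(java.sql.Date d) {
        return d.toLocalDate();
    }

    // java.sql.Timestamp <-> Instant, also from Java 8.
    static java.sql.Timestamp toTimestamp(Instant t) {
        return java.sql.Timestamp.from(t);
    }

    public static void main(String[] args) {
        LocalDate d = LocalDate.of(2024, 3, 1);
        System.out.println(fromSqlDate(toSqlDate(d)).equals(d)); // true

        Instant t = Instant.parse("2024-03-01T12:00:00Z");
        System.out.println(toTimestamp(t).toInstant().equals(t)); // true
    }
}
```

java.sql.Date.valueOf and toLocalDate are symmetric (both work on the year/month/day fields), so that particular round trip is zone-safe.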

I’m not sure what the best recommendation is at this point. We still mostly allow java.util.Date objects to flow around our application because that means no conversion in or out of the database (MySQL in our case). So we use Java Time where we need to do date arithmetic (mostly). If we were starting over, I think we’d put coercions in place around the DB access so we used Java Time Instants everywhere instead.

Full disclosure: I’m a co-maintainer of clj-time and I’m pretty vocal about encouraging people not to use clj-time when starting a new project: use Java Time instead. Conversion from an existing, clj-time-heavy project is another matter tho’, unfortunately.

Thanks Sean, this is insightful!

I reread @plexus’s post on Dates in Clojure yesterday after posting this, and spent a few hours changing most of my “new” code to work on java.time.Instant values. I also refactored the new protocol I introduced to help with the multiple different types, but what I did here that helped clarify things a lot was to be more specific with the function names, more domain-like, and it is looking really good so far.

In particular I’ve added at-start-of-day, since that’s what we care about and instants carry hour/minute/second values, and I’ve added to-datomic-inst and from-datomic-inst that I can use at the edges when getting values in and out of Datomic. On the Postgres side we’re very lucky thanks to the hard work done by the Luminus community: everything going in and out of Postgres flows through clojure.java-time.
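The at-start-of-day normalization described above has a one-liner equivalent in plain java.time, since Instant.truncatedTo supports units up to DAYS (which for an Instant means midnight UTC). A sketch, with a helper name of my own invention:

```java
import java.time.Instant;
import java.time.temporal.ChronoUnit;

public class StartOfDay {
    // Drop the hour/minute/second components of an Instant,
    // i.e. snap it to midnight UTC of the same day.
    static Instant atStartOfDay(Instant t) {
        return t.truncatedTo(ChronoUnit.DAYS);
    }

    public static void main(String[] args) {
        Instant t = Instant.parse("2024-03-01T17:45:30Z");
        System.out.println(atStartOfDay(t)); // 2024-03-01T00:00:00Z
    }
}
```

Normalizing every stored instant to start-of-day like this means any later UTC read will always land on the intended date.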

This is messy, for sure, but I’m super impressed with how relatively smooth this transition is, all things considered. I think having this protocol is a saving grace, as is having formal ways of expressing what the date values convey rather than “reading between the lines”.

Thanks for helping with clj-time. When I first started using it, it really changed how I thought about working with dates in general, for the better.


> encouraging people not to use clj-time when starting a new project: use Java Time instead.

Libs like jsonista now support Java 8 time instances out of the box; using joda-time would require extra bootstrapping and an extra dependency. We’re pushing these changes to other libs too.

Thanks Kenneth for reminding me of that post. I added an update to the start of the blog post, since the advice given at the end, to use JodaTime as the common format, is no longer what I would advise today.

> I’ve added to-datomic-inst and from-datomic-inst that I can use at the edges of getting values in and out of datomic

I believe the Nubank folks have their own variant of d/entity that does these conversions automatically. That’s something to look into. The more you can convert to a common format (e.g. java.time) automatically at the edges, the better, IMO. Maybe @philomates can tell you more about that.


Hey! Looking at the internals of our Datomic API, we do have some automatic conversion going on. This is my first time diving into that code, but here is what I can understand from skimming it:

The way we wire up automatic conversions is by introducing a new :db/transformer attribute in our Datomic schemas.
Our LocalDateTimes are then stored as insts with the :db/transformer attribute set to :date-time.

To retrieve and store values tagged with :date-time we essentially do (fn [inst] (java.time.LocalDateTime/ofInstant (java.time.Instant/ofEpochMilli (.getTime inst)) java.time.ZoneOffset/UTC)) and (fn [local-date-time] (-> local-date-time (.toInstant java.time.ZoneOffset/UTC) .toEpochMilli java.util.Date.)) respectively.
We use this setup for other data-types like edn and urls.
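In plain java.time terms, those two transforms are a LocalDateTime ↔ java.util.Date round trip through epoch millis at UTC. A sketch of the same logic in Java (the method names `read`/`write` are mine, not Nubank’s):

```java
import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneOffset;
import java.util.Date;

public class DateTimeTransform {
    // Read transform: Date -> LocalDateTime, interpreting the millis as UTC.
    static LocalDateTime read(Date inst) {
        return LocalDateTime.ofInstant(Instant.ofEpochMilli(inst.getTime()), ZoneOffset.UTC);
    }

    // Write transform: LocalDateTime -> Date via an Instant at UTC.
    static Date write(LocalDateTime ldt) {
        return new Date(ldt.toInstant(ZoneOffset.UTC).toEpochMilli());
    }

    public static void main(String[] args) {
        LocalDateTime ldt = LocalDateTime.of(2024, 3, 1, 12, 34, 56);
        // Date only holds millis, so any sub-millisecond precision would be
        // dropped; at whole-second precision the round trip is exact.
        System.out.println(read(write(ldt)).equals(ldt)); // true
    }
}
```

Because the same ZoneOffset.UTC is used on both sides, the pair is a true inverse (down to millisecond precision), which is what makes storing LocalDateTimes as insts safe.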

What is especially nice about all this, and what allows developers to ignore these details, is that we have a Plumatic schema to Datomic schema converter. Engineers interact with Datomic by manipulating objects that satisfy Plumatic schemas. Under the hood, Plumatic schemas are registered with Datomic automatically and values read from Datomic are transformed automatically.

Of course, take all this with a grain of salt; I am simply a user of this setup and am skimming the code for the first time. For instance, there seems to be plenty of caching and lazy resolution of Datomic results to make the whole Plumatic thing work.


This topic was automatically closed 182 days after the last reply. New replies are no longer allowed.