Type hinting in defrecord?

Is there a way to type hint a record inside another record?

I have a record Amount

(ns user)

(defrecord Amount [^int amount ^String currency])

Is there a way to type hint a list of Amounts inside another defrecord?


(defrecord Bet [^Amount[] amounts ^int game-code ^String description])

Seems like

(ns user)

(defrecord Bet [^"[Luser.Amount;" amounts ^int game-code ^String description])

is the solution. Any other solutions?


You might want to look into the defrecord entry on ClojureDocs: defrecord - clojure.core | ClojureDocs - Community-Powered Clojure Documentation and Examples

Aside from maybe interop downstream, what benefit does the non-primitive array hint provide here? From the JVM perspective, it will be an object array (outside of some glacial pending changes for value types that are in the works). I really have not seen this pattern outside of code that has to interop with specific Java classes that specify a typed array (typically a string array).

There won’t be any discernible performance benefit that I know of (maybe the clojure compiler will emit type hints for you when getting from the array…). Additionally, you mentioned a list in your earlier question - the solution you proposed is an array, which has different properties that may or may not matter if you expected a list.

Are you coming from a statically typed language to clojure?

The records are data from an HTTP API. The intent is to spec the response and to save it to a db in a manner that is clear to other developers. What would you suggest?

Any ideas?

Thanks @jogo3000! Any pointers to spec / type hinting the record? Is spec’ing a record used widely?

I think just using spec or malli would be appropriate and pretty much designed for this use case.

The only times defrecord really shows up is when you want a map-like type with efficient access to a constant set of static fields, where you can define efficient inline protocol definitions. A lot of people coming from statically typed languages go overboard on defrecord instead of using more malleable and extensible maps. I think it’s because records give you a type to dispatch on and static fields…they fill the familiar object + class paradigm or record/struct in other languages. In clojure they are not really the default though.

Instead of a record, you could define a spec that specifies the keys and associated types of the entries (prior to malli, metosin’s other spec lib spec-tools had data-specs that make this really easy and declarative). Specs in general also admit recursive/nested definitions.

Specs work with any predicate, so you could certainly leverage them on records with the associative specs or include type predicates. Or, just spec a map and ensure the input you are pulling from the HTTP api conforms or validates against the spec.
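As a minimal sketch of the map-spec approach (the keyword and key names here are illustrative, chosen to mirror the records in the question, not from any actual API):

```clojure
(ns user
  (:require [clojure.spec.alpha :as s]))

;; spec the entries of an amount map
(s/def ::amount int?)
(s/def ::currency string?)
(s/def ::amount-map (s/keys :req-un [::amount ::currency]))

;; a bet is a map with a collection of amounts
(s/def ::amounts (s/coll-of ::amount-map))
(s/def ::game-code int?)
(s/def ::description string?)
(s/def ::bet (s/keys :req-un [::amounts ::game-code ::description]))

(s/valid? ::bet {:amounts [{:amount 100 :currency "EUR"}]
                 :game-code 1
                 :description "test"})
;; => true
```

Validating the incoming HTTP response against ::bet then replaces the structural guarantees you were reaching for with the array hint.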


Spec’ing a map and ensuring the incoming response validates against it.

Got it.

The way i see defrecord is that there are a set of keys declared but not enforced strictly.

With spec, it is more about strictness and less about readability at a quick glance. I may be mistaken though. Will check this.


(defrecord point [^long x ^long y])

defrecord does a couple of things, with everything living on top of deftype and the host system (probably the jvm…). On the jvm/clr, defrecord is a macro that defines a map container using deftype to implement all the clojure interfaces necessary for persistentmap semantics, and any user-supplied protocols (that aren’t already supplied by defrecord for the persistent map implementation). The macro also defines convenience functions for constructing the record from positional arguments and from generic maps.

deftype ends up emitting a class (haven’t looked deeply at cljs, your mileage may vary) with potentially typed positional constructors. The direct constructor (e.g. for the newly defined and loaded class user.point) will have, as per deftype, extra fields that the defrecord macro added behind the scenes:

[x y __meta __extmap __hash __hasheq]

The underscored fields are used for implementing/carrying metadata, a map of non-static entries (entries that are not the fields x or y, corresponding to :x or :y in the record), and hash codes for clojure’s equality semantics (typically cached once computed). This is hidden from the caller though, but they exist in the deftype invocation. So the corresponding class user.point actually has a constructor of 6 args if we invoke it directly (I think there are overloads provided to allow just the 2-arg positional field-based constructor as well though). Since we provided primitive type hints on the fields x and y, they are checked in the constructor too, and the provided functions give us vars we can refer to for convenience (and readability) instead of using/importing the class all the time:

user=> (user.point. 1 2 nil nil 0 0)
#user.point{:x 1, :y 2}
user=> (user.point. 1 2)
#user.point{:x 1, :y 2}
user=> (user.point. :a :b)
Execution error (ClassCastException) at user/eval229 (REPL:1).
clojure.lang.Keyword cannot be cast to java.lang.Number
user=> (->point 1 2)
#user.point{:x 1, :y 2}
user=> (->point :a :b)
Execution error (ClassCastException) at user/eval178$->point (REPL:1).
clojure.lang.Keyword cannot be cast to java.lang.Number
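For completeness, defrecord also emits a map-based constructor alongside the positional one. A small sketch, continuing with the same point record:

```clojure
(defrecord point [^long x ^long y])

;; positional constructor var
(->point 1 2)
;; => #user.point{:x 1, :y 2}

;; map-based constructor var emitted by defrecord; keys beyond the
;; static fields land in the hidden __extmap and still print as entries
(map->point {:x 1 :y 2 :z 3})
;; => #user.point{:x 1, :y 2, :z 3}
```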

Type information will also be propagated on static fields if we access them directly (as opposed to the generic map-access that is also supported):

user=> (let [^point p (->point 1 2) x (get p :x)] (+ x 2))
Boxed math warning, NO_SOURCE_PATH:1:44 - call: public static java.lang.Number clojure.lang.Numbers.unchecked_add(java.lang.Object,long).
user=> (let [^point p (->point 1 2) x (.x p)] (+ x 2))

So there is some limited albeit useful typing provided at the record level. You get constructors that can implicitly act as a validation layer if and only if you use primitive type hints. For class type hints, all bets are off since things are stored as objects and merely hinted/claimed to be what they are. The following example happily passes through the type system:

user=> (defrecord blah [^String x ^String y])
user=> (->blah 2 3)
#user.blah{:x 2, :y 3}

So non-primitive fields on records don’t buy you much on the structural validation front. They can help supply type hints for subsequent operations on fields though (as long as the hints are correct) to cut off reflection calls.
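A small sketch of that reflection benefit (the protocol and record names here are made up for illustration): inside the record’s own method bodies, the field hint is honored, so interop calls on the field resolve without reflection.

```clojure
(set! *warn-on-reflection* true)

(defprotocol Shout
  (shout [this]))

;; within defrecord method bodies, fields are in scope and their
;; hints apply, so .toUpperCase compiles to a direct String call
;; with no reflection warning
(defrecord labeled [^String s]
  Shout
  (shout [_] (.toUpperCase s)))

(shout (->labeled "hi"))
;; => "HI"
```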

Given that, we have your observation:

The way i see defrecord is that there are a set of keys declared but not enforced strictly.

The associative semantics for records are similar to maps: you can assoc values into them. Operationally, the implementation checks whether the key being assoc’d corresponds to a static field on the record; if so, a new record is instantiated with that field value. Otherwise the association targets __extmap, the hidden map of non-static (external) entries. Since these operations may flow through the record’s constructor, an assoc that targets a primitive typed field can give you some type checking transitively:

user=> (def p (->point 1 2))
user=> (assoc p :x 4)
#user.point{:x 4, :y 2}
user=> (assoc p :x :a)
Execution error (ClassCastException) at user.point/assoc (REPL:1).
clojure.lang.Keyword cannot be cast to java.lang.Number

dissoc works similarly, except if we dissoc a key corresponding to a static field, the implementation coerces the result into a persistent map since the record contract no longer holds:

user=> (type (dissoc p :x))
clojure.lang.PersistentArrayMap

Non-static fields are redirected to the __extmap as with assoc.

So we can freely assoc within the record contract and maintain any primitive type information and some layer of validation, and we can freely dissoc any non-static fields. If we dissoc a static field, all the goodies from the record (fast static field access, inline/custom protocol implementations, custom type, type hints on fields, ordered static field printing) are ditched in the resulting coerced persistent map. So records can “silently” downgrade based on the semantics of dissoc (caller beware).

I would say the primary benefits of records are efficient access to static fields, primitive/typed fields, support of generic map operations, extensible keys, some IMO weak validation, and inline protocol implementations. One common example of the latter is a custom clojure.lang.IFn implementation to make the record invokable (it is not by default); this shows up in some AST implementations. Records have a nice mix of the efficiencies of objects with named static fields, while retaining map semantics and extension.
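For instance, a minimal sketch of that IFn pattern (the record name is invented for the example); deftype/defrecord let you implement just the interface arities you need:

```clojure
;; an invokable record: implementing clojure.lang.IFn inline so the
;; record can be called like a function (only the one-arg arity here)
(defrecord lookup-table [entries]
  clojure.lang.IFn
  (invoke [this k] (get entries k)))

((->lookup-table {:a 1 :b 2}) :a)
;; => 1
```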

With spec, it is more about strictness and less about readability at a quick glance.

Spec is pretty flexible IMO, particularly with all the regex-inspired matching you can do on data. I think the community precursor (still in use) schema focused on declarative first, and data specs bridged that gap quite a bit to enable simple declarative map specs.

What’s really nice about spec and peers, though, is the ability to build up fairly complicated specifications on the structure of the data, as opposed to relatively simplistic type information. You can use arbitrary predicates (although it’s typically more useful to use the structure-based primitive specs they provide, like sets) and compose them in many ways. It feels more like dependent typing to me, or programmable contracts. The other benefits (generating random data for property-based testing) are slick as well. Malli looks really excellent and probably exceeds spec/spec2 in capability from what I have seen; the only downside is it’s not bundled as part of clojure.
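A quick sketch of both points - arbitrary predicate composition and data generation (the ::stake spec is invented for the example, and gen/sample assumes org.clojure/test.check is on the classpath):

```clojure
(require '[clojure.spec.alpha :as s]
         '[clojure.spec.gen.alpha :as gen])

;; compose arbitrary predicates, not just type-like checks
(s/def ::stake (s/and int? even? #(<= 0 % 1000)))

(s/valid? ::stake 42)  ;; => true
(s/valid? ::stake 43)  ;; => false

;; generate random conforming values for property-based testing
;; (results vary per run)
(gen/sample (s/gen ::stake) 5)
```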


I cobbled together a small example using specs, data specs, maps, and your records here.


Thank you so much, it did not strike me that I could apply the Java semantics for the member access!

Very grateful for the code example on highlighting the issues.

Is it possible in this case that since the entity maps are cleary spec’ed out, we will not even need to create the defrecord?

Is it possible in this case that since the entity maps are cleary spec’ed out, we will not even need to create the defrecord?

Yeah, that’s the primary idea. The default recommendation from the clojure core team is that you leverage maps, fully-qualified keywords, and specs to handle your information model. You only move to records if you need their features (typically performance, or custom protocol implementation, or a combination of both). There are some potential pitfalls around serialization and differing record implementations (e.g. the class changed at some point, so the serialized object may not coerce, unless you serialized it in a clojure form that uses the reader literal). The other (for me more common) one is at dev time, when we may be messing around in the repl a lot. If we have some state defined that is a record, and we find a problem with, say, a protocol implementation on the record definition, we may fix the implementation and reload the defrecord form. That creates a new underlying type for the record that is technically not compatible with any instances of the earlier record retained in memory (example from Ben Mabey):

user=> (defrecord x [a b])
user=> (def x1 (x. 1 2))
user=> (defrecord x [a b])
user=> (def x2 (x. 1 2))
user=> (= x1 x2)
false

They also conflict with the spec vision of using namespace-qualified keywords everywhere (I don’t really do this and tend to stick with unqualified by default, so it’s not a problem for me), since all the static fields in the record representation are accessed by unqualified keys (like :x vs :some.ns/x). It’s not a huge problem with stuff like the data-specs though, but some people love their qualified keys.
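That said, s/keys bridges the gap with :req-un, which registers specs under qualified keywords but matches the unqualified keys a record actually carries. A small sketch:

```clojure
(require '[clojure.spec.alpha :as s])

;; specs are registered under qualified keywords...
(s/def ::x int?)
(s/def ::y int?)
;; ...but :req-un matches the record's unqualified :x and :y keys
(s/def ::point (s/keys :req-un [::x ::y]))

(defrecord point [x y])

(s/valid? ::point (->point 1 2))
;; => true  (records satisfy map specs, since they are associative)
```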

If you don’t need the benefits of the record, plain maps are the simpler choice (there are some good discussions in the Maps Vs. Records and What is the 2021 Recommendation for Specs threads on the interplay between records and maps).

To be fair, I had not really dug into the schema lib since it was a precursor to spec and I wasn’t really paying attention to it even when it came out back in the day. It might be right up your alley though, looks actively maintained, and has some tight integration with validation, coercion, and the bases spec/malli cover as well (although no regex patterns and the like, from what I saw), with a stronger focus on concrete specification of nested data (e.g. no need for spec-tools data specs).

I guess the bottom line is: you have options. The recommended default is general/extensible maps + some verification/validation mechanism to help tame the potentially unbounded information model (e.g. spec, malli, schema, with spec being cognitect’s solution). You can still readily apply the same mechanism to records should you choose them though. Records add a little more organizational overhead (more types, additional validation, more hinting) that may be redundant if you are going to spec them anyway (schema has some cool forms that integrate schema definition and record definition pretty nicely though, and I think there are probably similar libs for spec and malli - or could be if one were interested).

Rich’s spec talk
Maybe Not - spec2 discussion and ‘situated programs’ and information models


@joinr Thank you so so much for the effort to enlighten me.

I was not expecting anything more than a simple succinct answer when I posed this question; but the effort you have put in to help educate me on the whole spectrum of spec vs records; I cannot but sing praises.

I am sure I’ve learned a lot more on specs and records than I did the whole year, and my whole team is richer because of this. Thank you.


Line 17 seems to have an extra ‘s’ character in ‘pos’ in your example.