What is the request to thread mapping when using ring jetty adapter?

I am using ring.adapter.jetty/run-jetty to serve my HTTP requests. As I understand it, newer versions of the Jetty server do NOT uniquely assign a thread to every incoming request. Instead, they leverage the java.nio async libraries to return the thread to the thread pool when it would otherwise block on an I/O task (say, a database call).

I think this question might apply to the JVM platform in general, but if I am writing a simple HTTP CRUD application in Clojure using ring-jetty-adapter 1.8.2, then:

  • Does my thread get blocked during the I/O (reading/writing to the db), or does it get returned to the thread pool? If it gets returned to the thread pool during the blocking phase, how does it know that an I/O-heavy task is coming up, given that nowhere in the code do we indicate that we are doing an I/O-heavy task?
  • If it gets blocked, then I am wasting the precious thread’s time. How can I indicate to Jetty that the thread should be returned to the thread pool?

The thread-sharing that you mention does not happen unless you make it happen: it is a distinct Java EE Servlet API: see https://docs.oracle.com/javaee/7/tutorial/servlets012.htm. Clojure wrappers surface it in various ways. For example, Aleph request handlers can return a Manifold Deferred, and Pedestal request handlers can return a core.async channel. In both of those cases, there is no constraint (or help) on how the programmer should find a thread to deliver the HTTP response to the Deferred or the channel.
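In plain Ring on Jetty, the same servlet async machinery is exposed through three-argument handlers: starting run-jetty with :async? true lets a handler receive respond and raise callbacks instead of returning a response map directly. A minimal sketch (the handler body and port are illustrative):

```clojure
(require '[ring.adapter.jetty :refer [run-jetty]])

;; 3-arity async handler: instead of returning a response map,
;; it calls `respond` (or `raise`) whenever the work completes.
(defn handler [request respond raise]
  ;; Hand the slow work to another thread so the Jetty request
  ;; thread can go back to the pool immediately.
  (future
    (try
      (respond {:status  200
                :headers {"Content-Type" "text/plain"}
                :body    "done"})
      (catch Exception e
        (raise e)))))

;; :async? true tells the adapter to use the servlet async API.
(run-jetty handler {:port 3000 :async? true :join? false})
```

Note the same caveat applies as with Aleph and Pedestal: Ring gives you the callbacks, but finding a thread to eventually call respond from is your problem.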


Oh, so in the normal case, the unit of concurrency becomes the thread? The number of requests that I can handle concurrently == the number of threads available?

Also, if I understand correctly, this answer seems to indicate that the thread sharing happens by default: https://stackoverflow.com/a/25195392

Oh, so in the normal case, the unit of concurrency becomes the thread? The number of requests that I can handle concurrently == the number of threads available?

Right. Unless you purposely use the non-blocking servlet API, which in Clojure is done with techniques like returning the response as a Manifold Deferred or a core.async channel, depending on the servlet-adapter library. You’ll see this in the SO answer too: “The blocking API [in Jetty] uses a special blocking callback to achieve blocking.”

Edit: But I’m not sure what you mean by “the number of threads available”. That is a pretty high number; if I recall, Tomcat used to default to 250 threads. Threads blocked on I/O need not bog the system down very much.
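For what it’s worth, the Ring Jetty adapter lets you size that pool directly (I believe the default maximum is 50 threads; the exact numbers below are illustrative):

```clojure
(require '[ring.adapter.jetty :refer [run-jetty]])

(defn handler [request]
  {:status 200 :headers {} :body "ok"})

;; With a synchronous handler, :max-threads is effectively the
;; ceiling on requests being processed at the same time.
(run-jetty handler {:port        3000
                    :min-threads 8    ; keep a few threads warm
                    :max-threads 200  ; upper bound on the pool
                    :join?       false})
```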


Jetty 9 is a hybrid of async and sync.

It’ll process everything async internally, but fall back to sync if need be. So it’ll assign a thread to handle the request, and your handler can block, in which case the thread is blocked, and Jetty will use another thread to handle the next request. But your handler can also choose not to block, and instead go into an async pending state, in which case Jetty will park the request and the thread will be released to the pool, so when another request comes in it can reuse that same thread.

It still means it’s up to your handler. If your handler blocks the thread, there’s nothing Jetty can do; the thread is blocked, and Jetty will need another one for concurrent requests.

But it gives your handlers the opportunity not to block, and instead to tell Jetty to park the request and wake it up when the I/O is done.
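To make the contrast concrete, here is a sketch of the two styles side by side (db-query is a hypothetical stand-in for a real database call):

```clojure
(defn db-query []
  ;; Hypothetical I/O-heavy call; stands in for a real database query.
  (Thread/sleep 100)
  "rows")

;; Blocking style: the Jetty thread is held for the whole query.
(defn blocking-handler [request]
  {:status 200 :headers {} :body (db-query)})

;; Non-blocking style: the Jetty thread returns to the pool right away;
;; `respond` is called from another thread when the query finishes.
(defn parked-handler [request respond raise]
  (future
    (try
      (respond {:status 200 :headers {} :body (db-query)})
      (catch Exception e
        (raise e)))))
```

The second form only helps, of course, if the work really does get off the request thread; here a future is used for illustration, but a real app might use a dedicated I/O pool or a driver with its own async API.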

Yes, if you use Ring’s synchronous handler, and don’t do anything special to become async within it.

That said, the number of requests you are concurrently handling is not the same as the number of requests you can accept, because there’s also a queue of requests waiting to be handled. So it’s not necessarily the case that once you reach your configured max threads you start rejecting incoming requests; often you can accept more of them and queue them up.
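If memory serves, the Ring Jetty adapter exposes that queue too, via a :max-queued-requests option; treat the option name and numbers below as assumptions to check against your adapter version:

```clojure
(require '[ring.adapter.jetty :refer [run-jetty]])

(defn handler [request]
  {:status 200 :headers {} :body "ok"})

;; Requests beyond :max-threads wait in a bounded queue rather than
;; being rejected outright; only when the queue is also full do new
;; connections start failing.
(run-jetty handler {:port                3000
                    :max-threads         50
                    :max-queued-requests 1000  ; assumed option name
                    :join?               false})
```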
