Sync vs Async

I’ve been telling people that Sync means “happening at the same time” and Async means “not happening at the same time”. People have challenged me on this interpretation, but I stand by it. Not only is it the dictionary meaning, it’s also the logical CompSci way to interpret the terms. Let me explain:

In a single-threaded computation model, things can never happen at the same time unless concurrency is added: for example, one process starts and is interrupted half-way (by cooperatively yielding), another process runs to completion, and then the prior process is resumed. And with multi-threading you get parallelization, where many things can run at the same time.

So with concurrency and parallelization, you have two or more processes which run at the same time.

;; Doing two things in parallel
(future (do-something))
(future (do-something-else))

;; Doing two things concurrently
(let [continue (do-something)] ;; do-something will yield half-way
  (do-something-else)
  (continue)) ;; call the function returned by do-something to finish what it was doing
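
To make the concurrent sketch concrete, here’s one way do-something could be written so that it really does yield half-way. These definitions are illustrative only, not part of the original example:

(defn do-something []
  (println "do-something: first half")
  (fn []                                  ;; yield by returning a continuation
    (println "do-something: second half")))

(defn do-something-else []
  (println "do-something-else"))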

Now, if do-something and do-something-else can be executed “at the same time”, you don’t need asynchronous programming at all. That’s right, you need absolutely no mechanism to “synchronize” them. You don’t need callbacks, you don’t need .then, you don’t need await, you don’t need <!, etc.

The reason you don’t need async programming features is because both operations can happen synchronously, meaning they can happen at the same time.

What if they can’t happen at the same time? Well, then you need asynchronous programming features, because you need a mechanism to enforce that they do not happen synchronously, meaning that they do not happen at the same time.

So you see, if you have concurrent or parallel processes but the order in which they execute matters, you need to make your code asynchronous, so that you can enforce the order in which they must execute.
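
For example, with core.async you could enforce that order explicitly. A minimal sketch, assuming do-something and do-something-else are ordinary side-effecting functions as above (the channel plumbing here is mine):

(require '[clojure.core.async :refer [go chan >! <!]])

;; Enforce "do-something-else only after do-something" across two
;; concurrent go blocks, using a completion signal on a channel.
(let [done (chan)]
  (go (do-something)
      (>! done :done))    ;; signal completion
  (go (<! done)           ;; park until the signal arrives
      (do-something-else)))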

Now, where I think it gets confusing is that we say we need to synchronize concurrent or parallel code which cannot run at the same time, and that’s true. This is because to synchronize means to make synchronous in operation, and this is exactly what we want.

If we have code that cannot be run synchronously, meaning it cannot be run at the same time, we want to transform it into code that can be run synchronously, meaning code that can safely run at the same time. This is done using asynchronous programming, which is the technique of taking the code that was running at the same time (but shouldn’t be) and making it so it does not run at the same time, thus asynchronously.

I know this is confusing, so let me recap.

If you add concurrency or parallelization to some code, you make it so that multiple processes can run at the same time. So you made the code synchronous (happening at the same time). Now, while you made it so multiple processes do run at the same time, logically it might be that they shouldn’t be running at the same time, because that leads to race conditions and other bugs and logical errors in your program. So you need a way to make the code that is now running at the same time (due to the added concurrency or parallelization) not run at the same time, even in the presence of that concurrency or parallelization. That is, you need to make the code asynchronous (not happening at the same time).

There you go :wink:

Now you might say: but what’s the point of making the code synchronous by adding concurrency or parallelization, only to go and make it asynchronous again afterwards? Why add the concurrency and parallelization to begin with? What a good question! The answer is that this would be absolutely stupid, and you’d only make the code slower overall than simply keeping it single-threaded without concurrency or parallelization, except for the fact that in reality, you don’t make all of your code asynchronous, only the bits that cannot run at the same time.

What do you think?

I’m not necessarily disagreeing with your thoughts, but for me synchronous always meant one-at-a-time, order-dependent processing. Asynchronous meant any form of coordinated/supervised execution, where synchronisation happens at specific points forced by the programmer (await 3 things, wait for 1 of 2 things, finally do another thing; the synchronisation points are in between the steps).

This means an asynchronous process is composed of other sync or async processes, with some synchronisation points. This notion always reminded me of monads somehow, and it “just fit” in my mind.

I’m not so sure about synchronous meaning “can be concurrent/parallel”; while you’re technically right, I feel we’re mixing things up here. If we take a synchronous process to mean something like “runs from start to finish, atomically and blocking”, two or more of those which are not interdependent can be run concurrently, but that’s not interesting. What about those interdependencies? If there are some, we need asynchronicity to schedule them correctly for the dependency requirements to be fulfilled; they cannot simply run concurrently or in parallel.

So for tasks which are independent of each other, it doesn’t matter if you run them in a linear/sequential fashion or concurrently, while a set of tasks with interdependencies needs async coordination to “wrap them up” into something more like one synchronous task. It seems to me the whole thing is less about how things are executed and more about a way tasks can be divided and their parts coordinated that gives more optimal execution (denser resource utilisation over any unit of time).

I think your description works, but the closer parallel is synchronous vs asynchronous learning. Synchronous means the students and teachers are in the same place at the same time, whereas asynchronous means they’re not, and the students can learn on their own time.

With code, the same place means the same thread. Synchronous code all runs in the same thread whereas asynchronous code doesn’t. The thread could be an explicit thread created by the program or some event handler, the result is the same.

I think this is the common interpretation, I just find it confusing. For example, order dependence is inherent to the behavior, not the execution. Something that is asynchronous is often still order-dependent and executes one thing at a time.

Compare:

(let [customer (get-customer id)                       ;; blocks until the customer is fetched
      new-customer (change-email customer new-email)]
  (save-customer new-customer))

With an async version:

(get-customer
  id
  (fn [customer]                      ;; callback invoked once the customer arrives
    (-> (change-email customer new-email)
        (save-customer))))

Both of those are doing one-at-a-time, order-dependent processing.
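
For the record, both snippets assume functions like these exist. Here are hypothetical stubs under which both versions actually run:

;; Hypothetical stubs, for illustration only.
(def db (atom {42 {:id 42 :email "old@example.com"}}))

(defn get-customer
  ([id] (get @db id))                               ;; synchronous arity
  ([id callback] (future (callback (get @db id))))) ;; async arity, takes a callback

(defn change-email [customer new-email]
  (assoc customer :email new-email))

(defn save-customer [customer]
  (swap! db assoc (:id customer) customer))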

I’m kind of arguing against that interpretation though. I’m doing it because, while that interpretation appears to work, I think my interpretation allows a better intuition into the process (well, it does for me; your mileage may vary).

I take synchronous to mean happening at the same time… Or better to say happening on the same time. While asynchronous means happening at different times, or maybe better to say happening on different timings.

Also, in my interpretation sync and async are only relevant when you’re doing more than one thing at a time.

Might be clearer if I bring in time into it. Imagine we have two timelines:

T1:    1.  2.  3.  4.  5.  6.

T2: 1.  2.  3.  4.  5.  6.

Those are out of sync, they’re happening asynchronously. You couldn’t just say that at time 1 of T1 we should get a customer, and at time 2 of T2 we should change the customer email. You can’t do that because there’s no guarantee that time 2 of T2 will be after the time that T1 will have received the customer. Even if you knew that T1 will for sure have the customer by its T2 time, their timings are out of sync, so you couldn’t guarantee that say time 3 of T2 will always be after time 2 of T1.

If we had only one timeline, that would be easy.

T1: 1.  2.  3.  4.  5.

We can say at time 1 we get the customer, at time 2 it is received, so at time 3 we can safely change the customer email and at time 4 we can save the customer and be done saving at time 5. This is now synchronous, as everything is happening on the same time when it should.

Ok, but this is naturally synchronous as well: if getting the customer is done using the CPU clock and the customer is fetched from memory, then it’s all synchronous. But most likely the customer is fetched from a database on some other machine, so the fetch is working on its own time. You don’t know when it will be done. So you naturally have a problem where you have multiple timelines.

But because the operations cannot happen in random order, you need a way to coordinate them.

That’s where you can coordinate them synchronously or asynchronously.

Synchronous coordination means that even though they run on different timelines, you will synchronize their timelines so they are running at the same timings.

T1:    1.  2.  3.  4.  5.  6.

T2: 1.  2.     3.  4.  5.  6.

Here we’ve synchronized the timings starting at time 3. So now we can say that at time 1 of T1 we should get customers, and that at time 3 of T2 we should change email of the customer, but we also say that T2 should wait until T1 is also at time 3 before doing so.

This would be a blocking call for example. If you’re faster than the other, you wait for them to be done, so that you end up going at the same speed.

(let [customer (future (get-customer id))] ;; fetch on another thread
  (change-email @customer new-email))      ;; deref blocks until the future delivers

That’s the code for it. We’re still concurrent, but synchronous. We’ve synchronized the main thread and the future thread.

Asynchronous coordination means that they can continue to operate at different times; instead of synchronizing them, we will schedule what happens when, dynamically.

T1:    1.  2.  3.  4.  5.  6.

T2: 1.  2.  3.  4.  5.  6.

So we’ll say that T2 should change the email at any time after T1 is done getting the customer. We don’t know what time that will be, and in the meantime, we don’t know what else T2 will do. As you see, T1 and T2 are still out of sync, and asynchronous, but we still came up with a mechanism to enforce the proper order of dependent tasks.
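
In code, that kind of scheduling could look like the following core.async sketch. The channel wiring is my own; get-customer, change-email, and save-customer are the same hypothetical functions as in the earlier snippets:

(require '[clojure.core.async :refer [go chan >!! <!]])

(let [result (chan)]
  ;; T1: fetch the customer on its own timeline.
  (future (>!! result (get-customer id)))
  ;; T2: register what happens whenever the result arrives; the go block
  ;; parks rather than blocking, so T2 keeps running on its own timing.
  (go (-> (<! result)
          (change-email new-email)
          (save-customer))))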

P.S.: I’m saying time, but I’m using it conceptually. It might be best to think of it as a series of operations. And the question is: when does the next operation of T2 happen? Will it happen at the same time as the T1 operation finishes? If so, it is synchronous; if not, it is asynchronous.

It’s interesting to think of “place”. I hadn’t before. But I think I disagree, as my example before shows, getting the customer and changing the email happen on different threads, yet this is synchronous.

No, it’s not. Synchronized and synchronous aren’t the same thing. With the students example, asynchronous learning is still synchronized on the end of the semester when all the work needs to be done. That doesn’t make it synchronous learning.

In your example, each thread is synchronous but the system is asynchronous. It would be synchronous if the same thread did all the work, which isn’t the case.

We might have to agree to disagree on this one. It’s okay; this is all about taxonomies anyway, and I knew not everyone would agree with mine. We can each have the terminology, categorization, and hierarchies that we find most useful for our problem domains and mental models.

Personally, I like to distinguish between synchronous concurrent/parallel and asynchronous concurrent/parallel, where the former is blocking and the latter is non-blocking, and in both cases concurrency and/or parallelization is involved.

That said, I similarly distinguish synchronization and synchronous. You can synchronize the timings of events using a synchronous strategy, such as having one thread wait for another, or using an asynchronous strategy, such as having one thread pick up an event from another at an arbitrary point in the future.
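
To make the two strategies concrete, a small sketch (compute here is a hypothetical stand-in for real work):

(defn compute [] 42) ;; hypothetical stand-in for real work

;; Synchronous strategy: one thread waits for the other.
(let [p (promise)]
  (future (deliver p (compute)))
  @p) ;; deref blocks until the value is delivered

;; Asynchronous strategy: one thread picks the event up at an arbitrary
;; later point, e.g. by polling a queue.
(let [q (java.util.concurrent.ConcurrentLinkedQueue.)]
  (future (.offer q (compute)))
  ;; ... do other work in the meantime ...
  (.poll q)) ;; returns nil if the event hasn't arrived yet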

Hum… ok, but let me ask you a question about the way your taxonomy works. What would you consider a typical REST service using the one-thread-per-request model?

Is that an asynchronous service, using your definition? Each request runs on its own thread, and multiple requests are made to the service concurrently. Or maybe I’m just a bit confused, and you use asynchronous and concurrent as synonyms for one another?

Yes, the server is asynchronous even though the handlers are synchronous. If the server and handlers are both synchronous, only one request can be handled at a time.
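
A rough sketch of that model (handle-request is a hypothetical placeholder):

;; The accept loop never waits for a handler to finish, so the server as
;; a whole is asynchronous, even though each handler runs synchronously
;; on its own thread.
(with-open [server (java.net.ServerSocket. 8080)]
  (loop []
    (let [socket (.accept server)]       ;; wait for the next connection
      (future (handle-request socket)))  ;; handle it on a fresh thread
    (recur)))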

I’ve been thinking about where along the way the computer world conflated synchronous and sequential. One explanation might be that “at the same time” means “without giving up the CPU”. Another might be the definition of the adverb, synchronously, which can mean to do something at a known interval. The known interval being “when the previous instruction finishes”. I could see “the code runs synchronously, ergo it’s synchronous code”.

I’m not sure it’s relevant to this discussion, but I’ll mention that I think the futures in Java, which Clojure inherited, are pointless. They’re syntactic sugar around thread locking and a condition variable. Especially given that one of Clojure’s founding principles is to minimize or eliminate thread locks, it seems odd that futures got included.

The continuation-based versions, like promises, make more sense to me: no thread locking, no additional context switching. That seems to fit better with how Clojure does things.
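
To illustrate the contrast, a sketch using the same hypothetical get-customer and change-email from earlier, with Java’s CompletableFuture standing in for the continuation style:

;; Blocking style: deref parks the calling thread until the value is ready.
@(future (get-customer id))

;; Continuation style: the next step is attached as a callback, so no
;; thread sits blocked waiting for the result.
(-> (java.util.concurrent.CompletableFuture/supplyAsync
      (reify java.util.function.Supplier
        (get [_] (get-customer id))))
    (.thenApply (reify java.util.function.Function
                  (apply [_ customer] (change-email customer new-email)))))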

I did some research on this actually, here’s some of the best explanations I found:

Oddly enough, “synchronously” means “using the same clock”, so when two instructions are synchronous they use the same clock and must happen one after the other. “Asynchronous” means “not using the same clock”, so the instructions are not concerned with being in step with each other. That’s why it looks backwards: the term is not referring to the instructions’ relationship to each other; it’s referring to each instruction’s relationship to the clock.

This is the theory that the words were derived from digital circuits. In digital circuits, concurrent tasks are said to be coordinated synchronously when they are coordinated through a shared global clock. And they are said to be coordinated asynchronously when they are coordinated using signals that indicate completion of an operation.

I believe that the term was first used for synchronous vs. asynchronous communication. There synchronous means that the two communicating parts have a common clock signal that they run by, so they run in parallel. Asynchronous communication instead has a ready signal, so one part asks for data and gets a signal back when it’s available.

The terms were then adapted to processes, but as there are obvious differences, some aspects of the terms work differently. For a single-threaded process, the natural way to request that something be done is to make a synchronous call that transfers control to the subprocess; control is returned when it’s done, and the process continues.

An asynchronous call works just like asynchronous communication in the sense that you send a request for something to be done, and the process doing it returns a signal when it’s done. The difference in the usage of the terms is that for processes it’s the asynchronous processing where the processes run in parallel, while for communication it is the synchronous communication that runs in parallel.

This is similar to the digital-circuits explanation in a way, but here it is taken from communication systems instead, though they similarly use synchronous to mean common-global-clock synchronization and asynchronous to mean completion-signal based.

And finally, here is my favourite, which is in line with my definitions and interpretation of the terms I used prior:

It means that the two threads are not running in sync; that is, they are not both running on the same timeline. I think it’s a case of computer scientists being too clever about their use of words. Synchronisation, in this context, would suggest that both threads start and end at the same time. Asynchrony, in this sense, means both threads are free to start, execute, and end as they require.

This is how I take it, which is why I consider synchronous and asynchronous to both be strategies for concurrent synchronization. A concurrent system would simply be concurrent to me. And if it needs to coordinate some of the concurrent things it does, then depending on whether it employs a synchronous strategy or an asynchronous strategy is where I’ll choose if I call it synchronous or asynchronous. My interpretation actually fits somewhat with the circuit and communication ones as well, as I consider only systems that use completion signals to be asynchronous (like callbacks), basically any push-based model, like event-driven programming. The rest I consider synchronous strategies, and in my head the other approaches do end up synchronizing the threads along the same timeline, which is where I consider it a kind of global clock synchronization. Though I’m stretching it a bit here, because there actually also exists true global clock synchronization in software.

Another aspect that I read comes from systems engineering. There, synchronous coordination of concurrent tasks means deterministic: you can tell at exactly what point in the system one task will complete and the other start. Whereas asynchronous means non-deterministic coordination: you cannot tell when exactly something is going to start, pause, resume, and end. This also fits well with my interpretation.

One final thing I saw, but can’t remember where, is that “at the same time” refers to the entire task. So if a task starts and gets interrupted before it finishes, you can say it didn’t “execute” at the same time, because the first part of the task executed at, say, time X, and then the task finished executing later, after other tasks were picked up, maybe at some later time Y. Thus asynchronous means the task doesn’t execute from start to finish at the same time, and synchronous means the task executes from start to finish at the same time. I also personally like this explanation, so I thought it worth mentioning.

Hopefully that brings some more insight.

I’m not sure it’s relevant to this discussion, but I’ll mention that I think the futures in Java, which Clojure inherited, are pointless. They’re syntactic sugar around thread locking and a condition variable.

I don’t know why you say that’s useless? That’s exactly synchronous coordination :stuck_out_tongue_closed_eyes: And it’s how coordination has been handled for a very, very long time. The only reason not to use this model is that threads use a lot of memory, and in servers that do mostly IO, that memory adds up far too much, since modern IO can handle hundreds and even thousands of concurrent operations.

Once something like Loom lands, I believe most people will go back to synchronous models like Java’s futures, since you’ll get the same benefits of determinism at no memory overhead.


This is basically what I was trying to say earlier: sharing a clock (synchronicity) implies sequential, blocking processing; multiple clocks enable asynchronous coordination and thereby concurrency.
