Synchronous and Asynchronous Programming Paradigms
An algorithm is defined broadly in the Merriam-Webster Dictionary as a ‘step-by-step procedure for solving a problem or accomplishing some end’. Step-by-step is commonly understood to imply that the steps are executed sequentially: step 0 at time instant 0, step 1 at time instant 1, and so on.
Asynchronous programming can be hard to grasp at first because it breaks this assumption: there may be more than one line of execution running at the same time, which means step n and step n+1 of your algorithm may be executed at the very same instant t. The following figure presents an approximation of both the synchronous and asynchronous models. Note the performance gain obtained by implementing an asynchronous solution:

Figure 1.1: An oversimplified timeline comparison between synchronous and asynchronous solutions
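
To make the contrast in the figure concrete, here is a minimal sketch using Python’s asyncio. It is an illustration under assumptions, not a definitive implementation: the fetch_data functions and the 1-second delays are invented stand-ins for real blocking and non-blocking I/O.

```python
import asyncio
import time


def fetch_data_sync(task_id: int) -> str:
    """Simulate a blocking 1-second I/O operation."""
    time.sleep(1)
    return f"result {task_id}"


async def fetch_data_async(task_id: int) -> str:
    """Simulate a non-blocking 1-second I/O operation."""
    await asyncio.sleep(1)
    return f"result {task_id}"


def run_sync() -> None:
    start = time.perf_counter()
    # Synchronous model: each step waits for the previous one to finish
    results = [fetch_data_sync(i) for i in range(3)]
    print(f"sync:  {results} in {time.perf_counter() - start:.1f}s")  # ~3.0s


async def run_async() -> None:
    start = time.perf_counter()
    # Asynchronous model: all three steps are in flight during the same interval
    results = await asyncio.gather(*(fetch_data_async(i) for i in range(3)))
    print(f"async: {results} in {time.perf_counter() - start:.1f}s")  # ~1.0s


run_sync()
asyncio.run(run_async())
```

The asynchronous version finishes in roughly a third of the time because the waiting periods overlap, which is precisely the gain the figure suggests.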
The consequences of this are huge: with the right design, an algorithm can complete in dramatically less time, freeing up resources, and with them time and money, for programmers and companies alike.
Asynchronous programming poses a number of challenges that must be understood if we are to unlock the full potential of these new algorithms. (How do you split the work into tasks that can execute in parallel? What happens if one task ends before another? Who coordinates the tasks? A short sketch after the following list gives a first taste of these questions.) That’s why we start this book with a discussion of the core concepts that a developer must understand to get started:
- Synchronous and asynchronous programming
- Operating system processes and threads
- Green threads, coroutines and fibers
- Callbacks, promises and futures
- Challenges of asynchronous programming
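
As a first taste of those coordination questions, here is a hedged sketch, again using asyncio; the worker names and delays are purely illustrative assumptions:

```python
import asyncio


async def worker(name: str, delay: float) -> str:
    await asyncio.sleep(delay)  # stand-in for real work
    return name


async def main() -> None:
    # Splitting the work: two independent tasks scheduled on the event loop
    tasks = [asyncio.create_task(worker("fast", 0.1)),
             asyncio.create_task(worker("slow", 1.0))]
    # Coordination: react as soon as the first task finishes,
    # while the other is still running
    done, pending = await asyncio.wait(tasks, return_when=asyncio.FIRST_COMPLETED)
    print("finished first:", [task.result() for task in done])
    # One task ended before the other: decide what to do with the stragglers
    for task in pending:
        task.cancel()
    await asyncio.gather(*pending, return_exceptions=True)


asyncio.run(main())
```

Each of these decisions (how to split, when to react, whether to cancel) has to be made explicitly by the programmer, which is exactly why the concepts listed above matter.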