Concurrency

Another concern that arises in real-world software applications is concurrency. Concurrency is the ability of an application to handle multiple things at once: it is how your system performs computation or handles data for multiple requests or entities together.

Concurrency vs Parallelism: Concurrency is handling multiple things at once; tasks may be interleaved or complete out of order, but the system makes progress on several of them together. Parallelism is the ability of a system to actually perform multiple things at the same time; in other words, the system literally computing multiple tasks simultaneously (for example, on multiple CPU cores) is parallelism.
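The distinction can be sketched in Java using executors (the class and task names here are illustrative, not from any particular codebase). A single-threaded executor gives concurrency without parallelism: both tasks are accepted together but interleaved on one thread. A pool sized to the CPU count can actually run tasks simultaneously.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ConcurrencyVsParallelism {
    public static void main(String[] args) throws Exception {
        // Concurrency: one thread interleaves the tasks; they make
        // progress "together" but never run at the same instant.
        ExecutorService single = Executors.newSingleThreadExecutor();
        Future<String> a = single.submit(() -> "task A done");
        Future<String> b = single.submit(() -> "task B done");
        System.out.println(a.get() + ", " + b.get());
        single.shutdown();

        // Parallelism: a pool with one thread per core can actually
        // execute tasks simultaneously on different cores.
        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(cores);
        Future<Integer> x = pool.submit(() -> 21 + 21);
        Future<Integer> y = pool.submit(() -> 6 * 7);
        System.out.println(x.get() + y.get()); // prints 84
        pool.shutdown();
    }
}
```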

Advantages

  • High Throughput: Completes a higher number of tasks in a given time, as compared to sequential execution.

Throughput: Throughput is the rate at which tasks are completed in a fixed amount of time. Many tasks involve waiting, such as waiting on IO resources or waiting for a response from another service. Using this waiting time to execute other tasks, instead of sitting idle, lets the system make progress on multiple tasks together, and hence increases throughput.
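A minimal sketch of this idea in Java: two simulated IO calls (the `fetch` helper below is a stand-in that just sleeps) are started as `CompletableFuture`s, so their waiting time overlaps and the total elapsed time is roughly one delay rather than the sum of both.

```java
import java.util.concurrent.CompletableFuture;

public class ThroughputDemo {
    // Simulated IO call: sleeps to mimic waiting on a network response.
    static String fetch(String name, long millis) {
        try {
            Thread.sleep(millis);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return name + " done";
    }

    public static void main(String[] args) {
        long start = System.nanoTime();
        // Both "requests" wait concurrently, so total time is ~200 ms,
        // not ~400 ms as it would be if run sequentially.
        CompletableFuture<String> f1 =
                CompletableFuture.supplyAsync(() -> fetch("req1", 200));
        CompletableFuture<String> f2 =
                CompletableFuture.supplyAsync(() -> fetch("req2", 200));
        System.out.println(f1.join() + ", " + f2.join());
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        System.out.println("elapsed ~" + elapsedMs + " ms");
    }
}
```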

  • Less waiting time: Increases the responsiveness of the system, since the system is idle less often; the time one task spends waiting on IO is used to run other tasks.

  • High CPU utilisation: Follows directly from the above; the CPU spends less time sitting idle.

Implementation

Let's talk about the different levels at which concurrency can be achieved.

  • OS Level: The OS achieves concurrency by maintaining per-process contexts, moving idle processes to the waiting state, and scheduling kernel threads across multiple CPU cores. The OS locks shared resources to avoid race conditions.

Race Condition: A race condition occurs when state is shared by two or more threads and they try to change it at the same time. This can cause unpredictable behaviour, since the relative timing of their execution is uncertain. Race conditions can be avoided by properly synchronizing the critical sections of an application.
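A classic way to see a race condition (this demo class is illustrative): two threads increment a shared counter without synchronization. `counter++` is a read-modify-write, not an atomic operation, so updates from one thread can overwrite updates from the other, and the final value is typically less than expected. The exact result varies from run to run.

```java
public class RaceDemo {
    static int counter = 0; // shared state, no synchronization

    public static void main(String[] args) throws InterruptedException {
        Runnable increment = () -> {
            for (int i = 0; i < 100_000; i++) {
                counter++; // read-modify-write: not atomic
            }
        };
        Thread t1 = new Thread(increment);
        Thread t2 = new Thread(increment);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        // Expected 200000, but lost updates usually make it smaller.
        System.out.println("counter = " + counter);
    }
}
```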

  • Programming language: Programming languages support concurrency via threads, locks, synchronization primitives, etc. Many languages use Futures (e.g. Scala, Java, JavaScript Promises) and synchronized blocks to abstract this away from programmers and hide the underlying implementation.

Synchronized: When multiple threads access the same resource, they can produce erroneous and unforeseen results. Marking a block as synchronized ensures that only one thread executes it at a time, blocking the remaining threads until it finishes.
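A minimal sketch of the fix, mirroring the shared-counter scenario: wrapping the increment in a synchronized block makes it mutually exclusive, so every update is preserved and the result is deterministic.

```java
public class SynchronizedDemo {
    static int counter = 0;
    static final Object lock = new Object();

    public static void main(String[] args) throws InterruptedException {
        Runnable increment = () -> {
            for (int i = 0; i < 100_000; i++) {
                synchronized (lock) { // only one thread at a time in here
                    counter++;
                }
            }
        };
        Thread t1 = new Thread(increment);
        Thread t2 = new Thread(increment);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        System.out.println("counter = " + counter); // always 200000
    }
}
```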

  • Message Passing: In cases where explicit communication is required between concurrent components, messages are exchanged between them. This can be synchronous or asynchronous: in synchronous communication the sender blocks until the receiver responds, while in asynchronous communication the sender does not wait for a response.
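One common way to implement message passing between threads in Java is a `BlockingQueue` acting as a mailbox (the demo class below is a sketch of this pattern). The receiver's `take()` blocks until a message arrives; the sender's `put()` returns as soon as the message is enqueued, without waiting for the receiver to process it, which is the asynchronous style described above.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class MessagePassingDemo {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> mailbox = new ArrayBlockingQueue<>(10);

        // Receiver: take() blocks until a message is available.
        Thread receiver = new Thread(() -> {
            try {
                String msg = mailbox.take();
                System.out.println("received: " + msg);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        receiver.start();

        // Sender: put() enqueues the message and returns immediately;
        // it does not wait for the receiver to process it.
        mailbox.put("hello");
        receiver.join(); // prints "received: hello"
    }
}
```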
