Concurrency
Another concern that comes up in real-world software applications is concurrency. Concurrency is the ability of an application to handle multiple things at once: it is how your system performs computation or handles data for multiple requests or entities together.
Advantages
High Throughput: Completes a higher number of tasks in a given time compared to sequential execution.
Less waiting time: Increases the responsiveness of the system, since the system spends less time idle; the time otherwise spent waiting on IO operations is used to run other tasks (see the sketch after this list).
High CPU utilisation: Follows directly from the above, since the CPU stays busy with other tasks instead of idling during IO waits.
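To make the IO point above concrete, here is a minimal Java sketch (the class name OverlappedIo and the simulated fetch call are illustrative, not from the original text): three tasks that each block for about a second finish in roughly one second in total when run on a small thread pool, because their waits overlap instead of adding up.

```java
import java.util.List;
import java.util.concurrent.*;

public class OverlappedIo {
    // Simulate a blocking IO call (e.g. a remote API request) that takes ~1 second.
    static String fetch(String name) throws InterruptedException {
        Thread.sleep(1000);
        return name + " done";
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(3);
        long start = System.nanoTime();

        // Submit three "IO-bound" tasks; while one waits, the others make progress.
        List<Callable<String>> tasks = List.of(
                () -> fetch("task-1"),
                () -> fetch("task-2"),
                () -> fetch("task-3"));
        List<Future<String>> results = pool.invokeAll(tasks);

        for (Future<String> f : results) {
            System.out.println(f.get());
        }

        // Roughly ~1s in total instead of ~3s sequentially, because the waits overlap.
        System.out.printf("elapsed: %d ms%n", (System.nanoTime() - start) / 1_000_000);
        pool.shutdown();
    }
}
```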
Implementation
Let's talk about the different levels at which concurrency can be achieved.
OS Level: The OS achieves concurrency by maintaining process contexts, moving idle processes to the waiting state, and scheduling kernel threads across multiple CPU cores. The OS locks the shared resources it uses to avoid race conditions (the first sketch after this list shows kernel threads being spread across cores).
Programming language: Programming languages support concurrency through threads, locks, synchronization primitives, etc. Many languages expose Futures (e.g. Scala, Java, JS) and synchronized blocks to abstract this away from programmers and hide the underlying implementation (see the second sketch below).
Message Passing: Where explicit communication is required between concurrent components, messages are exchanged between them. This can be synchronous or asynchronous: synchronous communication blocks on the response from the receiver, while in asynchronous communication the sender doesn't wait for the response (see the final sketch below).
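First, a rough illustration of the OS level, assuming a JVM where each java.lang.Thread is a platform thread backed by a kernel thread (the default before virtual threads). The program below, with the illustrative class name KernelThreads, starts one CPU-bound worker per core and leaves it to the OS scheduler to decide which core runs each worker and when to preempt it.

```java
public class KernelThreads {
    public static void main(String[] args) throws InterruptedException {
        int cores = Runtime.getRuntime().availableProcessors();
        System.out.println("CPU cores visible to the OS: " + cores);

        Thread[] workers = new Thread[cores];
        for (int i = 0; i < cores; i++) {
            final int id = i;
            // Each platform Thread is backed by a kernel thread; the OS scheduler
            // decides which core runs it and preempts it when its time slice ends.
            workers[i] = new Thread(() -> {
                long sum = 0;
                for (long j = 0; j < 100_000_000L; j++) {
                    sum += j;                      // CPU-bound busy work
                }
                System.out.println("worker " + id + " finished, sum=" + sum);
            });
            workers[i].start();
        }
        for (Thread t : workers) {
            t.join();                              // wait for the OS to run them all
        }
    }
}
```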
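Next, a sketch of the language-level primitives mentioned above, using standard Java constructs: a synchronized block guards a shared counter against a race, and a CompletableFuture runs work on another thread while hiding the thread management from the caller. The class name LanguageLevel is illustrative.

```java
import java.util.concurrent.CompletableFuture;

public class LanguageLevel {
    private static int counter = 0;
    private static final Object lock = new Object();

    public static void main(String[] args) throws Exception {
        // Two threads updating shared state; the synchronized block prevents a race.
        Runnable increment = () -> {
            for (int i = 0; i < 10_000; i++) {
                synchronized (lock) {
                    counter++;
                }
            }
        };
        Thread t1 = new Thread(increment);
        Thread t2 = new Thread(increment);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        System.out.println("counter = " + counter);  // always 20000 thanks to the lock

        // A Future-style abstraction: the work runs on another thread, and the caller
        // composes the result without managing threads directly.
        CompletableFuture<Integer> future = CompletableFuture
                .supplyAsync(() -> 21)
                .thenApply(n -> n * 2);
        System.out.println("future result = " + future.get());
    }
}
```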
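Finally, a sketch of message passing between two threads using a java.util.concurrent.BlockingQueue as the mailbox (the class name MessagePassing and the STOP sentinel are illustrative). The send here is asynchronous: put() returns once the message is queued, without waiting for the receiver to process it; a synchronous exchange would additionally block the sender until a reply arrives.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class MessagePassing {
    public static void main(String[] args) throws InterruptedException {
        // A bounded queue acts as the mailbox between the two threads.
        BlockingQueue<String> mailbox = new ArrayBlockingQueue<>(10);

        Thread receiver = new Thread(() -> {
            try {
                while (true) {
                    String msg = mailbox.take();     // blocks until a message arrives
                    if (msg.equals("STOP")) break;
                    System.out.println("received: " + msg);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        receiver.start();

        // Asynchronous send: put() returns as soon as the message is queued,
        // so the sender does not wait for the receiver to process it.
        for (int i = 1; i <= 3; i++) {
            mailbox.put("message-" + i);
            System.out.println("sent message-" + i);
        }
        mailbox.put("STOP");
        receiver.join();
        // A synchronous exchange would instead have the sender block on a reply,
        // e.g. by waiting on a second queue (or a Future) for the response.
    }
}
```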
For More Reading