Concurrency vs Parallelism
Concurrency and parallelism are two concepts in computer science that are often used interchangeably but have distinct meanings. In essence, concurrency means multiple larger tasks are in progress at the same time, even though no two subtasks are ever processed at the same instant. Parallelism is when two or more subtasks are actually processed at the same instant.
Concurrency
Concurrency refers to the ability of a system to handle multiple tasks at the same time, seemingly in parallel. In reality, however, the tasks are not executed simultaneously; the system switches between them so quickly that it creates the illusion of parallelism. Concurrency is achieved by interleaving the operations of processes on a single processing unit, typically through context switching. It is used to decrease the response time of the system and increase the amount of work completed in a given period.
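The interleaving described above can be sketched with Python's asyncio event loop, where two tasks share a single thread and take turns each time one yields control (the task names and list are illustrative, not part of any real API beyond asyncio itself):

```python
import asyncio

async def work(name, log):
    # Each await hands control back to the event loop, so the two
    # tasks interleave on one thread instead of running simultaneously.
    for i in range(3):
        log.append(f"{name}-{i}")
        await asyncio.sleep(0)  # yield to the event loop

async def main():
    log = []
    # Both tasks are "in progress" together, but only one runs at a time.
    await asyncio.gather(work("A", log), work("B", log))
    return log

order = asyncio.run(main())
print(order)  # e.g. ['A-0', 'B-0', 'A-1', 'B-1', 'A-2', 'B-2']
```

The alternating output shows the single processing unit switching between tasks: neither task finishes before the other starts, yet no two steps execute at the same instant.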
Parallelism
Parallelism, on the other hand, involves the actual simultaneous execution of multiple tasks, or subtasks of the same task, on multiple processing units such as multiple processor cores or distributed systems. Parallelism is about executing tasks in parallel to increase throughput and computational speed, and it is essential for performance gains. It allows independent tasks of a program to run simultaneously on different processing units, leading to improved speed and efficiency.

In summary, concurrency creates the illusion of parallelism by interleaving tasks on a single processing unit, while parallelism involves the actual simultaneous execution of tasks on multiple processing units to increase throughput and computational speed.