Concurrency in an operating system refers to the execution of multiple programs or processes during overlapping time periods. It takes place when the OS interleaves, or on multiprocessor hardware overlaps, the execution of several processes, giving the impression of simultaneous computation even on a single processor.
When processes run concurrently they share resources, which can lead to problems such as starvation and deadlock. To manage this, the operating system provides mechanisms for memory allocation, process execution, and scheduling that aim to maximize throughput.
What is Concurrency in an Operating System?
Concurrency in operating systems refers to the ability of the system to manage multiple operations or processes at the same time. It involves executing several instruction sequences in overlapping time periods rather than strictly one after another.
Concurrency aims to maximize the utilization of computing resources by ensuring that these resources are not left idle while there are tasks waiting to be executed. This is a fundamental aspect of modern operating systems, enabling them to handle various tasks such as running applications, executing background services, and responding to user input concurrently.
Principles of Concurrency
1. Process Interaction
Concurrent processes can either run independently or interact with each other. Interaction between processes can occur through shared data or through message passing. Managing this interaction carefully is crucial to avoid errors and ensure data consistency.
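For instance, message passing can be modeled with a thread-safe queue, where the queue is the only point of contact between the two activities. A minimal Python sketch (the producer/consumer names and the sentinel convention are illustrative choices, not a fixed API):

```python
import threading
import queue

# The queue is the only shared object between the two activities, so no
# extra locking is needed: coordination happens purely by message passing.
messages = queue.Queue()

def producer():
    for i in range(5):
        messages.put(f"message {i}")   # send a message
    messages.put(None)                 # sentinel: no more messages

def consumer():
    while True:
        msg = messages.get()           # receive (blocks until one arrives)
        if msg is None:
            break
        print(f"received: {msg}")

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
```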
2. Shared Resources
Concurrency involves sharing resources among multiple processes. Resources can include CPU time, memory, files, and I/O devices. The operating system must ensure that shared resources are accessed in an orderly manner, preventing conflicts and ensuring fairness among processes.
3. Synchronization
Synchronization mechanisms are used to coordinate the execution of concurrent processes. This includes ensuring that processes execute critical sections of code that access shared resources in a mutually exclusive manner. Common synchronization tools include mutexes, semaphores, and monitors.
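As a small illustration, a mutex can serialize a read-modify-write on a shared counter so that the critical section executes atomically from each thread's point of view. A minimal Python sketch using threading.Lock (the worker function and iteration counts are arbitrary):

```python
import threading

counter = 0
lock = threading.Lock()   # mutex guarding the shared counter

def worker():
    global counter
    for _ in range(100_000):
        with lock:        # enter the critical section; others must wait
            counter += 1  # read-modify-write, now mutually exclusive

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # always 400000, because every increment is protected
```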
4. Deadlock and Starvation
Concurrency can lead to deadlock, in which a set of processes is blocked because each one waits for a resource held by another, so none of them can proceed. Starvation occurs when a process is perpetually denied the resources it needs to make progress. Operating systems implement various algorithms to prevent or resolve deadlocks and to avoid starvation.
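A common prevention technique is resource ordering: if every process acquires locks in the same global order, a circular wait can never form. A hedged Python sketch of the idea (lock names and loop counts are illustrative):

```python
import threading

lock_a = threading.Lock()
lock_b = threading.Lock()

def worker():
    for _ in range(10_000):
        # Every thread takes the locks in the same global order (A, then B),
        # so a circular wait, and therefore deadlock, can never form.
        with lock_a:
            with lock_b:
                pass  # critical section that needs both resources

threads = [threading.Thread(target=worker) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("finished without deadlock")
```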
5. Concurrency Control in DBMS
In the context of database systems, concurrency control is crucial to ensure the integrity of data when multiple transactions occur simultaneously. Techniques such as locking, timestamp ordering, and optimistic concurrency control are employed to manage concurrent access to the database.
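As a simplified illustration of the optimistic approach, a transaction can remember the version of a record it read and commit only if that version is unchanged, retrying otherwise. The sketch below is a toy in-memory model, not how a real DBMS implements it (VersionedRecord and try_commit are invented names):

```python
import threading

class VersionedRecord:
    """Toy optimistic concurrency control: a commit succeeds only if the
    record's version is unchanged since the transaction read it."""
    def __init__(self, value):
        self.value = value
        self.version = 0
        self._lock = threading.Lock()  # protects only the commit check

    def read(self):
        return self.value, self.version

    def try_commit(self, new_value, read_version):
        with self._lock:
            if self.version != read_version:
                return False           # conflict: another commit won
            self.value = new_value
            self.version += 1
            return True

record = VersionedRecord(100)
value, version = record.read()
# ... compute optimistically, without holding any database lock ...
if not record.try_commit(value + 1, version):
    print("conflict detected, transaction must retry")
```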
6. Process and Thread Management
Modern operating systems support not only processes but also threads, which are lightweight processes that share the same address space. Managing threads involves less overhead than managing processes, allowing for more efficient concurrency on multicore systems.
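The shared address space is easy to observe: an object mutated by one thread is immediately visible to every other thread in the same process. A minimal Python sketch:

```python
import threading

shared = {"status": "pending"}  # lives in the address space all threads share

def updater():
    shared["status"] = "done"   # this mutation is visible to every thread

t = threading.Thread(target=updater)
t.start()
t.join()
print(shared["status"])  # prints "done": the main thread sees the update
```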
7. Parallelism vs. Concurrency
While related, parallelism and concurrency are distinct concepts. Parallelism involves performing multiple operations at the exact same time, often using multicore processors. Concurrency, on the other hand, focuses on managing multiple operations at overlapping times, even if not all operations are being executed simultaneously.
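With CPython this distinction can even be observed directly: for CPU-bound work, threads only interleave under the global interpreter lock (concurrency), while separate processes can run on different cores (parallelism). A sketch whose timings will vary by machine (burn and the work sizes are arbitrary):

```python
import time
from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

def burn(n):
    """CPU-bound work: pure computation, no I/O."""
    total = 0
    for i in range(n):
        total += i * i
    return total

def timed(executor_cls, label):
    start = time.perf_counter()
    with executor_cls(max_workers=4) as ex:
        list(ex.map(burn, [2_000_000] * 4))
    print(f"{label}: {time.perf_counter() - start:.2f}s")

if __name__ == "__main__":
    # Threads interleave on one core at a time under the GIL:
    # concurrency without parallelism for CPU-bound work.
    timed(ThreadPoolExecutor, "threads (concurrent)")
    # Processes run on separate cores: true parallelism, usually
    # noticeably faster here on a multicore machine.
    timed(ProcessPoolExecutor, "processes (parallel)")
```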
Problems in Concurrency
1. Race Conditions
A race condition occurs when the outcome of a process depends on the sequence or timing of uncontrollable events. It happens when multiple processes or threads access shared data concurrently, and the final result depends on the order of execution. This can lead to unpredictable results and bugs that are hard to replicate and fix.
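The textbook case is several threads incrementing a shared counter without a lock: each increment is a read, a computation, and a write, and updates are lost when these steps interleave. In the sketch below, time.sleep(0) is inserted purely to make the unlucky interleaving easy to reproduce; compare the locked version shown under Synchronization above:

```python
import threading
import time

counter = 0

def worker():
    global counter
    for _ in range(1000):
        current = counter      # read the shared value
        time.sleep(0)          # yield, inviting another thread in mid-update
        counter = current + 1  # write back, clobbering concurrent updates

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# 4 threads x 1000 increments should give 4000, but lost updates make
# the printed value smaller, and different on every run.
print(counter)
```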
2. Deadlocks
Deadlocks happen when two or more processes are each waiting for another to release a resource that they hold, creating a cycle of dependencies that prevents any of them from proceeding. This situation leads to a standstill, where processes can’t move forward with their tasks.
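The smallest example needs two locks acquired in opposite orders by two threads. In the sketch below, acquire timeouts are used only so the demonstration terminates and reports the circular wait instead of hanging forever:

```python
import threading
import time

lock_a = threading.Lock()
lock_b = threading.Lock()

def worker_1():
    with lock_a:                          # holds A ...
        time.sleep(0.1)                   # let worker_2 grab B meanwhile
        if lock_b.acquire(timeout=1):     # ... then waits for B
            lock_b.release()
        else:
            print("worker_1 stuck: B is held by worker_2, which wants A")

def worker_2():
    with lock_b:                          # holds B ...
        time.sleep(0.1)                   # let worker_1 grab A meanwhile
        if lock_a.acquire(timeout=1):     # ... then waits for A
            lock_a.release()
        else:
            print("worker_2 stuck: A is held by worker_1, which wants B")

t1 = threading.Thread(target=worker_1)
t2 = threading.Thread(target=worker_2)
t1.start(); t2.start()
t1.join(); t2.join()
```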
3. Starvation
Starvation occurs when a process or thread is perpetually denied the resources it needs to proceed, often because other processes are continuously being prioritized over it. This can lead to situations where some processes are never able to complete their tasks, affecting the system’s fairness and efficiency.
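Starvation can be sketched with one thread that holds a lock almost continuously while another keeps requesting it. Python's locks make no fairness guarantee, so on most platforms the waiter is granted the lock only rarely; the exact counts will vary by machine:

```python
import threading
import time

lock = threading.Lock()
stop = False
counts = {"greedy": 0, "starved": 0}

def greedy():
    # Holds the lock almost continuously, releasing it only for an
    # instant before grabbing it again.
    while not stop:
        with lock:
            counts["greedy"] += 1
            time.sleep(0.01)

def starved():
    # Keeps requesting the same lock, but is almost never granted it.
    while not stop:
        with lock:
            counts["starved"] += 1
        time.sleep(0.001)  # does a little other work between attempts

t1 = threading.Thread(target=greedy)
t2 = threading.Thread(target=starved)
t1.start(); t2.start()
time.sleep(1)
stop = True
t1.join(); t2.join()
print(counts)  # on most runs, "starved" is a tiny fraction of "greedy"
```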
4. Livelock
Livelock is a situation where two or more processes continually change their state in response to changes in the other processes without making any progress. It’s similar to deadlock in that no progress is made, but the processes are not blocked — they are simply too busy responding to each other to do actual work.
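The classic picture is two diners who each pick up one fork, notice the other fork is taken, politely put theirs back, and retry, forever, if they stay in lockstep. The sketch below uses a barrier to force that lockstep deterministically and bounds the retries so the demo terminates (diner names and attempt counts are illustrative):

```python
import threading

fork_a = threading.Lock()
fork_b = threading.Lock()
barrier = threading.Barrier(2)  # keeps the two diners perfectly in step

def polite_diner(first, second, name, attempts=5):
    ate = False
    for _ in range(attempts):
        barrier.wait()                      # both act at the same moment
        first.acquire()                     # pick up own fork
        barrier.wait()                      # by now the other fork is taken
        if second.acquire(blocking=False):  # always fails in this lockstep
            ate = True                      # (never reached here)
            second.release()
        first.release()                     # politely put the fork back
    print(f"{name}: {'ate' if ate else 'never ate: livelock'}")

t1 = threading.Thread(target=polite_diner, args=(fork_a, fork_b, "diner 1"))
t2 = threading.Thread(target=polite_diner, args=(fork_b, fork_a, "diner 2"))
t1.start(); t2.start()
t1.join(); t2.join()
```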
5. Priority Inversion
Priority inversion happens when a higher-priority process is forced to wait because a lower-priority process holds a resource it needs. The problem becomes severe if a medium-priority process preempts the lower-priority process, indirectly preventing the high-priority process from proceeding.
6. Threading Issues
Managing multiple threads within the same application can introduce several issues, such as difficulty in ensuring that threads are properly synchronized, challenges in debugging and maintaining thread-safe code, and the overhead associated with context switching between threads.
Advantages of Concurrency
- It allows multiple applications to run at the same time.
- It improves overall utilization of the operating system's resources.
- It provides a better average response time.
- It lets resources left idle by one application be used by another.