Concurrency refers to the ability of different parts or units of a program, algorithm, or problem to be executed out-of-order or in partial order without affecting the final outcome. This allows the concurrent units to execute in parallel, which can significantly improve overall execution speed on multi-processor and multi-core systems. 1
To run a program, the operating system allocates memory and creates a process. A process has a stack, a heap, a data segment and a text segment in user space. The CPU context and the file descriptor table are kept in kernel space. As the program executes, the program counter (PC) jumps around in the text segment.
A process can have one or more threads of execution. Threads share the heap, data segment, text segment and file descriptor table, but each thread has its own stack and CPU context. Each thread therefore has a private program counter, and threads execute concurrently. Depending on how threads are implemented, the CPU contexts can be stored in user space or in kernel space. The figure below shows a process with three threads.
If not explicitly stated otherwise, the term task will be used to refer to a concurrent unit of execution such as a thread or a process.
In concurrent programming, an operation (or set of operations) is atomic if it appears to the rest of the system to occur at once without being interrupted. Other words used synonymously with atomic are: linearizable, indivisible or uninterruptible. 2
Incrementing and decrementing a variable are examples of non-atomic operations. The following C expression,

X++;

is translated by the compiler to three instructions:

1. Load the value of X from memory into a CPU register.
2. Increment X and save the result in a CPU register.
3. Store the new value of X back to memory.

Similarly, the following C expression,
X--;
is translated by the compiler to three instructions:

1. Load the value of X from memory into a CPU register.
2. Decrement X and save the result in a CPU register.
3. Store the new value of X back to memory.

A race condition or race hazard is the behaviour of an electronic, software or other system where the output is dependent on the sequence or timing of other uncontrollable events. It becomes a bug when events do not happen in the intended order. The term originates with the idea of two signals racing each other to influence the output first. 3
A data race occurs when two instructions from different threads access the same memory location and:

1. at least one of the accesses is a write, and
2. no synchronization mechanism orders the two accesses.
Concurrent accesses to shared resources can lead to unexpected or erroneous behaviour, so the parts of the program where the shared resource is accessed must be protected. This protected section is called the critical section or critical region. Typically, the critical section accesses a shared resource, such as a data structure, a peripheral device, or a network connection, that would not operate correctly in the context of multiple concurrent accesses. 4
In computer science, mutual exclusion is a property of concurrency control, which is instituted for the purpose of preventing race conditions; it is the requirement that one thread of execution never enter its critical section at the same time that another concurrent thread of execution enters its own critical section. 5
Often the word mutex is used as a short form of mutual exclusion; it is also the common name for the lock object that enforces this property.