Concurrency is not free!
One real cost of concurrency is managing the lifecycle of threads. The OS does the dirty work of creating and managing threads on behalf of your platform (or runtime environment), and there are limits on how many threads can be created at the system level. So careful thought should be given to how many threads are actually required to accomplish a job.
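As a sketch of that "careful thought": for CPU-bound work, a common starting point is to size the pool to the hardware rather than the task count. This uses the standard `java.util.concurrent` APIs; the choice of 100 tasks is just for illustration.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class CoreSizedPool {
    public static void main(String[] args) throws InterruptedException {
        // Size the pool to the hardware, not to the task count:
        // 100 queued tasks share this small, fixed set of threads.
        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(cores);

        AtomicInteger completed = new AtomicInteger();
        for (int i = 0; i < 100; i++) {
            pool.execute(() -> completed.incrementAndGet());
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
        System.out.println(completed.get()); // 100 — all tasks ran on a few threads
    }
}
```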
Don't blindly decide to execute tasks concurrently!
Modern libraries provide wonderful abstractions for programmers, so running a task concurrently or asynchronously is trivial. It is as simple as instantiating an object and calling a few methods on it, and you are done! These libraries are abstracted so well that they don't even remind programmers that threads are involved. And this is where the lazy programmer can take things for granted.
You need to process 100 tasks? Create 50 threads.

Collection<Task> tasks = fetchTasks(); // from somewhere
int numberOfThreads = 50;
obj.executeConcurrently(tasks, numberOfThreads);

In the object-oriented world, all it takes is a method call.
To understand the cost of concurrency, let's take a step back and ask how it is implemented. It is implemented through locks. Locks provide mutual exclusion and ensure that changes become visible to other threads in an ordered manner.
Locks are expensive because they require arbitration when contended. This arbitration is achieved by a context switch at the OS level, which suspends threads waiting for the lock until it is released. A context switch carries a performance penalty: the OS may decide to do some other housekeeping job in the meantime, so the cached instructions and data are lost. This is even more pronounced on multicore CPUs, where each core has its own cache. In the worst case, contention can cause latency comparable to that of an I/O operation.
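A minimal demonstration of such arbitration: every increment below must acquire the same monitor, so with several threads hammering it, losers are suspended and resumed by the OS. The result is correct, but each contended acquisition pays the arbitration cost described above (thread and iteration counts are arbitrary).

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ContendedCounter {
    private long value = 0;

    // All threads compete for this object's monitor; under contention the
    // OS must suspend and later wake the waiting threads (context switches).
    public synchronized void increment() { value++; }
    public synchronized long get() { return value; }

    public static void main(String[] args) throws InterruptedException {
        ContendedCounter counter = new ContendedCounter();
        ExecutorService pool = Executors.newFixedThreadPool(8);
        for (int t = 0; t < 8; t++) {
            pool.execute(() -> {
                for (int i = 0; i < 100_000; i++) counter.increment();
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
        System.out.println(counter.get()); // 800000 — correct, but every
                                           // increment paid for lock arbitration
    }
}
```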