Abstract:
Parallel computing is a form of computation in which many calculations are carried out simultaneously. It is based on the principle that a large problem can often be divided into smaller ones, whose solutions are then combined to obtain the solution to the initial problem. There are several forms of parallel computing: bit-level, instruction-level, data, and task parallelism. Parallelism has long been used mainly in high-performance computing, but it has become much more important in recent years because of the physical constraints that prevent further frequency scaling. Parallel programs are more difficult to write than sequential ones, because parallelism introduces new classes of software bugs, of which race conditions are the most common. Communication and synchronization between the different tasks therefore become critical to achieving any performance gain.
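
As a minimal illustrative sketch (not drawn from any specific system described here), the C++ program below splits a large summation into per-thread subproblems and then combines the partial results. Giving each thread its own output slot is one simple way to avoid the race condition that would arise if all threads updated a single shared total without synchronization.

    // Divide-and-combine sketch: sum a large array with several threads.
    #include <iostream>
    #include <numeric>
    #include <thread>
    #include <vector>

    int main() {
        const std::size_t n = 1000000;
        std::vector<int> data(n, 1);               // the "large problem": sum n elements

        const unsigned workers = 4;
        std::vector<long long> partial(workers, 0); // one result slot per thread
        std::vector<std::thread> threads;

        const std::size_t chunk = n / workers;
        for (unsigned w = 0; w < workers; ++w) {
            std::size_t begin = w * chunk;
            std::size_t end = (w == workers - 1) ? n : begin + chunk;
            // Each thread solves one smaller problem and writes only to its own
            // slot, so no two threads touch the same memory location.
            threads.emplace_back([&, begin, end, w] {
                partial[w] = std::accumulate(data.begin() + begin,
                                             data.begin() + end, 0LL);
            });
        }
        for (auto& t : threads) t.join();           // wait for all subproblems

        // Combine the partial solutions into the solution of the initial problem.
        long long total = std::accumulate(partial.begin(), partial.end(), 0LL);
        std::cout << "total = " << total << '\n';
        return 0;
    }

This example also illustrates the synchronization point mentioned above: the joins ensure every subproblem is finished before the partial results are combined.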