HPC stands for high performance computing, which brings together multiple computers, software, and expertise to solve problems too difficult to solve by other means within an acceptable time frame. HPC emerged from the older concept of supercomputing: a supercomputer is one kind of high performance computer, and supercomputing is a subset of HPC.

HPC describes a level of performance; it is not a particular computer. The need for HPC arose as the pace of development demanded ever more computing power for ever more demanding problems. Even a machine most of us consider powerful, say a multi-core, 64-bit system with 32 GB of RAM, can execute only one routine, one problem-solving task, in a given period of time. The basic limitation of this kind of sequential processing is physical: the speed of a single processor and the capacity of storage media can only be pushed so far.

The alternative is to move to parallel processing: using many relatively fast, cheap processors at the same time. This means the concurrent use of multiple processors to work through voluminous data, either by running the same program on each processor or by running different programs on different processors. With this approach you can plausibly gain on the order of 100x performance. This is the line of development that took supercomputing toward parallel supercomputers and eventually into HPC, which uses anything from a small cluster of PCs to a full supercomputer, revolutionizing science and changing the outlook of researchers in their respective fields.
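The "same program on each processor" style of parallelism described above can be sketched in a few lines. This is a minimal illustration, not an HPC framework: the sum-of-squares task, the worker count, and the chunking scheme are all assumptions chosen for clarity.

```python
# Data-parallel sketch: split the input into chunks, run the same
# function on each chunk in a separate worker process, then combine
# the partial results into one answer.
from multiprocessing import Pool

def partial_sum(chunk):
    # Each worker runs the same program on its own slice of the data.
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4):
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with Pool(workers) as pool:
        # Scatter the chunks, compute in parallel, gather and combine.
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    data = list(range(1_000_000))
    print(parallel_sum_of_squares(data))
```

On a single multi-core machine this uses processes rather than separate computers, but the pattern, scatter the work, compute concurrently, gather the results, is the same one HPC clusters apply at much larger scale.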

Applications in biochemistry, chemistry, physics, environmental modeling, and industrial products and services generally deal with very large data sets and require phenomenal archival storage facilities. From KB to MB to GB to TB, we have now moved to PB (petabytes, millions of gigabytes). HPC is about the only approach that provides a platform with both the computing power and the storage needed to analyze such voluminous data when evaluating products under development. More recently, HPC has also been applied to business workloads such as data warehouses and transaction processing.
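The scale jump mentioned above is easy to verify with a little arithmetic. Using decimal (SI) units, where each step up is a factor of 1000, a petabyte works out to exactly one million gigabytes:

```python
# Decimal storage units: KB = 1000 bytes, MB = 1000 KB, and so on.
units = ["KB", "MB", "GB", "TB", "PB"]
bytes_per = {u: 1000 ** (i + 1) for i, u in enumerate(units)}

# A petabyte is a million gigabytes.
print(bytes_per["PB"] // bytes_per["GB"])  # → 1000000
```

Note that binary units (KiB = 1024 bytes, etc.) give slightly different numbers; the "millions of gigabytes" figure in the text corresponds to the decimal convention.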

HPC is essentially computer technology focused on the development of supercomputers and of software built on parallel processing algorithms, for both scientific and commercial use. It works by having multiple processors simultaneously execute a program that has been divided into many small pieces, then combining the partial outputs into a single unified result in a much shorter time frame.
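The divide/compute/combine cycle described above can be made concrete with a small map-reduce-style sketch. The word-count task, the number of pieces, and the line-based splitting are illustrative assumptions, not anything prescribed by the text.

```python
# Divide a job into small pieces, process each piece on a separate
# worker, and merge the partial results into one unified answer.
from collections import Counter
from concurrent.futures import ProcessPoolExecutor

def count_words(piece):
    # Each worker handles one small piece of the overall job.
    return Counter(piece.split())

def parallel_word_count(text, pieces=4):
    lines = text.splitlines()
    step = max(1, len(lines) // pieces)
    parts = ["\n".join(lines[i:i + step]) for i in range(0, len(lines), step)]
    total = Counter()
    with ProcessPoolExecutor() as pool:
        for partial in pool.map(count_words, parts):
            total.update(partial)  # combine partial results
    return total

if __name__ == "__main__":
    sample = "hpc makes big jobs fast\nhpc splits big jobs\n" * 2
    print(parallel_word_count(sample)["hpc"])
```

The combine step here is a simple counter merge; in real HPC codes it might be a numerical reduction, a gather over a network, or a write to shared storage, but the structure of the computation is the same.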

Operating systems and the open source environment play a very large role in the development of HPC worldwide. Naturally, the 100x promise of HPC is most compelling when it can be realized with low-cost hardware and free, open source operating systems.