You have probably come across the acronym HPC in hardware news, especially in stories about supercomputers and data centers. But what does HPC actually mean, and what separates these systems from conventional ones? We explain it to you in detail.
Even though your PC may seem to you to be the most powerful in the world, even one with a Core i9 or a high-end Ryzen 9 paired with high-performance graphics, there are problems that a conventional PC cannot solve and that nevertheless need to be solved, both for everyday applications and for technological and social progress. This is where HPC systems come in.
What does HPC mean?
The acronym HPC is short for High Performance Computing, and under this umbrella term a whole world of computing is included. Although it is often used as a synonym for supercomputer, we usually apply the name HPC to systems made up of clusters of processors, deployed in large numbers and working in parallel to solve complex problems that demand enormous speed.
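A real HPC cluster coordinates thousands of nodes over a fast network, which a desktop cannot reproduce, but the underlying idea of splitting one large problem into pieces that are solved in parallel can be sketched on a single machine. A minimal Python illustration (the problem itself, summing squares, is just a stand-in for a real workload):

```python
from multiprocessing import Pool

def partial_sum(bounds):
    # Each worker solves one slice of the overall problem independently.
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    n, workers = 10_000_000, 4
    step = n // workers
    # Split the range [0, n) into equal chunks, one per worker.
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    with Pool(workers) as pool:
        # Combine the partial results into the final answer.
        total = sum(pool.map(partial_sum, chunks))
    print(total)
```

An HPC cluster does the same thing at a vastly larger scale: the work is divided among many machines, each computes its part, and the results are combined.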
HPC has a wide range of uses, although outside the consumer market: today it is employed for jobs such as developing new drugs, automated stock trading, and building simulations for future self-driving cars. In that sense, it is where the solutions we will eventually see on our own PCs are being developed.
Two examples of its use
The first example is self-driving cars, which need to process a huge amount of information from radar sensors and GPS systems in real time in order to make critical decisions in milliseconds. And for the power of an HPC system or supercomputer to end up in the hands of a home user typically takes more than a decade. Now consider that such a system would have to account for all surrounding traffic, and you will see how far we still are from traffic being fully automated.
HPC is also used for urban simulations, which model the daily flow of movement through a city to find out whether a planned construction project or remodeling will disrupt citizens' everyday lives. In general, then, these systems solve problems whose data volume and speed requirements leave a conventional PC falling short.
Processors and special units
The capacity to represent data in computing is limited by the number of bits used: the more bits, the greater the precision. This is especially important in the scientific and engineering world, where a wrongly rounded decimal can cause a fatal error, ruin an experiment, or invalidate a simulation entirely. That is why HPC performance is usually measured by the ability to move and operate on data of a specific type: the 64-bit, or double-precision, floating-point number (FP64).
In home systems, apart from memory addressing, 32-bit arithmetic is usually enough, since it handles average tasks well, and moving to 64-bit roughly doubles the number of transistors the processor needs, both in its execution units and in its registers.
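The precision gap between the two formats is easy to see in practice. The following Python sketch (using NumPy only to force the storage types) accumulates the same value a million times in FP32 and FP64; the 32-bit total drifts visibly away from the exact answer, while the 64-bit total stays essentially exact:

```python
import numpy as np

# FP32 keeps roughly 7 significant decimal digits, FP64 roughly 16.
# Adding 0.1 a million times should give exactly 100000; each addition
# introduces a rounding error, and in FP32 those errors add up fast.
total32 = np.float32(0.0)
total64 = np.float64(0.0)
for _ in range(1_000_000):
    total32 += np.float32(0.1)
    total64 += np.float64(0.1)

print(total32)  # drifts well away from 100000
print(total64)  # essentially 100000
```

This is the kind of accumulated error that, in a long-running scientific simulation, can make the final result worthless, which is why HPC hardware is built around FP64.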
This applies not only to processors but also to GPUs: HPC models include 64-bit floating-point (FP64) units, while gaming models lack them. In addition, in recent times specialized units and accelerators have been incorporated to speed up certain algorithms typical of this kind of computing. These improvements affect not only the CPU but also graphics chips and other supporting coprocessors.