Could GDDR6 Replace the Main RAM of the PC?

One of the most frequently asked questions since Sony and Microsoft presented the specifications of the PS5 and Xbox Series X is: why doesn't the PC use unified memory? That in turn raises a second question: why isn't GDDR6 used as the main memory of a PC, if it works so well in consoles? It is a good question, and one we are going to answer next.

RAM has been called into question for over 20 years. It remains the main bottleneck in a PC (SSDs logically aside), and while the rest of the main components steadily increase their performance, memory manufacturers deliver only small improvements, charged at a premium.


Yet if we look around, the wide range of current memory types can make conventional RAM pale in comparison, and each console generation shows that unified memory is viable. So why not on PC?

Unified memory is a step backwards for a typical PC

APU of an Xbox Series X

Although it may seem otherwise, shared memory is a step backwards for a PC in several respects. It is true that GDDR has been much faster for generations, and that the new GDDR6 and GDDR6X are yet another leap in bandwidth, but the needs a PC has to cover are, for the most part, not related to that factor.

As a very famous slogan put it: "power is nothing without control", and on a PC this holds almost perfectly. A PC is built around a CPU, RAM and GPU as its main actors, each working in its own specific area, so replacing DDR4 RAM with GDDR6 raises a basic problem: access times and availability.

Consoles use this system for one reason only: cost reduction. From the CPU's point of view, however, it is a rather poor arrangement. Although GDDR6 bandwidth is spectacular, its latencies are too high to feed a CPU with high clock frequency and IPC.
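To put rough numbers to that trade-off, here is a minimal back-of-the-envelope sketch in Python. The figures it uses (14 Gbps per pin on a 256-bit GDDR6 bus, DDR4-3200 in dual channel, and the effective latency values) are illustrative assumptions, not measurements of any specific platform.

```python
# Back-of-the-envelope comparison of peak bandwidth vs. access latency.
# All figures below are illustrative assumptions, not measured values.

def gddr6_bandwidth_gbs(gbps_per_pin: float = 14.0, bus_width_bits: int = 256) -> float:
    """Peak GDDR6 bandwidth in GB/s for a given per-pin speed and bus width."""
    return gbps_per_pin * bus_width_bits / 8  # gigabits/s -> gigabytes/s

def ddr4_bandwidth_gbs(mt_per_s: float = 3200, channels: int = 2) -> float:
    """Peak DDR4 bandwidth in GB/s (MT/s * 8 bytes per 64-bit channel)."""
    return mt_per_s * 8 * channels / 1000

def stall_cycles(latency_ns: float, cpu_ghz: float = 4.0) -> float:
    """CPU cycles spent waiting for one memory access that misses all caches."""
    return latency_ns * cpu_ghz

if __name__ == "__main__":
    print(f"GDDR6 256-bit @ 14 Gbps : {gddr6_bandwidth_gbs():.0f} GB/s peak")
    print(f"DDR4-3200 dual channel  : {ddr4_bandwidth_gbs():.1f} GB/s peak")
    # Assumed effective latencies as seen by the CPU (hypothetical values):
    for name, lat_ns in [("DDR4, assumed ~70 ns ", 70),
                         ("GDDR6, assumed ~120 ns", 120)]:
        print(f"{name}: ~{stall_cycles(lat_ns):.0f} cycles stalled at 4 GHz")
```

The point of the sketch is simply that bandwidth and latency are different axes: the GDDR6 configuration wins on raw throughput by a huge margin, but every cache miss still costs the CPU hundreds of idle cycles, and that cost grows with latency regardless of how wide the bus is.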

Caches are extremely fast and need a continuous stream of data so that CPU registers, ALUs and the algorithms running on them never sit idle; any interruption hurts performance and efficiency. In addition, the figure usually quoted for GDDR6 is its bandwidth, which mainly serves to feed the GPU, and that peak is only reached in copy operations; in pure reads and writes its performance drops sharply.

Without very low latency, CPU performance suffers greatly, as happens with the APU cores of the consoles, where the iGPU's performance is also degraded by sharing memory resources with the rest of the system.
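A crude way to picture that contention, purely as a toy model (real memory controllers arbitrate far more intelligently than this), is to split a shared bandwidth budget between the CPU, GPU and I/O. The demand figures below are hypothetical.

```python
# Toy model of bandwidth contention on a shared (unified) memory bus.
# The arbitration here is naive on purpose; real controllers are smarter,
# but every client still eats into the same total budget.

TOTAL_BW_GBS = 448.0  # assumed unified GDDR6 pool: 256-bit @ 14 Gbps

def effective_bandwidth(requests_gbs: dict) -> dict:
    """Scale each client's request down proportionally if the bus is oversubscribed."""
    demand = sum(requests_gbs.values())
    scale = min(1.0, TOTAL_BW_GBS / demand)
    return {client: want * scale for client, want in requests_gbs.items()}

# Hypothetical demands during a heavy game scene:
granted = effective_bandwidth({"GPU": 400.0, "CPU": 60.0, "I/O": 20.0})
for client, bw in granted.items():
    print(f"{client}: ~{bw:.0f} GB/s of requested bandwidth actually served")
```

Even in this simplistic model nobody gets what they asked for once total demand exceeds the pool, which is essentially the situation the console APUs live with.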

GDDR6 would cause compatibility and upgrade issues

GPU of the Xbox Series X

GDDR6 is a type of memory whose electrical signaling has very tight margins in order to reach these speeds. On a PC that would mean one of two things: soldering the chips onto the motherboard, as GPUs currently do on their PCBs, or creating a dedicated bus with external modules that would likewise have to be soldered to it.

Harmonic noise, signal jitter and access times would rule out a format like that of ordinary RAM (slotted), based on removable, replaceable modules that allow upgrades.

GDDR6 also requires a much higher-quality PCB to maintain signal integrity at every module, and both the CPU and GPU would have to sit physically very close to the memory chips. The interconnect would have to be a direct bus, with no PCIe in between, something that is already complicated today with SSDs hanging off that bus.

Interior of a PC

Integrating them that way would add yet another source of latency, since accessing and pulling data across such a bus carries a heavy penalty; not for nothing does running out of VRAM on a current GPU, and spilling over into system RAM, cause a very visible performance loss.
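As a rough illustration of why that spill-over hurts, compare the bandwidth of local VRAM with what a PCIe link can deliver. The values below are generic ballpark figures, not a benchmark of any particular card.

```python
# Rough comparison of reading assets from local VRAM vs. fetching the same
# data from system RAM across PCIe. Ballpark figures, not measurements.

LOCAL_GDDR6_GBS = 448.0   # assumed 256-bit GDDR6 @ 14 Gbps
PCIE4_X16_GBS = 32.0      # theoretical PCIe 4.0 x16 (~2 GB/s per lane)

def transfer_time_ms(megabytes: float, bandwidth_gbs: float) -> float:
    """Milliseconds needed to move `megabytes` at `bandwidth_gbs` GB/s."""
    return megabytes / 1024 / bandwidth_gbs * 1000

assets_mb = 512  # hypothetical texture set that no longer fits in VRAM
print(f"From local GDDR6 : {transfer_time_ms(assets_mb, LOCAL_GDDR6_GBS):.2f} ms")
print(f"Over PCIe 4.0 x16: {transfer_time_ms(assets_mb, PCIE4_X16_GBS):.2f} ms")
```

Moving the same half gigabyte takes roughly an order of magnitude longer over the expansion bus than from memory sitting next to the GPU, which is exactly the penalty the paragraph above describes.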

This would bring one more problem: upgrading that type of memory. Such a system would mean changing the motherboard and the GPU at the same time, since the latter would have to be built into (soldered onto) the former, so modularity and upgradeability would be considerably reduced and the overall cost would increase.

Therefore, opting for such a change would be totally counterproductive: it would hurt CPU performance and, on what is today a fully modular system, it would restrict component upgrades and make them more expensive as well.