Expectations are highest for the performance of NVIDIA's new Ampere graphics cards, but one factor is being overlooked in the discussion. Performance will rise to a greater or lesser extent, yet the specifications being leaked do not seem to show any increase in VRAM. Will it be enough to play at 4K and 144 Hz?
When buying a high-end or upper-mid-range graphics card, one of the factors to consider is precisely the amount of VRAM. Although, as we always say, current engines stream textures with direct access to system RAM through their own RAM-cache systems and the like, hardware requirements for very high resolutions and refresh rates are starting to skyrocket.
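To make the idea of engine-side texture caching concrete, here is a minimal sketch of a budgeted cache that keeps textures resident in VRAM up to a byte limit and evicts the least-recently-used ones back to system RAM. The class, its methods, and the budget figure are all illustrative assumptions; real engines use far more elaborate streaming heuristics.

```python
from collections import OrderedDict


class TextureCache:
    """Illustrative sketch (not any real engine's API): textures stay
    resident in VRAM up to a byte budget; when a new texture does not
    fit, the least-recently-used ones are evicted back to system RAM."""

    def __init__(self, budget_bytes: int):
        self.budget = budget_bytes
        self.resident = OrderedDict()  # texture id -> size in bytes

    def request(self, tex_id: str, size: int) -> None:
        if tex_id in self.resident:
            # Already resident: just mark it as recently used.
            self.resident.move_to_end(tex_id)
            return
        # Evict least-recently-used textures until the new one fits.
        while self.resident and sum(self.resident.values()) + size > self.budget:
            self.resident.popitem(last=False)
        self.resident[tex_id] = size


# Pretend an 11 GB card reserves 8 GB of its VRAM for textures.
cache = TextureCache(budget_bytes=8 * 2**30)
```

This is why developers can "fall short" on resident textures without visible cost: anything evicted is simply streamed back in from system RAM when requested again.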
Will 11 or 12 GB of VRAM be enough for Ampere?
Leaving the TITAN cards aside for obvious reasons, NVIDIA's high end has been stuck at 11 GB since 2017. The curious thing is that almost no game consumes that amount, or to be more specific, practically no game is capable of loading that many textures into VRAM.
This is no coincidence, of course, and goes back to what was discussed two paragraphs above: developers prefer to fall short rather than overshoot when exporting textures into that memory, so they rely on system RAM with a view to post-processing and scene rendering.
Ray tracing and DLSS 2.0 seem to be the main incentives to rethink how much VRAM will be needed in a few years, especially as 4K 144 Hz monitors gradually gain market share and prices come down. So the obvious question is: do we need 12 or 16 GB for the foreseeable future?
Could consoles lead the way, or maybe not?
Although both Sony and Microsoft include 16 GB of GDDR6, it is not yet clear how the former will use its unified memory, while the latter splits it into two pools: 10 GB of faster memory and 6 GB of slower memory.
One might think this leaves us with a scenario more or less similar to the current one, but the requirements for 4K at 144 Hz are much higher, and the consoles obviously will not reach those figures this generation. Many users report very high VRAM usage, so Ampere could, or should, include more of this type of memory, but at the same time we have to keep other key factors in mind:
- Total bandwidth.
- GDDR6 speed.
Although it may not seem like it, higher bandwidth, whether from a wider bus or from faster GDDR6, means fewer textures need to be kept resident, because data moves through the GPU at a higher overall speed. This also reduces latency when carrying out that work and cuts down on the amount of texture traffic exchanged with system RAM.
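The relationship between the two factors above can be sketched numerically: peak memory bandwidth is the bus width times the effective per-pin data rate, divided by eight bits per byte. The card figures below are for a current RTX 2080 Ti-class configuration, not confirmed Ampere specs.

```python
def memory_bandwidth_gbps(bus_width_bits: int, effective_speed_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bits multiplied by
    the effective GDDR6 data rate per pin (Gbps), divided by 8 bits/byte."""
    return bus_width_bits * effective_speed_gbps / 8


# RTX 2080 Ti-class setup: 352-bit bus with 14 Gbps GDDR6.
print(memory_bandwidth_gbps(352, 14))  # 616.0 GB/s
```

Either widening the bus or raising the GDDR6 speed lifts the result, which is why a new architecture can get away with the same VRAM capacity while feeding the GPU faster.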
Therefore, the amount of VRAM on an Ampere card may stay the same, but as we have seen when comparing past architectures, new algorithms and greater bandwidth can reduce VRAM usage considerably; just remember Maxwell and its improved color compression.
So this debate is more about the future than the present. Is it worth paying more for a model with more VRAM? In general, yes. But is it worth paying more for a high-end model with more VRAM? That is the real question, and each user will have to weigh it against their budget and how long they plan to keep the card.
As a general rule, if you are one of those who upgrade every 4 or 5 years (or more), extra VRAM will always be welcome; if instead you upgrade your GPU more often, VRAM is not as decisive, and what matters is the overall power of the graphics card itself.