Resolution vs Image Quality on a PC Monitor

A few years ago, when most of us had 1024 x 768 or even 1280 x 1024 monitors, manufacturers told us they looked blurry and that to get the best image quality we needed to jump to Full HD resolution (1920 x 1080). Now that 4K is on everyone's lips, it turns out that Full HD is what looks blurry, and that we need to buy a higher-resolution monitor to get the best image quality. To what extent is this true?

We could dismiss this as a marketing strategy, and to a large extent it is exactly that: manufacturers have built better monitors and want to sell them, so they tell us that what we have now is no longer good enough and that we need to spend money on their new product. But are we really also getting better image quality?


Does higher resolution mean better image quality?

If we stick to the numbers, a Full HD screen has 1920 x 1080 pixels, that is, slightly more than 2 million pixels. An Ultra HD display has a resolution of 3840 x 2160 pixels, about 8.3 million pixels, four times as many. This means we are packing many more pixels into the same space, so we get higher definition, as long as we are talking about the same screen size.
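As a quick check of the arithmetic, a couple of lines of Python (purely illustrative) confirm the pixel counts and the four-to-one ratio:

    full_hd = 1920 * 1080     # 2,073,600 pixels, slightly more than 2 million
    ultra_hd = 3840 * 2160    # 8,294,400 pixels, about 8.3 million
    print(ultra_hd / full_hd) # exactly 4.0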

Full HD vs UHD

And screen size is an important piece of information, because what actually gives us definition is not the resolution or the raw number of pixels, but the pixel density.

Pixel density is the key factor for definition

The pixel density of a monitor is often quoted in dots per inch (DPI), but since DPI strictly refers to the number of dots along a one-inch line of a printed or scanned image, the term has been superseded for displays by PPI (pixels per inch). Although PPI is the correct term for monitors, the two are often used interchangeably.

Dots per inch

Pixel density is important because it largely determines image quality: in general, a higher density produces sharper images. Let's look at some examples, using as a reference a 27-inch monitor, a very common size today (a short sketch after the list shows how these figures are derived):

  • A 27-inch 720p monitor would have about 54 PPI.
  • A 27-inch 1080p monitor has a density of about 81 PPI.
  • If the monitor has 1440p resolution, its density would be about 108 PPI.
  • If we go to 4K resolution, the density rises to 163 PPI.
  • A 27-inch 8K monitor would have a density of 326 PPI.
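The figures above come from a simple formula: the length of the screen's diagonal in pixels divided by its diagonal in inches. A minimal Python sketch (the ppi helper name is our own, purely illustrative) reproduces the list:

    import math

    def ppi(width_px, height_px, diagonal_inches):
        """Pixels per inch: pixel diagonal divided by physical diagonal."""
        return math.hypot(width_px, height_px) / diagonal_inches

    # The 27-inch examples from the list above
    resolutions = {
        "720p":  (1280, 720),
        "1080p": (1920, 1080),
        "1440p": (2560, 1440),
        "4K":    (3840, 2160),
        "8K":    (7680, 4320),
    }
    for name, (w, h) in resolutions.items():
        print(f"{name}: {ppi(w, h, 27):.1f} PPI")  # 54.4, 81.6, 108.8, 163.2, 326.4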

To put this data in perspective, imagine you have two monitors side by side, both with Full HD resolution, but one 32 inches and the other 27 inches. If you look closely you will see an obvious difference in the size of the pixels: although both monitors have the same number of them, the density of the 27-inch model is higher, its pixels are smaller, and it therefore provides better definition.

Another example regarding density: imagine, exaggerating, a 1000-inch 4K monitor. In that case the pixels would be enormous because the density would be very low, and the image quality would be poor no matter how "4K" the panel is.

So the higher the pixel density the better?

The answer is yes and no. In general, a higher pixel density is better because it provides better image definition, but there is a point of diminishing returns. As the density keeps increasing, the observable advantages become less and less evident, to the point of being imperceptible to the human eye.

In the example above, the 27-inch Full HD monitor has a density of about 81 PPI, while the 32-inch monitor has about 69 PPI. In this situation it is safe to say there will be observable differences between the two. But if we compared two 24-inch monitors, one 4K and the other 8K, the difference would be imperceptible, and yet rendering images at 8K obviously has a much higher performance cost than doing it at 4K.
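These densities fall out of the same formula; a quick check with the illustrative ppi() helper from the sketch above:

    print(round(ppi(1920, 1080, 27), 1))  # 81.6 PPI  (27-inch Full HD)
    print(round(ppi(1920, 1080, 32), 1))  # 68.8 PPI  (32-inch Full HD)
    print(round(ppi(3840, 2160, 24), 1))  # 183.6 PPI (24-inch 4K)
    print(round(ppi(7680, 4320, 24), 1))  # 367.2 PPI (24-inch 8K)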

The exact density at which the human eye stops perceiving improvements is still up for debate. Some experts put the figure at around 400 PPI, others say 1,000, and most users are satisfied with well under 200. Wherever that threshold lies, what is clear is that beyond a certain point a higher pixel density is simply no longer noticeable.
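Part of why these figures vary so much is viewing distance, which each estimate implicitly assumes. As a rough illustration only, assuming the common rule of thumb that 20/20 vision resolves about one arcminute (an assumption on our part, not a figure from the experts cited above), the density beyond which extra pixels stop being distinguishable can be approximated like this:

    import math

    def threshold_ppi(viewing_distance_inches):
        """Rough PPI beyond which adjacent pixels blur together for 20/20 vision,
        assuming a resolving power of about one arcminute (an assumption)."""
        one_arcminute = math.radians(1 / 60)   # ~0.00029 rad
        return 1 / (viewing_distance_inches * one_arcminute)

    print(round(threshold_ppi(24)))  # ~143 PPI at a desk-like distance of ~24 in
    print(round(threshold_ppi(10)))  # ~344 PPI at a phone-like distance of ~10 in

Under that assumption, a monitor viewed from arm's length hits the limit well below 200 PPI, while a screen held close to the face needs far more, which would explain the spread in the quoted figures.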

Image quality: linked to resolution, but not only to resolution

Returning to the main topic, according to Dolby, the image quality that people perceive (because, after all, perception is subjective) depends mainly on three factors:

  • The number of pixels (and their density, as we have explained).
  • The rate of frames per second.
  • The performance of the pixels themselves.

It is this last point that monitor panel manufacturers have been emphasizing lately, since, according to Dolby, if the pixels can be made to reproduce a wider dynamic range and a larger color space even with the same pixel count, the perceived quality improves. And as far as we are concerned, we could not agree more: image quality is not only about resolution, and in terms of user experience, frame rate and associated technologies such as HDR have a great influence.