The push for higher resolutions rages on, and 4K has quickly become the standard for a high-end gaming PC. When I was fortunate enough to find an RTX 3090 in stock, a card I knew could handle 4K, I figured I had to upgrade my monitor to match.
But after using a 4K monitor for a few months, I'm already back at 1440p. Here's why I don't plan on returning to 4K.
The RTX 3090 situation
I’ve already drawn the ire of tired gamers hunting for a graphics card during the GPU shortage, but I still need to set the scene. I own an RTX 3090. It’s not a review sample, I didn’t get it for free, and I didn’t buy it through some strange connection. I saved up for nearly a year, waited patiently as my attempts to buy a graphics card were thwarted by scalpers, and eventually waited in line for nearly four hours at a local Best Buy restock.
I’m lucky, even considering how much time I spent trying to hunt down a graphics card. And although I’d never spend $1,600 on a graphics card normally, that price didn’t seem too bad when stacked up against scalper rates. I consider it an investment — hopefully I won’t have to go through this process again next generation.
The RTX 3090 is the most capable 4K graphics card currently on the market. And even with the RTX 3090 Ti looming, it’ll likely stay that way until the next generation arrives. I spent a lot of money on my RTX 3090, and I wanted to finally upgrade from 1440p to 4K. But even with the most power you can get from a GPU right now, 4K still doesn’t feel worth it.
Get your performance in check
Consoles screw everything up. In the final years of last-gen consoles, 4K gaming was all the rage. Of course, the Xbox One and PlayStation 4 aren’t powerful enough for real 4K, so games made for those platforms use some upscaling method to approximate 4K. Most PC games don’t have that luxury.
There are tools to help with this, such as Nvidia Image Scaling and AMD Radeon Super Resolution, but for the most part, your PC will render every pixel for whatever resolution you select. And 4K is a lot of pixels, around 8.3 million to be exact, while 1440p is significantly lower at about 3.7 million and Full HD is just over 2 million.
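If you're curious where those figures come from, it's simple multiplication of width by height. Here's a minimal sketch (the resolutions are the standard ones named above; the script itself is just an illustration):

```python
# Rough pixel-count comparison for common gaming resolutions.
resolutions = {
    "Full HD (1920 x 1080)": (1920, 1080),
    "1440p (2560 x 1440)": (2560, 1440),
    "4K (3840 x 2160)": (3840, 2160),
}

for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels:,} pixels (~{pixels / 1e6:.1f} million)")
```

Run it and you get roughly 2.1, 3.7, and 8.3 million pixels, which is why the jump from 1440p to 4K more than doubles the work your GPU does every frame.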
That’s tough on PCs, even ones packed with an RTX 3090. My personal rig has an Intel Core i9-10900K, the RTX 3090, and 32GB of memory. In my main game, Destiny 2, I hover between 70 and 80 frames per second (fps) at 4K. In demanding AAA titles like Assassin’s Creed Valhalla, the card just barely skirts past 60 fps.
Those results aren’t bad, but they’re not what I expected from what is supposed to be the most performant GPU on the market. Bumping down to 1440p is much more forgiving: 95 fps in Assassin’s Creed Valhalla, and a locked 144 fps in Destiny 2 to match my monitor’s refresh rate.
Simply put, 4K is still too demanding, even for top-of-the-line hardware. The good news is that you can have your cake and eat it, too. By understanding the pixel density of your monitor, you can improve performance without giving up much visual fidelity.
Forget resolution — talk pixel density
Resolution is the name of the game with monitors, but you should always consider it in the context of pixel density, which describes how many pixels fit into each inch of screen at a given size. Take a 4K 55-inch TV and a 4K 32-inch monitor, for example. Both have the same number of pixels, so the pixels on the 55-inch TV will be larger.
The lower the pixel density, the easier it is to make out individual pixels. We want high pixel density, where the individual pixels are smaller and therefore harder to make out. Raising the resolution increases pixel density, while increasing the screen size lowers it, so you can end up with a similar pixel density at two very different resolutions.
In my case, I moved from a 32-inch 4K display to a 27-inch 1440p one. A 32-inch 4K display has a pixel density of about 138 pixels per inch (PPI). A 27-inch 1440p display, meanwhile, has a pixel density of around 109 PPI.
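If you want to check those PPI figures yourself, the math is just the diagonal resolution in pixels divided by the diagonal screen size in inches. Here's a quick sketch (the function name is mine; the example displays are the two monitors discussed above):

```python
from math import hypot

def pixels_per_inch(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixel density: diagonal length in pixels divided by diagonal size in inches."""
    return hypot(width_px, height_px) / diagonal_inches

# The two displays compared in this article.
print(round(pixels_per_inch(3840, 2160, 32)))  # ~138 PPI for a 32-inch 4K monitor
print(round(pixels_per_inch(2560, 1440, 27)))  # ~109 PPI for a 27-inch 1440p monitor
```

The same formula gives about 68 PPI for the 65-inch 4K TV mentioned below.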
That’s a bit lower, but remember that 4K equals about 8.3 million pixels, while 1440p only has about 3.7 million pixels — less than half. Sure, my 27-inch 1440p display doesn’t have the same pixel density as my old 32-inch 4K one, but it’s damn close considering the pixel gap between the two resolutions.
Also consider that most TVs have far lower pixel density than monitors. A 65-inch 4K TV only has a pixel density of about 68 PPI. You sit closer to a computer monitor, but the point remains: You really don’t need a 4K monitor for great image quality.
Technical bits aside, the point about pixel density is that it’s absolutely vital to consider the screen size for a given resolution. In the case of my two monitors, the 27-inch 1440p one looks nearly as sharp as my old display because it’s smaller. Understanding pixel density allows you to achieve the image quality you want without just grabbing the highest resolution you can afford.
Still not prime time for 4K
Native 4K still puts even the latest hardware in its place. Only a handful of graphics cards, the RTX 3090 and the 12GB RTX 3080 among them, can manage 4K in the most demanding titles, and the trade-off in performance compared to 1440p usually isn’t worth it. Take pixel density into account, and you can have image quality and performance without compromising.
With more than double the pixels of 1440p, 4K still offers more detail at the same screen size. But that extra detail doesn’t usually matter on common monitor sizes, and for gaming, performance is still king.