Will the cloud replace hardware?
Cloud gaming has come a long way over the past decade, and today it’s a solid alternative to native play. GeForce Now, Xbox Game Pass, and PlayStation Plus all offer relatively affordable ways to play large libraries of PC and console games without needing to own the proper hardware… or, in some cases, even the games.
As of this writing, cloud gaming is far from replacing local gaming. There are several reasons why, but for the gamers in this study, the biggest is latency. 62% of respondents (mostly Millennial and Gen Z) said they would switch to cloud gaming full-time instead of playing on their own hardware if latency were “eliminated.”
Unfortunately, that’s just not going to happen. While modern hardware and networking are fast, a cloud stream still has to encode each frame, send it across the internet, and decode it on your screen, and each of those steps adds milliseconds that local rendering on your own machine simply never pays.
However, when respondents were asked a broader question, a sizeable number (42%) said they’d skip upgrading their graphics cards if “their needs were met” by either cloud gaming or AI upscaling. That’s a much more achievable goal for cloud providers who want to deliver a premium remote gaming experience.
Around 20% of Millennial and Gen Z gamers believe that high-end GPUs will become less essential in the next three years because of cloud gaming and ongoing improvements to AI upscaling technologies like DLSS and FSR. Meanwhile, nearly 60% are still holding out for a GPU upgrade to improve their gaming experience.
I’m not entirely sold on AI upscaling as a cure-all, but frame generation has made some impressive leaps lately. If Nvidia keeps its focus on AI and can’t figure out how to keep its GPUs in stock, we may all be relying more on cloud and AI features before long.