I have seen the future of PC gaming on Nvidia hardware, and it looks pretty slick.
But I have to admit, I’m left wondering: is it worth the price?
Last week, I stopped by the Game Developers Conference in San Francisco to meet with Nvidia, so I had a chance to see laptops and PCs packing Nvidia GeForce RTX 50-series cards running some of the latest games.
GDC demos are always interesting because the event is for game developers, not journalists, so there’s often a lot of talk about topics like rendering pipeline optimization. And sure enough, I got a few demos showcasing how Nvidia is improving its software suite to help devs make better-looking games with less work.
I also had the chance to see and go hands-on with games like Avowed, Cyberpunk 2077 and Hogwarts Legacy at special demo stations designed to showcase Nvidia’s new 50-series cards, which is where I finally got to glimpse how these cards will handle in real life in some of the best gaming PCs.
As you might expect, these stations had two nearly identical PCs set up running the same game. Each pair differed in only one way: either one used an older RTX 40-series card while the other had a 50-series version, or the two PCs were identical save that one had a specific Nvidia feature enabled while the other did not.
These GDC setups gave me the opportunity to look closely and see exactly how much difference enabling DLSS 4 features like Multi-Frame Generation makes when you’re playing a game. And the quick answer is that it does seem to make a difference, delivering a significant framerate boost.
When I first read about Multi-Frame Generation, which inserts up to 3 “AI-generated” frames of gameplay for every frame of “real” gameplay your GPU generates, I was a little concerned turning it on would make games feel imprecise and slower to respond.
But during my short time testing demos at GDC, I didn’t notice any real difference in the way games like Hogwarts Legacy “felt” to play with Multi-Frame Generation on or off. The game also moved smoothly, with no noticeable graphical bugs or framerate dips to indicate Multi-Frame Generation was enabled.
And golly, what a framerate. Nvidia had analytics tools running at every station that displayed a real-time readout of data like frames per second, GPU/CPU utilization and input lag, so I didn’t have to rely on my bad eyes to try and suss out how many FPS each game was hitting.
According to Nvidia’s data, enabling DLSS 4 with Multi-Frame Generation while playing Hogwarts Legacy on a rig with an Nvidia GeForce RTX 5070 Ti nearly quadrupled the framerate (from 57 FPS to 221 FPS) without any noticeable dip in image quality.
I also saw Nvidia’s DLSS 4 demo running Cyberpunk 2077 on a Razer Blade 16 with an RTX 5090 next to an identical laptop with an RTX 4090, and as reported, the FPS jump (88 FPS vs. 186 FPS) is big and real. But what’s exciting about that boost is not just that the game runs smoother and faster, but that I didn’t notice any of the small graphical issues I sometimes catch when using DLSS in older games.
I tried a few different demos during my time there, but the FPS jump you get from Multi-Frame Generation struck me as the most impactful upgrade of the bunch. I also got to see examples of how the DLSS 4 renderer looks a bit nicer than the DLSS 3 version, how it handles reflections better with path-traced lighting and how response times can improve with RTX Reflex enabled.
But the thing that stuck with me was what a difference Multi-Frame Generation can make in a game, at least in terms of raw frames per second.
For a long time I didn’t recommend anything faster than a 144Hz or 160Hz display to anyone but the most hardcore speed freaks, because most people rarely top 120 FPS while gaming. But now that the RTX 50-series cards are nigh, I think we’re going to see more demand for displays capable of hitting 240Hz or faster.
Because when you’re able to easily hit framerates of 180+ FPS, getting a good gaming monitor or gaming laptop display with a high refresh rate and Nvidia’s G-Sync tech will be key to having the optimal experience.
Bottom line
Even though the performance of games I saw running on RTX 50-series cards impressed me this week, I have to say seeing them in person made me less interested in buying a 50-series GPU.
A big part of that has to do with the premium pricing on Nvidia hardware. Extra frames per second matter to me when a game is running at 60 FPS or below, but once framerates climb to 120 FPS and higher I personally have a hard time telling the difference. It’s there, to be sure, but I can’t see it.
No, for me, I just didn’t see enough of a noticeable difference in modern games to justify spending upwards of $750 on a new graphics card.
Maybe down the road games will take better advantage of these features and the gap between RTX 40-series and 50-series performance will be more noticeable to my bad eyes, but in person I had to have the key improvements in DLSS 4 performance pointed out to me by a professional before I could see them.
And right now I’m regularly reviewing gaming laptops and PCs with 40-series GPUs (like this Acer Predator Orion 5000) that cost around $2,000 and deliver killer performance in the best PC games on the market.
So while I have seen the future of PC gaming via Nvidia and stand impressed, I think the key selling points of the RTX 50-series cards make them a luxury for all but the most demanding gamers.
But while I expect prices of Nvidia GeForce RTX 50-series cards and laptops to stay high for some time, that shouldn’t dissuade you from buying one, especially if you plan to build a 4K gaming rig. If you want the best PC gaming experience possible, a 5070 Ti or better will go a long way towards making that happen.
Nvidia’s RTX 40-series cards support nearly all the features of DLSS 4 (save Multi-Frame Generation) and still run modern games quite well, so for my money I’m more excited to try and snag a discounted RTX 40-series card this year than to try and compete for a chance to blow hundreds of dollars more on performance I’ll barely notice.