I Tested the GeForce RTX 5090 for Laptops. It’s the New Fastest Mobile GPU

By News Room | Published 27 March 2025, last updated 10:12 AM

Hot on the heels of their desktop counterparts, which launched this January following their announcement at CES 2025, Nvidia’s GeForce RTX 50-series GPUs have officially arrived in laptops. Mobile gamers who have been patiently waiting to see what these chips can do while mulling over a new laptop purchase can finally get some definitive answers.

GeForce RTX 50 cards for desktops may be pricey and scarce at the moment, but laptops equipped with the mobile versions of these chips go on sale March 28. To carry out my first performance tests on this new platform, Nvidia loaned me a brand-new Razer Blade 16 laptop outfitted with a GeForce RTX 5090 GPU. This implementation combines high-end laptop hardware with top-end silicon, demonstrating the power potential of such top-of-the-line systems before more-affordable midrange and budget solutions emerge later. (The first RTX 50-series laptops to hit the street will have RTX 5090, RTX 5080, and RTX 5070 Ti GPUs, with RTX 5070s coming later.)

(Credit: Joseph Maldonado)

On the desktop side, the RTX 5090 is the tip-top card of Nvidia’s latest generation, so I was excited to put the mobile version through its paces and see what the RTX 50 series and its new “Blackwell” architecture can do for laptops. First, it’s important to explain what Blackwell brings to laptop graphics.


Nvidia ‘Blackwell’ Explained: RTX 50 GPUs Bring DLSS 4 to Laptops

We covered the initial announcement of RTX 50-series laptop GPUs during CES 2025, so while we don’t need a complete rehash of the platform here, I have several relevant aspects to touch on before getting into the testing.

On desktop and mobile, the RTX 50 series employs Blackwell-based graphics processors. Blackwell is Nvidia’s latest microarchitecture, which succeeds the RTX 40-series’ “Lovelace” design. Blackwell’s design includes fifth-generation machine-learning Tensor cores, fourth-generation ray-tracing (RT) cores, and newly added support for GDDR7 memory.

The Nvidia GeForce RTX 5090 mobile GPU (Credit: Nvidia)

The platform is also more efficient, allowing for these improvements to be incorporated even into thin-and-light laptops. (Indeed, Nvidia sending over the always-thin Razer Blade as the test sample is something of a vote of confidence in Blackwell’s ability to perform in a slimmer machine.) Blackwell doubles down on Nvidia’s focus on AI-empowered GPUs with advancements in neural rendering rather than aiming for pure transistor count and raw horsepower increases.

This development is most relevant for gaming, manifesting primarily in DLSS 4 introduced with these GPUs, though you’ll find other related technologies at play during benchmarking that we’ll touch on, too. Deep Learning Super Sampling (DLSS) has been around for years at this point—read our DLSS explainer to catch up, if you need to—and we’ve already done a deep DLSS 4 performance dive on the RTX 50 desktop graphics cards to gauge its impact.

In a nutshell, DLSS (driven by the Tensor cores on local hardware) is a machine-learning technology that can do two things. First, DLSS can turn lower-resolution frames into higher-resolution images without the GPU rendering them natively at that resolution—instead, the upscaled image is produced algorithmically from the original frame data. Second, the technology can apply similar algorithmic techniques to generate artificial frames between originally rendered frames to boost the effective number of frames displayed per second.

The other side of an Nvidia GeForce RTX 5090 Laptop GPU (Credit: Nvidia)

DLSS 4 has become incredibly adept at the latter, thanks to Multi-Frame Generation (MFG) in supported games and Blackwell’s more efficient Tensor Cores. MFG can generate up to three additional frames for every traditionally rendered frame, significantly improving frame-rate counts. The new AI model also accomplishes this while running (according to Nvidia) 40% faster and using 30% less VRAM than the previous version.
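
To make the frame-rate math concrete, here is a minimal sketch of how displayed frame rate scales when frame generation inserts AI frames between rendered ones. The function name and the 40fps figure are purely illustrative assumptions rather than Nvidia data, and real-world results will come in lower because running the AI model has its own overhead.

```python
# Illustrative sketch only: how displayed fps scales when frame generation
# inserts AI-generated frames between traditionally rendered frames.
def displayed_fps(rendered_fps: float, generated_per_rendered: int) -> float:
    # Each rendered frame is followed by N generated frames.
    return rendered_fps * (1 + generated_per_rendered)

# Hypothetical example: a game rendering 40 fps traditionally.
print(displayed_fps(40, 1))  # 80.0  -> classic Frame Generation ("2X")
print(displayed_fps(40, 3))  # 160.0 -> Multi-Frame Generation ("4X"), before overhead
```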

This technology also works via a newer AI “Transformer” model, moving on from the preceding Convolutional Neural Network (CNN) design, which Nvidia believes has reached its limits. This more intelligent architecture generates higher-quality and more stable pixels via supplemental DLSS 4 features such as RTX Neural Shaders, Ray Reconstruction, and Super Resolution. The new hardware and software improvements together can generate up to 15 out of every 16 pixels if your goal is as many frames (rendered or generated) as possible. (For more on some of these backing technologies, check out our rundown of the GeForce RTX 50-series-related tech that Nvidia outlined at CES 2025.)
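
That 15-out-of-16 figure follows from simple arithmetic, sketched below under one assumption the article doesn’t spell out: DLSS’s Performance upscaling mode, which renders each frame at half the output width and height, or one quarter of the output pixels.

```python
# Rough arithmetic behind "up to 15 out of every 16 pixels" being generated.
# Assumption (not stated above): DLSS Performance mode renders at half the output
# width and height, so only 1/4 of each displayed frame's pixels are rendered.
pixels_rendered_per_frame = 0.5 * 0.5   # 1/4 of the output pixels
frames_rendered = 1 / 4                 # with 4X MFG, 1 in 4 displayed frames is rendered

rendered_fraction = pixels_rendered_per_frame * frames_rendered
print(rendered_fraction)                # 0.0625 = 1/16 rendered, i.e. 15/16 generated
```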


Schools of Thought on DLSS

Those are the more technical aspects of Blackwell and its new technologies; if you’re not as interested in this side of things, suffice it to say that DLSS 4 aims to improve gaming frame rates well beyond its predecessor while preserving as much picture quality as possible and letting you keep visual settings high.

These factors, though, will affect real-world performance and the benchmark results in this article. I should acknowledge that, between resolution upscaling and artificially generated frames, gamers differ on whether these improvements really “count” as generational performance, not to mention whether gaining access to these artificial frame-rate gains merits buying a new GPU. (Bear in mind that a game has to actually support the DLSS flavor in question for you to make use of it.)

Razer Blade 16 (2025) (Credit: Joseph Maldonado)

Other gamers, especially those on lower-end systems, will take any practical frame-rate improvement. Regardless of where your opinion falls, this technology can be key in getting high-fidelity games (or demanding features like path tracing) to run smoothly, so even if you see a quality trade-off, it may be worth it. I don’t have a philosophical opposition to these methods of attaining higher frame rates, but they’re not without caveats or downsides, which I’ll get into later.

You’ll find objective (rendered versus inserted frames) and subjective (how one feels about the image quality of DLSS upscaling) aspects to all of this, but the algebra is also a bit different on laptops versus desktops. You can’t un-bake a laptop GPU from the laptop itself, so to speak. You’re not buying the GPU itself; it’s only one part of a laptop’s total cost.

Additionally, unlike on a desktop, as laptop components wear down or age out, most of them are not replaceable, so you’ll need to buy a whole new system and, thus, a new GPU anyway. When you’re ready for a new machine, laptops with the most recent GPU generation are what you’ll find on the market.

Razer Blade 16 (2025) (Credit: Joseph Maldonado)

Generation-over-generation improvements still matter to the enthusiast crowd, and I have those comparisons below. However, the reality for most shoppers is a bit different. Nvidia shared the stat that 70% of laptop owners are currently running systems with the RTX 30 series or older. The question of whether 40-series owners should upgrade to the latest generation is only a tiny piece of the picture. I’ll look at both the generational improvement and the concrete results the RTX 50 series can achieve regardless of what GPU you’re currently running.


GeForce RTX 5090 Performance Testing: The New Graphics King Arrives

Again, Nvidia sent over the slick Razer Blade 16 with an RTX 5090 GPU to run our first tests of this new generation. Currently, Nvidia has announced the RTX 5090, RTX 5080, RTX 5070 Ti, and RTX 5070 mobile GPUs in descending power order. As mentioned, this first wave of laptops will include the three higher-tier GPUs, with RTX 5070 laptops coming a bit later in 2025.

For those with smaller budgets: since the RTX 5090 and RTX 5080 arrived first, benchmark results for the rest of the 50-series lineup will take longer to publish.

As for the Blade 16 in particular, this laptop line is always premium, so even Razer’s “least expensive” RTX 5070 Ti model starts at $2,999.99 with an AMD Ryzen AI 9 365 processor, 32GB of memory, and a 1TB SSD. Upgrading to the RTX 5080 costs $3,499.99 and keeps the other specs the same (though you’re free to upgrade the RAM and storage up to 64GB and 4TB, too).

Choosing the RTX 5090 configuration I’ve tested here forces a bump to the Ryzen AI 9 HX 370 CPU and a double-up on storage to a 2TB SSD, all for a whopping $4,499.99. All models include a 240Hz QHD+ (2,560-by-1,600-pixel) display. The other RTX 5090 laptops I’ve seen available for order before launch are all above $4,000, too; some may drop below that mark, and other RTX 50-series GPUs will appear in less expensive systems. But the RTX 5090 will almost always pair with top-end parts and features because of its own costly nature and high throughput.

Razer Blade 16 (2025) (Credit: Joseph Maldonado)

The new Blade I have in hand runs at 135 watts TGP (plus 25W of Dynamic Boost, for 160W total) on Razer’s Performance Mode, which I tested. Below, you’ll note the new GDDR7 VRAM debuting, versus the RTX 4090’s GDDR6 memory, plus the RTX 5090’s increased memory allotment (24GB).

To test Nvidia’s latest top-end GPU, I put it through our usual graphics and gaming benchmark suite and followed up with some anecdotal testing to account for the new features. Here are the systems I’ll compare the results against…

These are ordered by relevance as comparisons to the Blade 16 and its RTX 5090, starting with the 2024 Razer Blade with its RTX 4090—a near-perfect generational battle. Next, I’ve included the hulking MSI Titan 18 HX as a best-case scenario for the RTX 4090 (this is a thick, super-powered laptop with loads of thermal headroom) and then a slightly older (13th Gen) Razer Blade 18 with an RTX 4080. Finally, I also included the Asus ROG Zephyrus G16 to show how an RTX 4070 stacks up against the higher-end chips in other thin 16-inch systems.

Most of our tested gaming laptops are Intel-based, so the processor matchups aren’t 1:1 (except for the Zephyrus), but these are all top-of-the-line, performant chips. Additionally, while I wish we could directly compare RTX 3090 performance, we have long since updated our benchmark suite from when we were testing RTX 30-series laptops, and I no longer have any on hand to retest with the new games. Those looking to upgrade from the 30 series or older must judge the RTX 5090’s performance on its own terms to see if the frame rates are enough to make the jump.

Graphics and Gaming Tests

Here are the benchmark results, with descriptions of each test preceding them—first, our usual graphics test suite for laptop reviews.

We challenge all test laptops’ graphics with a quartet of animations or gaming simulations from UL’s 3DMark test suite. Wild Life (1440p) and Wild Life Extreme (4K) use the Vulkan graphics API to measure GPU speeds. Steel Nomad’s regular and Light subtests focus on APIs more commonly used for game development, like Metal and DirectX 12, to assess gaming geometry and particle effects. We also turn to 3DMark’s Solar Bay to measure ray tracing performance in a synthetic environment. This benchmark works with native APIs, subjecting 3D scenes to increasingly intense ray-traced workloads at 1440p.

Our real-world gaming testing comes from in-game benchmarks within Call of Duty: Modern Warfare 3, Cyberpunk 2077, and F1 2024. These three games—all benchmarked at the system’s full HD (1080p or 1200p) resolution—represent competitive shooter, open-world, and simulation games, respectively. If the screen is capable of a higher resolution, we rerun the tests at the QHD equivalent of 1440p or 1600p. Each game runs at two sets of graphics settings per resolution for up to four runs total on each game.

We run the Call of Duty benchmark at the Minimum graphics preset—aimed at maximizing frame rates to test display refresh rates—and again at the Extreme preset. Our Cyberpunk 2077 test settings aim to push PCs fully, so we run it on the Ultra graphics preset and again at the all-out Ray Tracing Overdrive preset without DLSS or FSR. Finally, F1 represents our DLSS effectiveness (or FSR on AMD systems) test, demonstrating a GPU’s capacity for frame-boosting upscaling technologies.
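
For readers who want the protocol at a glance, the sketch below restates the run matrix described above as a simple data structure. It is a summary rather than our actual test harness, and the two F1 2024 entries (upscaling off and on) are my shorthand for that game’s role as the DLSS/FSR test.

```python
# Compact restatement of the gaming benchmark matrix described above (summary only).
runs = {
    "Call of Duty: Modern Warfare 3": ("Minimum preset", "Extreme preset"),
    "Cyberpunk 2077": ("Ultra preset", "Ray Tracing Overdrive, no DLSS/FSR"),
    "F1 2024": ("upscaling off", "upscaling on (DLSS, or FSR on AMD)"),  # shorthand
}
resolutions = ("full HD (1080p/1200p)", "QHD (1440p/1600p), if the panel allows")

for game, settings in runs.items():
    for resolution in resolutions:
        for setting in settings:
            print(f"{game} | {resolution} | {setting}")
```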

Starting with the synthetic 3DMark tests, the RTX 5090 comes off well. For those concerned about a lack of raw power gains without DLSS active, the RTX 5090 scored the highest on almost all of these tests, even beating the much larger RTX 4090-bearing Titan 18 HX. The gains are moderate, but all things considered, more performance is more performance—especially in a thin 16-inch laptop.

This trend mostly continues into real game testing. I’ll focus on percentage gains over the RTX 4090 since the actual frame rates vary widely, and that’s the most relevant GPU comparison. Looking at Cyberpunk 2077 at 1200p, the RTX 5090’s 109fps and 41fps results were a 6% and 11% increase over the RTX 4090 on Ultra and Ray Tracing Overdrive settings, respectively. Cranked up to native 1600p resolution, the RTX 5090 saw a 16% and 4% increase on these same runs.
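
Those percentages come from the standard relative-gain formula, and you can work backward from them to the implied RTX 4090 baselines. A quick sketch follows; the results are approximate because the stated percentages are rounded.

```python
# gain = (new / old - 1) * 100, so the implied old (RTX 4090) score is new / (1 + gain/100).
def implied_baseline(new_fps: float, gain_percent: float) -> float:
    return new_fps / (1 + gain_percent / 100)

# Approximate RTX 4090 baselines implied by the Cyberpunk 2077 results at 1200p above:
print(round(implied_baseline(109, 6)))   # ~103 fps on the Ultra preset
print(round(implied_baseline(41, 11)))   # ~37 fps on Ray Tracing Overdrive
```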

On Modern Warfare 3 at 1200p, the RTX 5090’s 225fps score was 4% lower than the RTX 4090’s Minimum settings run, just about within the margin of error, while 173fps was a 9% increase on the RTX 4090’s Extreme settings run. At 1600p, the improvements grouped tighter regardless of visual setting: The RTX 5090 scored about 8.5% higher than the RTX 4090 on Minimum settings and 8% higher on Extreme settings.

The F1 2024 results are not as cut-and-dried, with some CPU-bound limitations of this system capping the efficacy of DLSS on its Ultra performance setting. Other processors may be able to open this up further, as we see on the other systems, but the RTX 5090 still pushes higher traditionally rendered frame rates than the RTX 4090 at both resolutions in this game, too.

Additional Upscaling and Frame Generation Tests

Outside our usual test suite, I ran additional DLSS tests on the Razer machine focusing on Frame Generation. These are all run at 1600p at the higher visual preset from the previous batch of tests; the first result for each game here is carried over from those tests to serve as the new baseline.

Pay special attention to Cyberpunk 2077, as it’s the only game here that supports DLSS 4 and Multi Frame Generation. You can find more than 100 titles that support DLSS 4 as of now, but few with a built-in benchmark test, and Cyberpunk is already our go-to system crusher of choice. The other two games here show how the RTX 5090 performs with the previous version of DLSS (not Multi-Frame Generation), which will be the reality for many games for the time being.

As we saw before, Cyberpunk is practically unplayable at 1600p and maximum settings with no DLSS active. The upscaling technology saves the day here, more than tripling the score to 77fps. Despite some qualms about artifacts or ghosting—of which I saw a little, but less than previous DLSS generations—at the end of the day, DLSS turns the game from sub-30fps to comfortably above 60fps.

That’s only half the story, though. Turning on Frame Generation (at “2X” in Cyberpunk’s settings) nearly doubles the score again to 147fps—roughly expected given how the technology works. Finally, Multi Frame Generation set to “4X” increases the Frame Generation result by about 73%.
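
Chaining those factors together gives a sense of the ceiling here. The sketch below just multiplies the rounded figures quoted above, so the final number is an implied estimate rather than a measured result.

```python
# Multiplying out the rounded figures quoted above (estimates, not measured scores).
dlss_only = 77                 # fps with DLSS upscaling at 1600p, max settings
frame_gen_2x = 147             # fps with Frame Generation "2X" added
mfg_4x_gain = 0.73             # "about 73%" further increase at Multi Frame Generation "4X"

print(frame_gen_2x / dlss_only)                 # ~1.91x, i.e. "nearly doubles"
print(round(frame_gen_2x * (1 + mfg_4x_gain)))  # ~254 fps implied at MFG "4X"
```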


Nvidia Battery Boost and Off-Plug Battery Life

One last area I tested was battery life, using Nvidia’s new Battery Boost feature. Activated in the Nvidia App, this feature dynamically balances your CPU and GPU output while monitoring battery discharge when gaming off the charger.

While playing Avowed—another DLSS 4 game with a big open world and high visual fidelity—on High visual settings with DLSS active, the Razer Blade 16 ran for almost exactly an hour and a half off the plug. This was in sustained, active gameplay, with nary a pause. Most of us would be hard-pressed to find ourselves often in a situation where we could game for that long, uninterrupted and away from a power outlet.

I also ran the Blade 16 through our usual battery life test (a full battery rundown while playing a local video file at 50% brightness with airplane mode active until the laptop dies), and it ran for 10 hours and 12 minutes, so you can definitely expect much more when not playing a demanding game.


The Takeaway: RTX 5090 Makes DLSS 4 Dazzle

On the high end, those DLSS and Frame Generation results are fantastic for letting you keep all the visual details active with minimal downsides; extrapolate that to lower-end hardware, and unplayable games become playable on a budget. Frame Generation isn’t perfect, and some gamers may be wary of the input lag or latency in competitive multiplayer games in particular—only you can say how much it bothers you. Others will be glad a given demanding game runs at all, or at high frame rates, on their systems. DLSS 4 is especially useful to gamers who would rather run games at maximum settings and extra-high resolutions, and less so for 1080p gamers.

From experience, I think the visual clarity is worth the trade-off—DLSS 4 looks sharp and cuts down on the visual artifacts and fuzziness that I’ve seen from past editions. Playing Avowed outside of our testing suite and tweaking different visual settings, I found some detectable differences in the look and feel, but they were pretty minor, and the game looked crisp even with DLSS running. You can mitigate any detectable latency via the settings, and when I tried the game with Frame Generation off, it wasn’t hard to find a playable frame rate with standard DLSS active.

Razer Blade 16 (2025) (Credit: Joseph Maldonado)

The other option is dialing down the visual settings; I don’t need to play Cyberpunk 2077 with path tracing active, but it looks slick, and DLSS and/or Frame Generation make it possible while running smoothly. You’ll also find levels of granularity for the DLSS level of quality versus performance—it’s all a trade-off.

I’m personally not dismissive of the DLSS upscaling gains as somehow false, as they demonstrably improve the experience. At the same time, pure hardware gains these days are producing diminishing returns. I’m more sympathetic to concerns over Frame Generation, as I recognize the potential perceived-latency issue in some game genres, which you can try to combat with Nvidia Reflex, a feature that reduces system latency in games.

We’ll have many more months and years with the RTX 50 series to dig deeper into the results, upsides, and trade-offs versus my relatively short time with the RTX 5090 so far. I look forward to testing the other GPUs in the lineup and exploring more scenarios and features than I could here. Check back soon for full RTX 50-series laptop reviews and additional testing. (Indeed, we’ve got an RTX 5080 machine on the bench right now.)

About Matthew Buzzi

Lead Analyst, Hardware

I’m one of the consumer PC experts at PCMag, with a particular love for PC gaming. I’ve played games on my computer for as long as I can remember, which eventually (as it does for many) led me to building and upgrading my own desktop. Through my years here, I’ve tested and reviewed many, many dozens of laptops and desktops, and I am always happy to recommend a PC for your needs and budget.
