Early DLSS 4 test showcases cleaner images and multiplied framerates

Daniel Sims

Through the looking glass: Users are eagerly waiting to see whether real-world performance tests back up the big promises Nvidia made when presenting its latest AI-powered game rendering technology at CES. Although those reviews are likely weeks away, Digital Foundry has provided an early third-party trial showcasing how DLSS 4 eliminates upscaling glitches and adds new AI-generated frames.

Eurogamer has released a preliminary demonstration of Nvidia's newly announced DLSS 4 and multi-frame generation functionality. The results indicate clear improvements over what's currently available on RTX graphics cards, but the full extent of the costs involved remains unclear.

Multi-frame generation, exclusive to the upcoming RTX 50 series GPUs, generates additional frames to multiply the effect of the frame generation technology Nvidia introduced with the 40 series. By injecting two or three AI-generated frames for every traditionally rendered frame, multi-frame generation can triple or quadruple a game's perceived framerate.

However, frame generation adds input latency, and users are concerned that multi-frame generation could compound the problem. Digital Foundry's test shows that, while the added frames aren't free, they cost less than the initial application of frame generation.

In TechSpot's benchmark from 2022, introducing frame generation to Cyberpunk 2077 in DLSS performance mode added approximately 10 milliseconds of lag. However, the accompanying 60 percent increase in framerate easily outweighed the delay.


Digital Foundry ran an updated build of the game with similar DLSS settings on an engineering sample of the upcoming GeForce RTX 5080 and measured a smaller penalty from multi-frame generation. On top of the prior added latency, the second AI-generated frame introduced around four milliseconds, while the third frame added about two – the price for a framerate increase totaling 71 percent.
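To see how those figures combine, here is a rough back-of-the-envelope sketch in Python. The baseline framerate and latency are illustrative assumptions rather than Digital Foundry's measurements, and real-world gains land below the ideal 2x/3x/4x multiplier because generating frames consumes GPU time of its own.

```python
# Rough illustration of how frame generation stacks up. The baseline fps and
# latency below are assumed for the example, not measured values.

def frame_gen_estimate(base_fps, base_latency_ms, penalties_ms, generated_frames):
    """Return (displayed fps, total latency) for n generated frames per rendered frame."""
    # Ideal case: every rendered frame is followed by n AI-generated frames.
    displayed_fps = base_fps * (1 + generated_frames)
    # Each extra generated frame adds its own latency penalty.
    total_latency_ms = base_latency_ms + sum(penalties_ms[:generated_frames])
    return displayed_fps, total_latency_ms

# Per-frame latency penalties loosely based on the article: ~10 ms for the
# first generated frame, ~4 ms for the second, ~2 ms for the third.
penalties = [10, 4, 2]

for n in (1, 2, 3):
    fps, latency = frame_gen_estimate(base_fps=60, base_latency_ms=45,
                                      penalties_ms=penalties, generated_frames=n)
    print(f"{n} generated frame(s): ~{fps} fps displayed, ~{latency} ms input latency")
```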

Although Eurogamer didn't discuss how good those "fake" frames looked, multi-frame generation appears to offer a better deal than currently available frame generation tech. Meanwhile, DLSS 4's other changes promise substantially better image quality.

Transitioning from a convolutional neural network to a vision transformer likely provides the biggest improvement to DLSS super resolution since Nvidia introduced DLSS 2 in 2020. Flaws such as smearing, ghosting, and shimmering are now far less apparent. Ray reconstruction is also notably improved.


The vision transformer is likely DLSS 4's most significant feature because it supports all RTX graphics cards. Furthermore, the Nvidia app will let users update the DLSS version in older games without waiting for patches from developers.

What isn't clear is the performance cost. Future benchmarks should extensively test which RTX 20 and 30 series GPUs can handle DLSS 4. Furthermore, the VRAM impact of multi-frame generation remains undisclosed. The RTX 5070's ability to implement the new feature with just 12 GB of memory will determine whether it lives up to Nvidia CEO Jensen Huang's claim that it can match the flagship 4090.


 
Well, since this dead horse isn't going to beat itself, I think it's necessary to say that nVidia features don't work on Linux natively and require a lot of jank to get working. FSR does, though. While jank is something anyone planning on moving to Linux needs to accept, it's nice to be able to cut out as much of it as possible.

So for anyone thinking of trying to go all in on Linux when SteamOS gets released, maybe hold off on your GPU purchase.
 
In TechSpot's benchmark from 2022, introducing frame generation to Cyberpunk 2077 in DLSS performance mode added approximately 10 milliseconds of lag.

It is +10ms with DLSS performance & fake frames or +22% lag.
It is +16ms with DLSS quality & fake frames or +33% lag.

33% more lag is already bad. The fact that more fake frames "only" increase the lag to +41% and +46% is not that exciting.

However, the accompanying 60 percent increase in framerate easily outweighed the delay.

No, it does not. The whole reason you want 112 frames instead of 72 (in the article example) is so that the gameplay feels smoother, not just looks smoother. My preference for 90 FPS (good) or 120 FPS (better) is about the added responsiveness. It's not like 72 FPS looks choppy (like 20 FPS would).
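As a quick sanity check on those percentages, the implied baseline works out to roughly 45-48 ms of end-to-end latency; the added milliseconds come from the figures above, while the baselines are back-solved rather than quoted from TechSpot's benchmark.

```python
# Back-solve the implied baseline latency from the percentages quoted above.
# Added milliseconds are from the comment; the baselines are inferred.
modes = {
    "DLSS performance + frame gen": (10, 0.22),  # (+ms, fractional lag increase)
    "DLSS quality + frame gen": (16, 0.33),
}

for mode, (added_ms, increase) in modes.items():
    baseline_ms = added_ms / increase
    print(f"{mode}: +{added_ms} ms on a ~{baseline_ms:.0f} ms baseline "
          f"(~{baseline_ms + added_ms:.0f} ms total)")
```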
 
DLSS 4 is pure magic. It’s shaping up to be the next big thing for gaming visuals and GPUs. Smoother images, better ray tracing, INSANE framerate boosts, and somehow, it costs less in latency than before. Pure magic.

The new vision transformer tech also sounds like a total game-changer, fixing ghosting and shimmering. Plus, as a bonus, it works on all RTX cards, not just the latest ones.

That said, real reviews aren’t out yet, so all of this is based on Nvidia’s own demos and materials. If it holds up, this could be the closest we’ve ever gotten to next-gen visuals on today’s hardware. No need to wait years for brute-force GPUs. Nvidia just leapfrogged the game here. Honestly, I think raster performance might stop being the main focus now (like the 25-30% bump in FC6 from their slides). Can’t wait to see the reviews.

P.S.: In b4 "paid by Nvidia." These are just my opinions. If you don't like it, nobody is forcing you to like it or even buy these RTX Blackwell GPUs.
 
DLSS 4 is pure magic. It’s shaping up to be the next big thing for gaming visuals and GPUs. Smoother images, better ray tracing, INSANE framerate boosts, and somehow, it costs less in latency than before. Pure magic.
It doesn't cost less in latency. It adds even more latency. It just adds less than 2x or 3x the latency, which is an arbitrary comparison anyway, since no programmer would expect doing almost the same thing again to scale so poorly.
 
It doesn't cost less in latency. It adds even more latency. It just adds less than 2x or 3x the latency, which is an arbitrary comparison anyway, since no programmer would expect doing almost the same thing again to scale so poorly.

Yes, I meant it's mighty impressive how Nvidia managed to keep the added latency relatively low compared to what could’ve been a much bigger hit. So yeah, not “less” overall, but less than you’d think given the massive performance boost.
 
Yes, I meant it's mighty impressive how Nvidia managed to keep the added latency relatively low compared to what could’ve been a much bigger hit. So yeah, not “less” overall, but less than you’d think given the massive performance boost.
If they didn't use Reflex on the comparison hardware in the promotional slides but turned it on for the Blackwell hardware just to make it seem better, then what are we really excited about? My BS threshold might be more sensitive than yours.
I typically don't give Nvidia the benefit of the doubt at this point.
 
If they didn't use Reflex on the comparison hardware in the promotional slides but turned it on for the Blackwell hardware just to make it seem better, then what are we really excited about? My BS threshold might be more sensitive than yours.
I typically don't give Nvidia the benefit of the doubt at this point.
The Reflex thing could eventually be made perfectly capable of playing the games for you. On any difficulty, with minimal or no input from the user. And it would look gorgeous and lifelike.
Yawn. I'd rather read a book.
 
"Cyberpunk 2077 in DLSS performance mode added approximately 10 milliseconds of lag. However, the accompanying 60 percent increase in framerate easily outweighed the delay"
Depends on the game and who is playing it.
Competitive gamers are not excited about more latency.

ESports games don't need to use DLSS. They're all relatively light games to run, giving FPS in the hundreds without the need for upscaling and frame gen.
 
I’m sorry to hear that. Nvidia makes enough money, they could afford slipping a few bucks here and there.
Aww, that's so sweet.
Don’t worry, you can still enjoy AMD’s stunning attempts to catch up while the rest of us marvel at what Nvidia’s cooking for the future.
Embrace the advancements, it’s 2025, not 2015.
I heard the "incredible" RX 9070 XT will try to beat... the 4080.
 
Yes, I meant it's mighty impressive how Nvidia managed to keep the added latency relatively low compared to what could’ve been a much bigger hit. So yeah, not “less” overall, but less than you’d think given the massive performance boost.
It's not a performance boost. It's the equivalent of changing your speedometer from miles to kilometers and thinking you are going faster because the number is bigger.
 
Aww, that's so sweet.
Don’t worry, you can still enjoy AMD’s stunning attempts to catch up while the rest of us marvel at what Nvidia’s cooking for the future.
Embrace the advancements, it’s 2025, not 2015.
I heard the "incredible" RX 9070 XT will try to beat... the 4080.
It's funny because NVIDIA basically abandoned 4090 owners with DLSS 4's multi-frame generation. It's 50 series only. So when I see people talking about how great NV software features are, I instantly think about people who enjoy being ripped off. And all of those games that use DLSS 2 or 3 and are no longer in active development will still be using DLSS 2 or 3 on your 50 series cards.

What makes it funnier is that 4090 owners are going to be using FSR4 instead of DLSS4, since AMD is making it hardware agnostic. Hacked drivers have shown that DLSS is also hardware agnostic; it's just soft-locked by nVidia.
 
Well, since this dead horse isn't going to beat itself, I think it's necessary to say that nVidia features don't work on Linux natively and require a lot of jank to get working. FSR does, though. While jank is something anyone planning on moving to Linux needs to accept, it's nice to be able to cut out as much of it as possible.

So for anyone thinking of trying to go all in on Linux when SteamOS gets released, maybe hold off on your GPU purchase.
If you use Bottles, it's a single on/off toggle to use DLSS.
 
It is +10ms with DLSS performance & fake frames or +22% lag.
It is +16ms with DLSS quality & fake frames or +33% lag.

33% more lag is already bad. The fact that more fake frames "only" increase the lag to +41% and +46% is not that exciting.



No, it does not. The whole reason you want 112 frames instead of 72 (in the article example) is so that the gameplay feels smoother, not just looks smoother. My preference for 90 FPS (good) or 120 FPS (better) is about the added responsiveness. It's not like 72 FPS looks choppy (like 20 FPS would).

You are hitting the nail on the head here. TBH, with steady frame pacing, 30 fps doesn't even look that bad. People watch movies at 24 fps, or perhaps 48 fps in some cases, and the motion is fluid enough. Sure, if you are panning the screen quickly or playing a game in a first-person perspective, where the whole scene is often moving quickly, a higher frame rate is a must, but 60 fps is good enough for most gamers even in those scenarios. What competitive gamers want is lower overall latency. They buy 1 ms monitors and wired mice and keyboards with <1 ms delays and 1000 Hz polling rates. They want 240+ fps to reduce latency as much as possible and give themselves an advantage.

I'm more of a single-player/action-RPG gamer. I'm very good with a solid 60 fps in the majority of games I play. I don't need 90-120 fps, but I'll take it when I can get it if I'm not sacrificing too much quality. However, what I would never do is turn on FG to get from 60 fps to 100 fps, because it feels laggy. You are used to a certain level of response from a game, and FG makes it feel like you are using a first-generation Bluetooth controller, where you notice a difference between your button press and the action on screen.

Some people really seem to like FG, but I don't really understand it. I guess it's not so bad in some action games where a bit of sloppiness in your button presses makes little difference, and there are games like that. I have tried FG in Starfield, Wukong, Alan Wake 2, Jedi Survivor, and other titles; I almost always try the "recommended" settings first. Every time, I turn it off after a few minutes. I can immediately feel it when it's on. Not to mention it still has major artifacts and other issues.
 
Aww, that's so sweet.
Don’t worry, you can still enjoy AMD’s stunning attempts to catch up while the rest of us marvel at what Nvidia’s cooking for the future.
Embrace the advancements, it’s 2025, not 2015.
I heard the "incredible" RX 9070 XT will try to beat... the 4080.
What I'm actually enjoying is, indeed, my old 1080, the last decent card nvidia ever made, which is still serving me well in my gaming. And the occasional book. And free public performances by desperate cheerleaders; they are more entertaining than anything else. Apologies for disturbing you! Please, continue!
 
If you use Bottles, it's a single on/off toggle to use DLSS.
While that's true, the jank I was talking about is in how you get Bottles working in the first place. It's a process, and there are no promises it will work. But yes, if you get Bottles working, then it is as simple as on/off.

You CAN get this stuff to work, but none of it is officially supported and it's on you to make it work. It also requires some not-so-basic Linux skills to set up and maintain.

You can get anything to work on Linux; the only barrier is how much knowledge you have and how much time you're willing to sacrifice to get it working. If you want a plug-and-play experience, go AMD. But I strongly caution people against buying a 5090 and expecting to get DLSS 4 working just because Bottles is a thing. If your main concern is moving from Windows to Linux, an AMD setup is almost mandatory for as many native and open-source drivers as possible.
 
Didn’t realize most people want a GPU that’s 5 slots wide, chugs 1000W, and doubles as a central heating system for the entire house. And here I was, thinking Nvidia, the leader in AI, should actually keep pushing their AI-driven tech like DLSS forward. Nope, my bad. Totally missed the memo. Perhaps AMD has a better plan.
 
I heard the "incredible" RX 9070 XT will try to beat... the 4080.
It doesn't need to beat it, just get within 10-15% of it, at the right price (which is something AMD has a proven record of botching). If (and that's a big if) AMD pulls it off, it will fly off the shelves.
And that would be good. We need to root for more competition, not less, especially in the midrange segment, where most people spend their money.
 
Didn’t realize most people want a GPU that’s 5 slots wide, chugs 1000W, and doubles as a central heating system for the entire house. And here I was, thinking Nvidia, the leader in AI, should actually keep pushing their AI-driven tech like DLSS forward. Nope, my bad. Totally missed the memo. Perhaps AMD has a better plan.
I bet that if Nvidia made moderately hungry and much cooler cards, AMD would not have a problem beating them. That's the thing: they are offering the very best they are able to make within physical limits.
And that part they leave to buyers. Don't like the heat? Undervolt.
 
"Cyberpunk 2077 in DLSS performance mode added approximately 10 milliseconds of lag. However, the accompanying 60 percent increase in framerate easily outweighed the delay"
Depends on the game and who is playing it.
Competitive gamers are not excited about more latency.
Very few competitive gamers play in any kind of high-fidelity mode, and thus won't need any form of frame generation. Usual settings for professional gamers: every setting except render distance is set to minimum, to achieve the maximum number of frames and the minimum amount of clutter in their POV.
Not saying I love a 14 ms delay on my inputs, as I think you're close to, or past, the point where you can notice it in any type of game with rapid movement. Current mice and keyboards are down to around 1 ms of input delay, which means you would add a whopping 1500% extra input delay with three fake frames, before even accounting for the input delay of the game software itself. I hope you still get to choose between one and three generated frames in the game settings or the Nvidia control panel.
 