
[DF] Cyberpunk 2077 Ray Tracing: Overdrive Technology Preview on RTX 4090

Spyxos

Member


Path tracing on a triple-A videogame? On THE triple-A videogame? Cyberpunk 2077 already pushed visual boundaries with its psycho ray tracing mode, but the RT Overdrive upgrade takes lighting to the next level. So how does the Technology Preview compare to the standard rasterised version of the game? And how are the visuals improved over the already impressive psycho RT mode? This explainer should fill you in - and suffice it to say we've got much more content on the way as we spend more time with the game.

Ray Tracing: Overdrive looks incredible, but at 4K it's currently only usable on an RTX 4090, and even then the frame rate drops below 20 fps. It only becomes playable with DLSS 2 upscaling plus DLSS 3 frame generation.

Edit: It runs relatively well even on older RTX 30-series cards.



Optimization for older cards.
 
Under 20 fps with a 4090 without frame generation

Desus And Mero Pass GIF by Bernie Sanders
 

The Cockatrice

I'm retarded?
So if he's getting 90 fps with DLSS set to Performance, I wonder if I could get at least 60 with it set to Quality on a 4090.

You can use DLSS 3 on top of DLSS 2 to gain even more performance. I don't think the performance hit is that heavy because, for starters, there are also a ton of non-RT settings you can lower that offer zero visual benefit but cost around 20-30 fps.
 

Killer8

Member
Under 20 fps with a 4090 without frame generation

Desus And Mero Pass GIF by Bernie Sanders

DLSS 2 doesn't use frame generation and it's still giving about 59 fps according to the video. And that's at 4K. I can see lower-spec cards being able to do this at 1080p, perhaps even 1440p at 30 fps if we're lucky. Honestly, this is surprisingly good performance for fully ray-traced lighting.

I understand how impressive the tech is.
But I don't want every game to look like that.
Sometimes I want the "video game" look.

Japanese games are what you want. Some of them haven't even heard of anti-aliasing yet.
 

Umbasaborne

Banned
DLSS 2 doesn't use frame generation and it's still giving about 59 fps according to the video. And that's at 4K. I can see lower-spec cards being able to do this at 1080p, perhaps even 1440p at 30 fps if we're lucky. Honestly, this is surprisingly good performance for fully ray-traced lighting.



Japanese games are what you want. Some of them haven't even heard of anti-aliasing yet.
Game Freak is still struggling to reach 2006 PS3 levels of quality.
 

hlm666

Member
I understand how impressive the tech is.
But I don't want every game to look like that.
Sometimes I want the "video game" look.
It still works for other stylized looks, Fortnite with Lumen for example. Think of how it's used in movies to make realistic CGI but also to make non-realistic things like Pixar stuff. The Love, Death & Robots series has examples of both extremes as well.
 

Tripolygon

Banned
DLSS 2 doesn't use frame generation and it's still giving about 59 fps according to the video. And that's at 4K. I can see lower-spec cards being able to do this at 1080p, perhaps even 1440p at 30 fps if we're lucky. Honestly, this is surprisingly good performance for fully ray-traced lighting.
DLSS Performance mode at 4K is 1080p internally on a 4090. Any lower and you'll be rendering at less than 720p on a lower-end card and still not hitting 30 fps.
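As a rough sanity check of those numbers, here's a minimal sketch; the per-axis scale factors are the commonly cited DLSS 2 values, assumed here rather than taken from the video:

```python
# Minimal sketch: approximate internal render resolution for the commonly
# cited DLSS 2 modes (per-axis scale factors are assumptions, not from the video).
DLSS_SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the approximate pre-upscale resolution for a given DLSS mode."""
    scale = DLSS_SCALES[mode]
    return round(out_w * scale), round(out_h * scale)

for mode in DLSS_SCALES:
    print(mode, internal_resolution(3840, 2160, mode))
# Performance at 4K output -> (1920, 1080), i.e. the 1080p internal figure above.
```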
 

DonkeyPunchJr

World’s Biggest Weeb
It still works for other stylized looks, Fortnite with Lumen for example. Think of how it's used in movies to make realistic CGI but also to make non-realistic things like Pixar stuff. The Love, Death & Robots series has examples of both extremes as well.
Yup. IIRC Cars was the first CGI movie to use ray tracing and Monsters University was the first to use fully path-traced global illumination.
 

Mister Wolf

Member
Overdrive is asking a bit too much for my taste. I'm only willing to use frame generation if my base framerate exceeds 60 fps. The standard ray-traced lighting (Psycho) is already better than 99% of other video games' lighting tech.
 

Abriael_GN

RSI Employee of the Year


Godot creator on ray-traced GI. I mean, I kind of agree, GI is important but I don't think it's the holy grail of rendering like some claim.


"most people have no clue what it changes."

You don't need to actively perceive what has changed for things to look considerably better, and what DF has shown looks considerably better.

But hey, if the developer of a small engine almost no one uses tells you it's not relevant, it must not be relevant. It's *totally* not a fox and grapes situation. 😂
 

Killer8

Member
DLSS Performance mode at 4K is 1080p internally on a 4090. Any lower and you'll be rendering at less than 720p on a lower-end card and still not hitting 30 fps.

That's fine though. DLSS still works very well at lower target resolutions like 1080p and 1440p. 720p internal can be used as a Quality base for 1080p and as the Performance base for 1440p. If a person has a mid-range card like a 3060 Ti, they will have to accept these resolution sacrifices, but the results should still look good and be playable.
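(As a rough check, using the same commonly cited scale factors assumed above: 1080p output in Quality mode renders at roughly 1920×1080 × 2/3 ≈ 1280×720 internally, and 1440p output in Performance mode renders at 2560×1440 × 0.5 = 1280×720, so both targets do indeed share the same 720p base.)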
 

Hugare

Member


Godot creator on ray-traced GI. I mean, I kind of agree, GI is important but I don't think it's the holy grail of rendering like some claim.



Had to google "Godot"

"For one, it makes small to no difference in exteriors"

Yeah, I'm not listening to this guy

Saying it makes small to no difference to exteriors is just wrong

I mean, just watch the video. In open-world games without an RT GI solution you will always have those spaces with "glowing" stuff. Always.

Show me one open-world game (that doesn't have baked lighting) that doesn't have this problem.

RT GI is the holy grail. You don't have to individually light places, or bake lighting, to make it look correct. "It just works".
 

Bo_Hazem

Banned
Under 20 fps with a 4090 without frame generation

Desus And Mero Pass GIF by Bernie Sanders

It's a great look at the future. I think we need to reach 1 nm to be able to run RT properly, given the huge amount of computation involved, or make smarter solutions like Lumen on UE5. And of course, I'm a futurist, so next gen is gonna be the best, like always, until the next one comes.

But I think at this point ARM is a must. Having all the memory (RAM/VRAM), CPU, GPU, caches, and RT cores in one large chip would 100% mean less latency between those parts and better results overall. Excited to see what Nvidia will do with their ARM progress, same with AMD and Intel. Apple is a lost cause.
 

The Cockatrice

I'm retarded?


Godot creator on ray-traced GI. I mean, I kind of agree, GI is important but I don't think it's the holy grail of rendering like some claim.


It's not, unless the game is made from scratch with RTGI or full PT in mind. All those non-RT vs RT vs PT changes could've easily been fixed manually by adding fake shadows and fake lights to all sources. It would take a lot of effort compared to RT/PT, but it's very much possible to make your game look exactly as if it were ray traced, at least for smaller games or non-open-world games. Take Atomic Heart for example. That game's baked lighting looks insane.
 

Buggy Loop

Member
Nvidia solved path tracing at large scale.

I'm not sure people understand here.

It might be tough to run (duh) as of now, but it's SOLVED.

We're talking most likely the 5000 series from there on will have a path tracing rendering pipeline as default. It's like the first wave of tessellation in games, and now nobody thinks "oh god, bye bye fps" anymore.

"bu but mah old card can't run it!"

No fucking shit.

"bu but most peoples can't tell the difference"

And we would still be stuck in caves trying to make fire if we listened to every fucking brainlet. Probably the same people who plug component cables into their 4K OLED.

Nvidia's ReSTIR is like a decade ahead of what I thought we would see for path tracing in full-fledged games. Fucking witchcraft.
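For anyone curious what's under the hood: at its core, ReSTIR is built on weighted reservoir sampling. Each pixel streams through many candidate light samples but keeps only one, with probability proportional to its estimated contribution, and that reservoir then gets reused across neighbouring pixels and previous frames. A minimal sketch of just the reservoir update (illustrative only, not Nvidia's implementation; pick_light and contribution_estimate are made-up names):

```python
import random

class Reservoir:
    """Weighted reservoir: keeps one sample out of a stream, with probability
    proportional to its weight. This update rule is the building block that
    ReSTIR-style resampling is based on."""
    def __init__(self):
        self.sample = None   # the candidate currently held
        self.w_sum = 0.0     # running sum of all candidate weights
        self.count = 0       # number of candidates seen so far

    def update(self, sample, weight):
        self.w_sum += weight
        self.count += 1
        # Replace the held sample with probability weight / total weight so far.
        if self.w_sum > 0.0 and random.random() < weight / self.w_sum:
            self.sample = sample

def pick_light(candidate_lights, contribution_estimate):
    """Toy usage: keep 1 of N candidate lights per pixel, weighted by a cheap
    (unshadowed) contribution estimate, instead of shading all N."""
    reservoir = Reservoir()
    for light in candidate_lights:
        reservoir.update(light, contribution_estimate(light))
    return reservoir
```

The real thing then merges reservoirs spatially (from neighbouring pixels) and temporally (from the previous frame), which is where the big quality-per-ray win comes from.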
 

acm2000

Member
Next gen has finally arrived, and it only costs $3000.

Two gens from now this shit might be affordable though, and I can't wait.
 

analog_future

Resident Crybaby
Looks absolutely amazing. Something I can look forward to in 6-7 years when the next Xbox/PlayStation consoles arrive.
 

ToTTenTranz

Banned
We all watched the video. Some scenes show little difference while others are far more drastic. No need for selective screenshots.


Point is - and always was - that Cyberpunk's problems lie in the poorly detailed models (and other assets like the water), and that doesn't get solved by path tracing.
This is just an over-polished turd.

Using full path tracing is an interesting concept that had already been shown with Minecraft and Quake II.
There's just not enough performance at the consumer level (where the 4090 doesn't belong) to justify the investment instead of polishing the assets in 2023. Nor will there be as long as the RT implementations are as stiff as they are with DXR 1.1.


Cyberpunk could have been a spectacular-looking game at this point, had CDPR been focused on making it look good instead of making Nvidia advertisements for prosumer cards.

I hope they're at least getting paid by Nvidia to do this. Both them and the triggered dudes in this thread making endless screenshot dumps that further prove my point.
 