
AMD RDNA 4 GPUs To Incorporate Brand New Ray Tracing Engine, Vastly Different Than RDNA 3

If you don't care about RTX (and you shouldn't, to be honest; it's a con that Nvidia has hustled), AMD cards seem to deliver on-par power in non-ray-traced games for a cheaper price.

Unfortunately for me, I have a G-Sync monitor. When I do my new build in a couple of years, I'll be going to a 4K monitor, up from 1440p. FreeSync seems to be the better option.
 

Bojji

Member
If you don't care about RTX (and you shouldn't, to be honest; it's a con that Nvidia has hustled), AMD cards seem to deliver on-par power in non-ray-traced games for a cheaper price.

Unfortunately for me, I have a G-Sync monitor. When I do my new build in a couple of years, I'll be going to a 4K monitor, up from 1440p. FreeSync seems to be the better option.

Fixes problems with raster graphics = con, lol. It also does things (global illumination) that raster can't do.

Not to mention DLSS is much better than FSR 3.1 (we will see how 4 will compare).

AMD is behind in every metric, in RT they are even behind Intel (so far).

The only thing AMD does better is (usually) VRAM amount at the same performance tier as Nvidia.

I HOPE AMD will deliver and fix things, but they usually fuck up on the GPU front.
 

Crayon

Member
I get the argument that only heavy RT games are the ones that matter, and I do think those should be looked at separately. But to me, if I'm going to say only one kind of RT matters, it's required RT. Only a few games require it, and those have been lightweight implementations.
 

Gaiff

SBI’s Resident Gaslighter
If you don't care about RTX (and you shouldn't, to be honest; it's a con that Nvidia has hustled), AMD cards seem to deliver on-par power in non-ray-traced games for a cheaper price.

Unfortunately for me, I have a G-Sync monitor. When I do my new build in a couple of years, I'll be going to a 4K monitor, up from 1440p. FreeSync seems to be the better option.
Then there’s DLSS and Reflex.
 

Crayon

Member
The only one for now is Indiana Jones, and there's a big question mark over FF7 part 2.

Does Avatar? Not sure about that one. I know there's the edition of Metro Exodus that requires RT. So hardly anything, but games will start to require it eventually, and the minimum requirement will be light.
 

SpokkX

Member
They are not generations ahead. Look at the 7900 XTX: in raster it beats the 4080. Yes, DLSS is better and RT too. But outside the xx80 or xx90 tiers, RT is just a waste. In some games even the xx80 performs badly. The 4090 represents 1% of the GPU market.

Who gives a crap about raster? They are ahead BECAUSE of DLSS, frame gen, and RT.

That is what I mean when I say they are generations ahead. Duh.
 

Wolzard

Member
Yes, most games make limited use of RT, but you don't need PT to fix the biggest problems with raster lighting: mainly ugly and distracting screen-space reflections that fade as you move the camera up and down, cascaded shadows that draw in literally right in front of the character, or the lack of indirect shadows in sandbox games (without indirect shadows, lighting will always look flat). I do not need to use image reconstruction in standard RT games on my PC, at least at 1440p, but I see no reason why I shouldn't when DLSS looks so good.

[Comparison screenshots: Raster (raster.jpg), RT low (RT-reflections-shadows.jpg), Path Tracing (PT.jpg)]

And by the way, DLSS isn't exactly upscaling. Upscaling simply resizes the image to a higher resolution but cannot add new detail or restore missing detail, whereas DLSS does exactly that based on temporal data. An upscaled image can never match native resolution, while image reconstruction such as DLSS can match and even surpass it, because temporal data (data from previous frames) provides more detail than a single frame rendered at native resolution.

DLSS (especially 3.8.2) can merge previous frames very well, and having used DLSS in countless games, I have to say that native resolution is currently the biggest waste of GPU resources. I can always get better image quality with the DLSS + DLDSR combo and still get a few more fps compared to native TAA. Even the new PSSR in the PS5 Pro, if implemented well, can provide better image quality than native TAA (e.g. Stellar Blade).
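
To make that distinction concrete, here is a toy NumPy sketch (not DLSS itself and not Nvidia code; the test pattern, the static camera, and the blend factor are simplifying assumptions for illustration) of why accumulating jittered low-resolution frames can recover detail that a single upscaled frame cannot:

# Toy sketch, not DLSS: each "frame" is rendered at low resolution with a
# different sub-pixel jitter; scattering those jittered samples into a
# full-resolution history and blending converges toward the native image.
# A static camera is assumed, so no motion-vector reprojection is needed.
import numpy as np

H, W, SCALE = 64, 64, 4                     # output size and upscale factor
lo_h, lo_w = H // SCALE, W // SCALE

def scene(yy, xx):
    # High-frequency test pattern standing in for the "native" image.
    return 0.5 + 0.5 * np.sin(0.7 * xx) * np.cos(0.9 * yy)

def render_lowres(jy, jx):
    # Pretend renderer: samples the scene on a coarse grid with sub-pixel jitter.
    yy, xx = np.meshgrid(np.arange(lo_h) * SCALE + jy,
                         np.arange(lo_w) * SCALE + jx, indexing="ij")
    return scene(yy, xx)

def naive_upscale(img):
    # Plain nearest-neighbour resize: no new information is created.
    return np.kron(img, np.ones((SCALE, SCALE)))

yy, xx = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
native = scene(yy, xx)

history = naive_upscale(render_lowres(0.0, 0.0))    # seed with one plain upscale
alpha = 0.2                                         # weight given to each new frame
rng = np.random.default_rng(0)
for _ in range(256):
    jy, jx = rng.uniform(0.0, SCALE, size=2)
    lowres = render_lowres(jy, jx)
    # Scatter each jittered sample into the nearest full-res pixel and blend it
    # with the history. A real reconstructor would also reproject the history
    # with motion vectors and reject stale samples (heuristics, or a network).
    ys = np.clip(np.round(np.arange(lo_h) * SCALE + jy).astype(int), 0, H - 1)
    xs = np.clip(np.round(np.arange(lo_w) * SCALE + jx).astype(int), 0, W - 1)
    idx = np.ix_(ys, xs)
    history[idx] = (1.0 - alpha) * history[idx] + alpha * lowres

print("single upscaled frame, mean error:", np.abs(naive_upscale(render_lowres(0.0, 0.0)) - native).mean())
print("temporal accumulation, mean error:", np.abs(history - native).mean())

The accumulated history should land much closer to the native render than any single upscaled frame, which is the whole point: temporal data really does carry detail that one low-resolution frame does not.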

It solves some things and introduces others, such as artifacts due to the low resolution used for reflections.
In some cases, even though RT produces a realistic result, it is not always the most beautiful one.

[Screenshots: sddefault.jpg, egviy4o79wf71.png]


In some cases the reflection doesn't make any sense, like this super polished wood.

[Screenshot: hq720.jpg]


This will be fixed over time; both Nvidia and AMD are working on solutions.

 

Crayon

Member
Outlaws' minimum-requirement cards don't have ray tracing hardware, so it seems like it's not technically required. Searching shows that it can only be turned off via the config file. What's going on with this one?
 

Wolzard

Member
Outlaws' minimum-requirement cards don't have ray tracing hardware, so it seems like it's not technically required. Searching shows that it can only be turned off via the config file. What's going on with this one?

Ray tracing can be performed on anything that computes; dedicated hardware exists to speed up the processing of this effect.



In the Crysis remaster released last generation, there were RT effects.
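
As a minimal illustration of "ray tracing on anything that computes", here is a plain-Python sketch (no RT hardware and no graphics API involved; the scene values and helper names are made up for this example) of the ray/geometry intersection work that dedicated RT units exist to accelerate:

# Minimal CPU ray tracing: a shadow ray tested against a sphere. Dedicated RT
# hardware accelerates exactly this kind of intersection test (plus BVH
# traversal); on ordinary compute it still works, just much more slowly.
import math

def ray_sphere_hit(origin, direction, center, radius):
    # Distance along the (normalized) ray to the nearest sphere hit, or None.
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2.0 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

def in_shadow(point, light_pos, spheres):
    # Shadow ray: is any sphere between the shaded point and the light?
    to_light = [light_pos[i] - point[i] for i in range(3)]
    dist = math.sqrt(sum(v * v for v in to_light))
    direction = [v / dist for v in to_light]
    for center, radius in spheres:
        t = ray_sphere_hit(point, direction, center, radius)
        if t is not None and t < dist:
            return True
    return False

# Hypothetical scene: one occluding sphere between a floor point and the light.
spheres = [((0.0, 1.0, 0.0), 0.5)]
light = (0.0, 5.0, 0.0)
print(in_shadow((0.0, 0.0, 0.0), light, spheres))   # True: the sphere blocks the light
print(in_shadow((3.0, 0.0, 0.0), light, spheres))   # False: clear line of sight

BVH acceleration structures and RT cores make tests like this fast enough to run millions of times per frame; without them the same math still runs on shaders or the CPU, which is why software RT like the Crysis remaster's is possible.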

 
It solves some things and introduces others, such as artifacts due to the low resolution used for reflections.
In some cases, even though RT produces a realistic result, it is not always the most beautiful one.

[Screenshots: sddefault.jpg, egviy4o79wf71.png]

In some cases the reflection doesn't make any sense, like this super polished wood.

[Screenshot: hq720.jpg]

This will be fixed over time; both Nvidia and AMD are working on solutions.

I can tell you are not speaking from your own experience, because your comments about RT are either not true or you are exaggerating the issues. I've played quite a few RT games on PC, and usually SSRs have way lower resolution and way more distracting artifacts (objects fading as you move, disocclusion artifacts, shimmering). Even the low-quality RT reflections in the Resident Evil 3 remake looked much better to me than the SSR (while having close to zero performance cost).
 
Last edited:

Crayon

Member
Ray tracing can be performed on anything that computes; dedicated hardware exists to speed up the processing of this effect.



In the Crysis remaster released last generation, there were RT effects.



Okay, so the game/Snowdrop has a software fallback. IMO that makes it ambiguous whether you'd call that RT required or not. Depends how bad that minimum spec looks/runs.
 

Wolzard

Member
I can tell you are not speaking from your own experience, because your comments about RT are either not true or you are exaggerating the issues. I've played quite a few RT games on PC, and usually SSRs have way lower resolution and way more distracting artifacts (objects fading as you move, disocclusion artifacts, shimmering). Even the low-quality RT reflections in the Resident Evil 3 remake looked much better to me than the SSR (while having close to zero performance cost).

Just because you don't see them doesn't mean they don't exist. Perception varies from person to person. I played several games with RT and I see these problems, in the same way that I see differences in resolution, framerate, anti-aliasing, graphical effects, etc. This must be the case for Digital Foundry journalists, for example. I have good eyesight and I always notice these things. Depending on the game, I simply ignore them to enjoy the experience, but I still can't help noticing these defects. Furthermore, I study a bit of graphics programming as a hobby, so I'm used to noticing this sort of thing.

Okay, so the game/Snowdrop has a software fallback. IMO that makes it ambiguous whether you'd call that RT required or not. Depends how bad that minimum spec looks/runs.

RT is listed in the specs, but they say nothing about a hardware requirement for it. The first game to require hardware-accelerated RT was Indiana Jones.

[PC spec sheets: SWO_PC_Specs_Flashcard_EN_FINAL.jpg, IJ-PC-SystemSpecs-4K-6column-EN.jpg]
 
Just because you don't see them doesn't mean they don't exist. Perception varies from person to person. I played several games with RT and I see these problems, in the same way that I see differences in resolution, framerate, anti-aliasing, graphical effects, etc. This must be the case for Digital Foundry journalists, for example. I have good eyesight and I always notice these things. Depending on the game, I simply ignore them to enjoy the experience, but I still can't help noticing these defects. Furthermore, I study a bit of graphics programming as a hobby, so I'm used to noticing this sort of thing.
I'm not saying RT is perfect in its current state, especially on consoles. RT noise is much more noticeable on consoles because developers use lower ray counts and lower internal resolution. Add to that FSR2 / PSSR artifacts and you have a very noisy image in games that use RT.

On PC, however, the RT noise problem is so small that I don't even notice it most of the time. I noticed noise problems in a few PT games, especially the PT mods for Quake 2 and Half-Life 2, or in UE5 games that use software RT (Lumen), but in most games that use standard hardware RT I really need to look for issues to see some imperfections, and even if I see them they are too small to bother me (in The Witcher 3 I haven't noticed any noise issues, either with RT GI or reflections).

In Cyberpunk with RT, for example, reflections look absolutely stunning most of the time (especially the pixel-perfect reflections on car windows), but diffuse reflections on wet surfaces can show some minor problems such as "boiling" (reflections on wet surfaces aren't perfectly stable). That boiling is there and I can see it, but it isn't big enough to bother me. If I turn off RT, however, reflections are either missing or have even lower resolution than with RT. What's more, the SSR artifacts are too big to ignore. It's better to have an unstable reflection on a wet surface than SSRs that fade out during movement or show disocclusion artifacts when characters/objects move. The RT reflections in Cyberpunk look stunning to me, whereas I want to vomit every time I turn RT off. Based on that reaction I can say that RT is absolutely worth it.

I respect Digital Foundry. These guys can notice RT issues more easily compared to normal gamers, but at the end of the day they still love to play games with RT on PC, because even imperfect RT is many times better than raster.
 
Last edited:

Bojji

Member
If it comes close to a 4080 whilst also being 40% cheaper, then it would be a huge hit… but maybe I’m being too optimistic.

Close to a 4080 in "select gaming titles", so probably games where AMD already performs better relative to Nvidia (like CoD).

Before every AMD GPU launch we always get some very positive rumors; 90% of them end up bullshit. Expect the rumored 4070S performance, and if it's better, great.
 
Last edited:

Bojji

Member
it's China, so they love testing more pro-NV games

They do? I don't know anything about them; maybe they are right, maybe not.

The previous leak was from Time Spy. Of course it's a synthetic bench, but it's quite platform-agnostic.
 
Last edited:

winjer

Gold Member
Top of the line for each manufacturer should be used to compare IMO

The point of comparison is the price. The 4080 and the 7900 XTX have similar prices.
The 7900 XTX does have a slight edge in rasterization, but then Nvidia has a big advantage in ray tracing, AI, and DLSS.
With all these deficits, the 7900 XTX should be significantly cheaper.

Edit: the 4080 also uses 50W less. It's not much in this GPU range, but it's something to consider.
 
Last edited:

AGRacing

Member
They’ve named the thing 9070 XT. It’s clear what they’re going after.

I hope they can compete properly with a 5070 Ti. Or at least come close and price it PROPERLY.

It seems like they’ll kick off with some kind of AI upscaling competition (and hopefully that means some relevant game patches RIGHT AWAY).

They just have to get on the ball and put in an Intel-like effort more consistently. Somebody over there doesn’t understand that when you’re moving a boulder, the first inches require vastly more effort.
 

Gaiff

SBI’s Resident Gaslighter
Seems like an incredibly stupid decision to release a weaker card, unless they're planning to sell this at $400.
We’ve known this for months. The prevailing rumor was that their next line-up wouldn’t feature any card faster than the 7900 XTX. Fast forward a few months and AMD confirmed they were bowing out of the top-end for RDNA4.

This card aims to compete with NVIDIA’s mid-range, so a competitor to something like the 5070.
 

SABRE220

Member
We’ve known this for months. The prevailing rumor was that their next line-up wouldn’t feature any card faster than the 7900 XTX. Fast forward a few months and AMD confirmed they were bowing out of the top-end for RDNA4.

This card aims to compete with NVIDIA’s mid-range, so a competitor to something like the 5070.
If it isn't cheap, then they shouldn't even bother releasing a GPU this time around. Also, it would be interesting if they have AI/ML-based FSR.
 