
AMD RDNA 4 GPUs To Incorporate Brand New Ray Tracing Engine, Vastly Different Than RDNA 3

Bojji

Member
I mean, what day/time?

hM83mtf.jpeg
 
The RX 9070 will be an intermediate card, like the 4070/5070, which has/will have 12 GB.
As for VRAM usage, at 4K you need at least 16GB of VRAM. In the PS6 generation it will probably be much worse, as the console is expected to have 32 GB of RAM.

vram.png
avatar-frontiers-of-pandora-vram.png


Hasn't the PS5 Pro taught you guys to temper your expectations :p? If the PS6 does not run PT well (the most likely scenario), then 32GB of RAM would be a total waste of resources. Even the PS5 can stream absurd amounts of texture data on the fly, and I doubt developers will ask Sony to double the system memory in the PS6.

My GPU has 16GB of VRAM, but my PC also has 32GB of system RAM. That's 48GB in total. Even if the GPU runs out of VRAM, the PC can also use system RAM. In the test below, the RX 5500 XT's 4GB was VRAM-limited, but with a PCIe 4.0 slot, performance was comparable to the 8GB version. My motherboard has PCIe 5.0, but my current GPU only supports PCIe 4.0.


eAiJwyN.jpeg
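
A back-of-the-envelope sketch of why the PCIe generation mattered in that test: once the 4GB card spills into system RAM, every spilled access rides the PCIe link, so going from PCIe 3.0 to 4.0 roughly doubles the spillover bandwidth. The x8 electrical link width for the RX 5500 XT and the encoding factor are assumptions for illustration, noted in the comments:

```python
# Back-of-the-envelope PCIe link bandwidth for the RX 5500 XT test above.
# Assumptions: the 5500 XT uses an x8 electrical link, and PCIe gens 3-5
# all use 128b/130b line coding; treat the results as approximations.
PCIE_GTS = {3: 8.0, 4: 16.0, 5: 32.0}  # per-lane raw rate in GT/s
ENCODING = 128 / 130                    # usable fraction after line coding

def link_gb_s(gen: int, lanes: int) -> float:
    """Approximate one-way link bandwidth in GB/s."""
    return PCIE_GTS[gen] * ENCODING * lanes / 8  # GT/s -> GB/s per lane

print(f"PCIe 3.0 x8: {link_gb_s(3, 8):.1f} GB/s")   # ~7.9 GB/s
print(f"PCIe 4.0 x8: {link_gb_s(4, 8):.1f} GB/s")   # ~15.8 GB/s
# The card's own 128-bit 14 Gbps GDDR6 is ~224 GB/s, so spillover into
# system RAM is an order of magnitude slower either way.
```
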



Games like Indiana Jones with PT, SW Outlaws with RTXDI (PT), and Alan Wake 2 with PT are extremely demanding at maxed-out settings, especially at 4K native with FG on top of that. My card would not run these settings smoothly even with 32GB of VRAM. When I adjust settings to get smooth fps, however, my RTX 4080S is not limited even in these PT games, so I can't say I'm losing much. Also keep in mind that VRAM allocation does not mean the GPU actually uses that much.
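
On that allocation-vs-usage point: monitoring tools usually expose two numbers, a device-wide "allocated" total and a per-process figure, and the per-process one is closer to what a game actually touches. A minimal sketch of querying both via NVIDIA's NVML bindings (the nvidia-ml-py package is assumed, and per-process figures can be unavailable on some OS/driver combinations):

```python
# Minimal sketch: device-wide vs per-process VRAM via NVML (pip install nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

# Device-wide figure: everything reserved on the card (the "allocation" side)
mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
print(f"device-wide: {mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB")

# Per-process figures: closer to what a single game actually holds
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(gpu):
    if proc.usedGpuMemory is not None:  # None when the driver won't report it
        print(f"pid {proc.pid}: {proc.usedGpuMemory / 2**30:.1f} GiB")

pynvml.nvmlShutdown()
```
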

4K DLAA (native) + PT, 14GB VRAM allocation, and 13GB real VRAM usage. 20fps isn't good enough :p, even though this is a very demanding location.

4-K-DLAA-PT.jpg


With frame generation VRAM usage increased to 14GB, but these settings still aren't really usable.

4-K-DLAA-FG-PT.jpg


With DLSS Q + FG the game finally becomes playable (at least on a gamepad); real VRAM usage is 12.5GB.

4-K-DLSSQ-FG-PT.jpg


4K DLSS Performance + FG, PT, real VRAM usage 12GB. The game is perfectly playable at these settings now.

4-K-DLSS-P-FG-PT.jpg


4K DLSS Quality, raster (for people who hate RT, even though the game still uses software RT even without PT :p), just 10GB real VRAM usage.

4-K-DLSS-Q-raster.jpg


With FG VRAM usage increased to 11GB.

4-K-DLSSQ-FG-raster.jpg


At 1440p DLSSQ + FG with path tracing real VRAM usage is around 10GB.

1440p-DLSSQ-FG-PT.jpg


At 1440p DLSS Q + FG with raster, VRAM usage is just 9.5GB.

1440p-DLSSQ-FG-raster.jpg



Alan Wake 2 is one of the most VRAM-hungry games, because it supports PT and FG, yet I can't say that VRAM was a problem for me, because with playable settings the game uses just 9-12GB of VRAM. The RX 7900 XTX has more VRAM, but you can't play PT games on that card, so the extra VRAM would be of no use to the AMD folks anyway.
 

Bojji

Member
N48 has two 16GB cards and one 12GB card


It could be that the 9070 is GRE + 5% and the 9070 XT ~4080

Based on that Time Spy score we have this:

RX-9070-XT-vs-other-GPUs-in-Time-Spy.png


The 7900 XT will be faster, and the 4080/7900 XTX will be MUCH faster.

Rumors are all over the place; the 4080 is over 30% faster than the GRE.
 

Bojji

Member
We don't only have Time Spy; we also have a rumor of 4080 -5% for the reference model.
As you can see, we don't have any real leaks about RDNA4 or the RTX 5xxx series. Both companies are hiding them well.

We will see what the truth is quite soon.

But still, all we will get (at most) are some slides. Last time, with the 7900 XTX, AMD was making shit up.
 

Wolzard

Member
Most likely they're waiting to see what Nvidia announces, and its pricing, to make sure they don't shoot themselves in the foot 7 hours before. Why give Nvidia the heads-up?

They talked about the 9950X3D and didn't mention the price. It could have been the same with GPUs.
 
So this one tells it all. Probably somewhere between the 7900 GRE and 7900 XT. One tier behind the XTX and 4080.
I wasn't expecting RDNA4 cards to match the high-end RTX 50 series cards, but two years have passed since Nvidia launched the RTX 4080, so I thought something like the RX 9070 XT would finally be able to match the RTX 4080 in both raster and RT.

If the 9070 XT is only going to be as fast as the 7900 GRE, just more advanced (faster RT and ML FSR), people who want to buy an AMD card may face a dilemma: go with the more advanced but slower 9070 XT, or the faster raster card, the 7900 XTX.

Let's hope AMD has some faster RDNA4 GPUs planned for 2025, because the 9070 XT looks like something that will only appeal to mid-range gamers (and not even upper mid-range).
 
I like the design, these cards look beautiful, but looks can be deceiving 😁. I've met women who looked beautiful, but I didn't want to spend my time with them, because they were lacking in other areas.
 

Wolzard

Member
With RDNA 4, AMD claims generational SIMD performance increase on the RDNA 4 compute units. The 2nd Gen AI accelerators will boast of generational performance increase, and AMD will debut a locally-accelerated generative AI application down the line, called the AMD Adrenalin AI, which can generate images, summarize documents, and perform some linguistic/grammar tasks (rewriting), and serve as a chatbot for answering AMD-related queries. This is basically AMD's answer to NVIDIA Chat RTX. AMD's 3rd Gen Ray accelerator is expected to reduce the performance cost of ray tracing, by putting more of the ray tracing workload through dedicated hardware, offloading the SIMD engine. Lastly, AMD is expected to significantly upgrade the media acceleration and display I/O of its GPUs.

We also got our first peek at what the "Navi 48" GPU powering the Radeon RX 9070 series looks like—it features an unusual rectangular die with a 2:1 aspect ratio, which seems to lend plausibility to the popular theory that the "Navi 48" is two "Navi 44" dies joined at the hip with full cache-coherency. The GPU is rumored to feature a 256-bit GDDR6 memory interface, and 64 compute units (4,096 stream processors). The "Navi 44," on the other hand, is exactly half of this (128-bit GDDR6, 32 CU). AMD is building the "Navi 48" and "Navi 44" on the TSMC N4P (4 nm EUV) foundry node, on which it is building pretty much its entire current generation, from mobile processors to CPU chiplets.
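
A trivial sanity check on the quoted shader math: RDNA compute units carry 64 stream processors each, so the rumored CU counts and the 4,096 figure line up, with Navi 44 exactly half on both counts:

```python
# Sanity-check the rumored "Navi 48"/"Navi 44" shader counts quoted above.
SP_PER_CU = 64  # stream processors per RDNA compute unit

def stream_processors(compute_units: int) -> int:
    return compute_units * SP_PER_CU

print("Navi 48:", stream_processors(64))  # 4096, matching the rumor
print("Navi 44:", stream_processors(32))  # 2048, exactly half
```
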

 

SolidQ

Member
If the 9070 XT is only going to be as fast as the 7900 GRE
Didn't AMD tease us with 7900 XT-level performance?

Not only the 9070

Tim Schiesser – Hardware Unboxed

So with the FSR4 slide, there was an interesting footnote talking about some sort of feature that's going to be exclusive to the 9070 XT. Sites like Videocardz interpreted that to mean that FSR4 would be exclusive to the 9070 XT or 9070 series products (compared to 9060). Is that accurate reporting?

David McAfee

It's more just those top tier platforms. The content that was shown during the CES presentation was on the 9070 XT, which is why the footnote said that specifically. There is nothing about FSR4 that makes it exclusive to a single model. It is very much architecture optimized at this point in time.

One of the major advancements in RDNA 4 is the ML ops: being able to drive compute efficiently through the graphics engine is a massive generational improvement. I think what you should expect from AMD, as we roll out FSR4, is that it will lean into those capabilities of the Navi4 architecture.


69b4eaf717f75b8848b6d8c0948125d2.png
 
I thought there was something wrong with these leaks when I saw the size of these cards. These 9070 XT cards are absolutely massive, comparable in size to the 7900 XTX and RTX 4080, and why would you need such a huge cooler for a mid-range card? The 7900 GRE is relatively small compared to the 9070 XT.

Maybe the 9070 XT will be as fast as the RTX 4080 after all.
 

Xyphie

Member
I thought there was something wrong with these leaks when I saw the size of these cards. These 9070 XT cards are absolutely massive, comparable in size to the 7900 XTX and RTX 4080, and why would you need such a huge cooler for a mid-range card? The 7900 GRE is relatively small compared to the 9070 XT.

There's really no relationship between the size of a card and how it performs. GPU vendors will happily sell you a 4060 with a 3-fan, triple-slot cooler at 130W TBP.
 

Wolzard

Member
I thought there was something wrong with these leaks when I saw the size of these cards. These 9070 XT cards are absolutely massive, comparable in size to the 7900 XTX and RTX 4080, and why would you need such a huge cooler for a mid-range card? The 7900 GRE is relatively small compared to the 9070 XT.

Maybe the 9070 XT will be as fast as the RTX 4080 after all.

They must reuse a lot from other models, including Nvidia's, to keep to just one production line.
I remember Gigabyte's RX 6600 Eagle, which has 3 fans but a PCB only half the length of the board. Besides, it's a low-end GPU that draws very little power and runs cool.

RX 6600 / RTX 4060

card3.jpg
card3.jpg
 

Wolzard

Member
AMD still has the ability to surprise, if it does its homework well. But that's precisely the problem: AMD always makes ugly, clumsy mistakes.

If the 9070 XT is capable of being at the level of the 5070, they will want to charge 499 dollars... :messenger_unamused:
 

StereoVsn

Gold Member
AMD still has the ability to surprise, if it does its homework well. But that's precisely the problem: AMD always makes ugly, clumsy mistakes.

If the 9070 XT is capable of being at the level of the 5070, they will want to charge 499 dollars... :messenger_unamused:
I mean, isn’t this the general AMD motto? Take the pricing of the closest Nvidia card in raster performance and knock 10% off the MSRP?

And then make a Pikachu face when that sales scheme fails.
 

Gaiff

SBI’s Resident Gaslighter

Same ballpark as a 7900 XTX when it comes to Time Spy Extreme and Speed Way. Even slightly faster despite a potential driver bug. That's significantly better than the 7900 XT or below rumors we had. With FSR4, this makes those cards (the 9070 XT/X at least) interesting options in the high end.

Is this the 9070 or XT/X?
 

Wolzard

Member
Same ballpark as a 7900 XTX when it comes to Time Spy Extreme and Speed Way. Even slightly faster despite a potential driver bug. That's significantly better than the 7900 XT or below rumors we had. With FSR4, this makes those cards (the 9070 XT/X at least) interesting options in the high end.

Is this the 9070 or XT/X?

With 4096 shader cores it should be the 9070 XT.
 

Bojji

Member



I like that it's blurred everywhere, but GPU-Z says 7800 XT.

The GPU has almost the same specs as the 7800 XT (on the right), other than core clock:

gE4oe6g.jpeg


Is the higher clock enough to overcome the differences in shader count and the big memory-bandwidth difference between the 9070 XT and the 7900 XT/XTX?

We will see. AMD performance rumors don't have the best track record...
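
For scale, a rough sketch of that bandwidth gap: bandwidth is just the bus width in bytes times the per-pin data rate. The 20 Gbps GDDR6 rate for the 9070 XT is a rumored/assumed value, used here only for illustration:

```python
# Rough memory-bandwidth comparison; GB/s = (bus bits / 8) * Gbps per pin.
# The 9070 XT's 20 Gbps GDDR6 is a rumored value, assumed for illustration.
cards = {
    "9070 XT (rumored)": (256, 20.0),
    "7900 XT":           (320, 20.0),
    "7900 XTX":          (384, 20.0),
}
for name, (bus_bits, gbps_per_pin) in cards.items():
    print(f"{name}: {bus_bits / 8 * gbps_per_pin:.0f} GB/s")
# -> 640 vs 800 vs 960 GB/s, before any Infinity Cache effects
```
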
 
You really don't understand that memory bandwidth is a result of bus width and memory clock speed.

I would've thought the smirk gave away that I was poking fun, but ok, let's talk seriously now. For the entrapment, we're gonna have to ask you for four big ones. Four thousand dollars for that. But we are having a special this week on proton charging and storage of the beast, and that's only going to come to one thousand dollars, fortunately.
 