
RX 7900XTX and 7900XT review thread

GHG

Member
Good man.

I had similar crashing issues with my 3080 12GB. Random crashes in RDR2, all 5 different Matrix demos, and even Cyberpunk. Nothing worked. I ended up getting a 10 GB model, and while the Matrix demo still crashes, it doesn't do it as often, and RDR2 and Cyberpunk no longer crash. It was probably just the card itself.

With 3080s it's the memory temperatures you've got to keep an eye on.
 

Puscifer

Member
Solid GPUs.
Very much so. I like Nvidia, but they've clearly lost it over the years. Being honest, the age-old AMD driver rumors are long gone, and if it weren't for the fact that I got a GPU during a drop at Best Buy at the *RIGHT* moment (I was looking for a USB cable and snagged my 3070 at MSRP during the shortage), I'd have to seriously wonder exactly what people think they're getting for the prices they're paying. Seriously, go AMD, especially if you're using one of their processors.
 

Deanington

Member
PSA: If you're having heating issues with reference cards, contact AMD support or use their RMA site.

Here is the Reddit post for source and AMD links.
 

AGRacing

Member
PSA: If you're having heating issues with reference cards, contact AMD support or use their RMA site.

Here is the Reddit post for source and AMD links.


Is anyone in this thread getting this issue while leaving the power limit slider alone?

I'm seeing WILDLY different methods and numbers for OC and undervolt of this card. As an example: my junction temp stays in the low 80s when the card is undervolted to 1070 mV, fans at about 2200 RPM. But clocks will always be higher than the quoted boost clocks in this scenario: 2900 MHz+ in Fortnite, 2700s in RT Cyberpunk and most other games that I believe would be considered a "heavy" load.

But if I use those exact same settings and increase board power 15%... nothing really changes at all except for temps. Mid 90s for junction. Clocks seem largely the same.

Most people who bought this card or have reviewed it (myself included) automatically assumed the first thing you'd do to it is increase the power limit. That thinking is based on RDNA2, and my 6900 XT definitely benefitted from doing this... but this card doesn't seem to.

So for now my reference settings for the 7900 XTX are:
Voltage: 1070 mV
Clocks: 2400-3300 MHz
RAM timings: normal
RAM: 2714 MHz (achieves 2700)
Fans: constant 2200 RPM

This will score about 17000 in Port Royal, and achieve 2950 MHz average clocks in Fortnite as described. Settings maxed, hardware RT on.
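For context, AMD's advertised boost clock on the reference 7900 XTX is 2500 MHz, so those averages sit well above spec even undervolted. A tiny sketch of the uplift math (the observed averages are the ones reported above; the comparison itself is mine):

```python
# Sketch: how far the reported average clocks sit above AMD's advertised
# boost clock for the reference 7900 XTX (2500 MHz).
BOOST_MHZ = 2500
observed = {"Fortnite (HW RT)": 2950, "Cyberpunk (RT)": 2700}

for game, mhz in observed.items():
    uplift = 100 * (mhz / BOOST_MHZ - 1)
    print(f"{game}: {mhz} MHz, {uplift:+.0f}% vs quoted boost")
```

That is roughly +18% in Fortnite and +8% in RT Cyberpunk over the quoted boost, at stock board power.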
 

SatansReverence

Hipster Princess
Flipped the BIOS switch and set it to OC; hasn't crashed since. Odd behavior, but fuck, I'll take it. I'll have to test it more on a day off, but I played Cyberpunk easily 4 times as long as normal and it didn't fuck up. Looks like the 7900 XT is staying.
You'll also claw back a decent chunk of the performance of the reference XTX running an XT AIB in OC mode. Good to know the problem was sorted, though!
 
Very much so. I like Nvidia, but they've clearly lost it over the years. Being honest, the age-old AMD driver rumors are long gone, and if it weren't for the fact that I got a GPU during a drop at Best Buy at the *RIGHT* moment (I was looking for a USB cable and snagged my 3070 at MSRP during the shortage), I'd have to seriously wonder exactly what people think they're getting for the prices they're paying. Seriously, go AMD, especially if you're using one of their processors.

We'll have to see how the rest of the lineup plays out, but it does look like AMD might not have lost the plot quite as badly as Nvidia at the moment. It will be interesting to see where the 7600/7700 lines land as far as price/performance.
 

SatansReverence

Hipster Princess
Is anyone in this thread getting this issue while leaving the power limit slider alone?

I'm seeing WILDLY different methods and numbers for OC and undervolt of this card. As an example: my junction temp stays in the low 80s when the card is undervolted to 1070 mV, fans at about 2200 RPM. But clocks will always be higher than the quoted boost clocks in this scenario: 2900 MHz+ in Fortnite, 2700s in RT Cyberpunk and most other games that I believe would be considered a "heavy" load.

But if I use those exact same settings and increase board power 15%... nothing really changes at all except for temps. Mid 90s for junction. Clocks seem largely the same.

Most people who bought this card or have reviewed it (myself included) automatically assumed the first thing you'd do to it is increase the power limit. That thinking is based on RDNA2, and my 6900 XT definitely benefitted from doing this... but this card doesn't seem to.

So for now my reference settings for the 7900 XTX are:
Voltage: 1070 mV
Clocks: 2400-3300 MHz
RAM timings: normal
RAM: 2714 MHz (achieves 2700)
Fans: constant 2200 RPM

This will score about 17000 in Port Royal, and achieve 2950 MHz average clocks in Fortnite as described. Settings maxed, hardware RT on.
I'm getting a decent bump from upping the power limit.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)


Yikes, 7900 XTX is drawing a lot more power vs. 40-series at 1440p. Hoping driver updates can fix this.
Very much so. I like Nvidia, but they've clearly lost it over the years. Being honest, the age-old AMD driver rumors are long gone, and if it weren't for the fact that I got a GPU during a drop at Best Buy at the *RIGHT* moment (I was looking for a USB cable and snagged my 3070 at MSRP during the shortage), I'd have to seriously wonder exactly what people think they're getting for the prices they're paying. Seriously, go AMD, especially if you're using one of their processors.
Makes no sense to buy an AMD card if you value Ray Tracing performance. That alone prevents me, an actual PC gamer who cares about getting the best RT performance for my money, from buying an AMD GPU.

If you don't care about RT, sure, I could see buying the XTX, but only if you aren't concerned with the sometimes 50% extra (possibly even more in some scenarios) power draw vs. the RTX 4080.

Then there's the fact that DLSS is better than FSR 2 and already has frame generation, and that AMD cards can't really run Portal with RT, while the RTX 2060 (which was once the slowest RT card) can.
 
Last edited:

AGRacing

Member
Junction temps get up to 85-90°C.

Been able to run 3DMark as low as 1010 mV. Perfectly stable is probably 1030 mV.

Clocks up to around 3 GHz.
Interesting... I'll try 1030 soon.

At 1070 mV, there is no difference in performance above a 5% board power increase, just more heat. There is one, however, between 0 and 5%. So beyond 365 W I see nothing except higher temps at 1070. Very different behavior vs. the 6900 XT.
 

MikeM

Member
Very much so. I like Nvidia, but they've clearly lost it over the years. Being honest, the age-old AMD driver rumors are long gone, and if it weren't for the fact that I got a GPU during a drop at Best Buy at the *RIGHT* moment (I was looking for a USB cable and snagged my 3070 at MSRP during the shortage), I'd have to seriously wonder exactly what people think they're getting for the prices they're paying. Seriously, go AMD, especially if you're using one of their processors.
Agreed. Had a 6700 XT and never had any issues. Now have a 7900 XT and, besides The Witcher 3, no issues.
 

Topher

Identifies as young
Very much so. I like Nvidia, but they've clearly lost it over the years. Being honest, the age-old AMD driver rumors are long gone, and if it weren't for the fact that I got a GPU during a drop at Best Buy at the *RIGHT* moment (I was looking for a USB cable and snagged my 3070 at MSRP during the shortage), I'd have to seriously wonder exactly what people think they're getting for the prices they're paying. Seriously, go AMD, especially if you're using one of their processors.

There are a couple of trade-offs. AMD is cheaper and on par as far as rasterization goes. But we all know RT and DLSS are Nvidia's strengths. Are those strengths worth $200? Not to me. So the answer is to go with the best deal at the time. For me, it was a 4080 FE, but with a 10% discount, which is probably the price it should have been all along.

But you said especially if you are using AMD processors. Why is that?
 

RoboFu

One of the green rats


Yikes, 7900 XTX is drawing a lot more power vs. 40-series at 1440p. Hoping driver updates can fix this.

Makes no sense to buy an AMD card if you value Ray Tracing performance. That alone prevents me, an actual PC gamer who cares about getting the best RT performance for my money, from buying an AMD GPU.

If you don't care about RT, sure, I could see buying the XTX, but only if you aren't concerned with the sometimes 50% extra (possibly even more in some scenarios) power draw vs. the RTX 4080.

Then there's the fact that DLSS is better than FSR 2 and already has frame generation, and that AMD cards can't really run Portal with RT, while the RTX 2060 (which was once the slowest RT card) can.

Umm, both cards are in the 3090 RT range.

I can run the Nvidia-optimized Portal game with only a few issues, at around 60 FPS on my XT. Only the volumetric lighting is very, very buggy.

The power issue is valid though; I'm almost always in the 300s.
 

SatansReverence

Hipster Princess
Interesting... I'll try 1030 soon.

At 1070 mV, there is no difference in performance above a 5% board power increase, just more heat. There is one, however, between 0 and 5%. So beyond 365 W I see nothing except higher temps at 1070. Very different behavior vs. the 6900 XT.
Silicon lottery can be a bitch though.
Agreed. Had a 6700 XT and never had any issues. Now have a 7900 XT and, besides The Witcher 3, no issues.
I've managed to make TW3 playable by using RTSS to cap the frame rate between 40 and 45, running native res with no FSR. It definitely needs optimised drivers or a patch to fix it though; the frame pacing and stutter when uncapped are egregious.

*edit* Also, launch the game directly from the exe instead of the launcher.
 
Last edited:

AGRacing

Member
Silicon lottery can be a bitch though.

I've managed to make TW3 playable by using RTSS to cap the frame rate between 40 and 45, running native res with no FSR. It definitely needs optimised drivers or a patch to fix it though; the frame pacing and stutter when uncapped are egregious.
I tried 1030. Was able to bench fine. Had a gameplay crash. Currently at 1050.
(Edit: my clock range is also 2400-3300.)

Honestly, I think the performance and clocks I'm getting at these temps are excellent. Testing at 1050 is a slight improvement over 1070: 3000+ MHz is noticeably more common in gameplay, and junction temp stays under 85°C with fans at 2200 RPM.

I'd be very curious to see what YOUR card does at +5% board power vs. +15%, in both performance and heat output. If it behaves like mine, you'll see no real performance loss but a significant temperature drop.
 
Last edited:

Buggy Loop

Member


Yikes, 7900 XTX is drawing a lot more power vs. 40-series at 1440p. Hoping driver updates can fix this.


Nearly 100W difference on average between the 7900XTX and 4080 o_o

That's money via electric bills in the long run o_o;
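For scale, a rough back-of-envelope on that gap; the gaming hours and electricity price here are illustrative assumptions, not figures from the thread:

```python
# Back-of-envelope: yearly cost of a ~100 W average power-draw gap.
# HOURS_PER_WEEK and PRICE_PER_KWH are assumptions for illustration.
EXTRA_WATTS = 100        # reported average gap, 7900 XTX vs RTX 4080
HOURS_PER_WEEK = 20      # assumed gaming time
PRICE_PER_KWH = 0.15     # assumed USD per kWh; varies a lot by region

extra_kwh_per_year = EXTRA_WATTS / 1000 * HOURS_PER_WEEK * 52
cost_per_year = extra_kwh_per_year * PRICE_PER_KWH
print(f"{extra_kwh_per_year:.0f} kWh/year -> ${cost_per_year:.2f}/year")
```

On those assumptions it's real money, but on the order of $15-20 a year rather than a dramatic sum; heavy daily use or pricier electricity scales it up proportionally.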

Where the hell did AMD come up with 54% efficiency gains just 1 month ahead of launch? That marketing team was high.

Shame about the 4080/7900 XTX pricing.
 
Last edited:

SatansReverence

Hipster Princess
I tried 1030. Was able to bench fine. Had a gameplay crash. Currently at 1050.
(Edit: my clock range is also 2400-3300.)

Honestly, I think the performance and clocks I'm getting at these temps are excellent. Testing at 1050 is a slight improvement over 1070: 3000+ MHz is noticeably more common in gameplay, and junction temp stays under 85°C with fans at 2200 RPM.

I'd be very curious to see what YOUR card does at +5% board power vs. +15%, in both performance and heat output. If it behaves like mine, you'll see no real performance loss but a significant temperature drop.
Testing 0%, 5%, 10% and 15% netted linear performance bumps at each step, with thermals increasing exponentially at the higher percentages: starting at 81°C at 0%, 83°C at 5%, up to 89°C at 15%. This was at 1015 mV in Port Royal.

I can't remember the settings, but I did get over 18000 in Port Royal last night. Getting 17500-ish today.
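Those reported points (81°C at 0%, 83°C at +5%, 89°C at +15%; the +10% value wasn't quoted) already show the heat rising faster than linearly per step. A quick way to see it:

```python
# Per-step junction temp rise from the reported power-limit sweep
# (1015 mV, Port Royal). Only the three quoted points are used.
points = {0: 81, 5: 83, 15: 89}   # % power offset -> junction temp (C)

offsets = sorted(points)
for lo, hi in zip(offsets, offsets[1:]):
    steps = (hi - lo) / 5                        # number of +5% steps spanned
    per_step = (points[hi] - points[lo]) / steps
    print(f"+{lo}% -> +{hi}%: {per_step:+.1f} C per +5% step")
```

If the performance bumps really are linear per step, perf-per-degree (and perf-per-watt) is clearly falling off toward +15%, which matches AGRacing's experience of extra heat for little gain.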
 

winjer

Member

Within such tolerance ranges, it is certainly possible to bypass the stoppers, use suitable screws with larger heads and neutralize the protruding sleeves by means of higher washers, so that when screwing with new, good paste, the contact pressure can be better dosed. Since I’m not AMD and also don’t know the maximum permissible values, I’ll leave a question mark here for safety reasons. Bending hollow chip surfaces with force just to make them fit the crooked and curved heatsink is certainly associated with a certain risk. This, however, is not something I want to impose irresponsibly on any of my readers. Everyone must then decide for themselves.

However, with a suitable thermal paste (Alphacool Apex B-Stock, so the firmer), two washers per hole and suitable screws instead of the clamping cross, I could also lower the hotspot that occurred with me even compared to the “normal” card to a slightly lower delta of 15 to 17 Kelvin, which has not worked by simply changing the paste. I take this for me personally as a conclusion and still explicitly refer to the preface on page one. It was the comprehensible and also measurable reason for my card, but it does not necessarily have to be for other cards.

It is certainly also conceivable that individual sensors in the GPU measure mischief and the telemetry does not catch such outliers (plausibility test). This could also be solved by AMD via a firmware update. However, today’s test speaks against this (at least for me), where the temperatures of the six chiplets suddenly turned out to be almost the same again after the reassembly without the clamping cross, where there was still up to 8 Kelvin difference before and my card even performed better than the original without the hotspot problem. In any case, it is advisable to wait for a statement from AMD. I for one have at least been able to solve my problem.

By the way, I also had the slight difference between vertical and horizontal mounting with the “repaired” card and two other, rather unsuspicious models, which could clearly be blamed on the vapor chamber and cooler design as such. But I don’t think that had anything directly to do with the hotspot, which is why it’s actually a different topic. If anyone is experimenting around with their RX 7900 XT(X) or still finding problems, I would be very happy to hear from you. In the meantime, I’ll wait and see what the swarm knowledge and AMD bring to light (or not).
 

AGRacing

Member
I left clocks at default and went lower with voltage... it did return better results.

 

GreatnessRD

Member
I got mad love for AMD, but this 7900 series launch has been kind of a disaster. It makes it worse that the marketing was making fun of Nvidia and the power connector. Now you see their GPUs can heat households and might have to be recalled. You just hate to see it.
 

winjer

Member
The very definition of "chat shit, get banged".

Everything about this release seems rushed.

For once, it seemed AMD had made a good reference cooler.
Sadly, it's very lacking, unless people use it vertically.
For anyone considering an AMD card, it's better to stick to custom coolers from AIBs.
 
Last edited:

AGRacing

Member
Good results, is that with the power limit increased?
It is, yeah.

I've got a lot of work ahead of me fine tuning this thing though.

I have a feeling driver update is imminent so I'll probably wait.
Forza Horizon 5 would not run at these settings. It seems to be the game with the biggest problem with undervolting too much; Cyberpunk would be the second most likely to crash, and Fortnite the third, based on the undervolt. To keep all 3 of these games stable, a voltage of 1080 mV may be required. At that voltage, I still think there may be some balance between clock range and % increase in board power that will be a better idea than just maxing out board power at default clocks. But again, it'll take a lot of playing around.

"Ancient Gameplays" (AMD enthusiast on YouTube) believes he needs 1120 to be stable.

But now that we know about vertical vs. horizontal mounting and the temperature problem on some cards (I'm mounting vertically), that may be the biggest determining factor on affected cards.
 
Last edited:

Warnen

Don't pass gaas, it is your Destiny!
AMD saw that original PlayStation folks flipped their systems upside down to get them working; they just wanted some of that tilted action…
 
A benefit of being cheap is you rarely have to work with parts that try to melt themselves. :messenger_tears_of_joy:

Hopefully they get the bottom of the stack figured out.
 

//DEVIL//

Member
It's funny how AMD was making fun of Nvidia for its cable choice when they have a rushed, over-promised and under-delivered product with a hardware fault at manufacturing.

People were jumping all over Nvidia for a cable choice that wasn't even their fault and was not an Nvidia issue in the first place. They made it sound like such a big deal, and all the AMD mosquitos and YouTube drama queens were asking for a recall. Where are they now? All of a sudden, not a single burnt cable has been added, even in the mega Reddit thread.

AMD shouldn't bark like a small dog when they didn't have shit and, quite frankly, have a horrible engineering team that can't even do a proper vapor chamber.

I am not touching AMD products, whether it's a CPU that reaches 95 degrees as soon as you see a game logo or this shit of a video card. They are just horrible, really. I do not care about that 5 or 10% performance gain if they ever have it (which they don't).

Really sucks, too, because I was rooting for AMD after the first performance claims (which were a total lie). I really wanted a card that, while not as powerful as the 4090, would be close, and for $1000 that would be great. But it can't even beat the 4080; they just trade blows depending on the game.
 

Kataploom

Gold Member
The ray tracing in Metro Exodus really isn't that heavy. Try Cyberpunk or Dying Light 2.

[attached RT benchmark charts]
That's probably because those games' lighting wasn't designed around ray tracing. Metro's lighting system was completely reworked for ray tracing, so it gives a better picture of future games that use mainly or only ray tracing for their lighting.
 

//DEVIL//

Member
Fair, but is it not about $200 cheaper?
Not the case anymore.

The 4080 FE is still one of the best cards you can buy. For the 7900 XTX you will need something higher than the reference model due to the shitty cooler flaw, so you are talking $1,100?

That is a $100 difference, so no, not worth it. Honestly speaking, even at a $200 difference I'd still find the 4080 the better card (not saying the price is justified; it's expensive). For $200 I get access to stuff I don't have on an AMD card, like DLSS, Frame Generation, and way better ray tracing. I'd honestly pay that difference, but that is just me and my personal opinion. Otherwise, I'd feel like I spent premium money on a card missing some important things just because I cheaped out a little, and I would honestly lose sleep thinking about it.
 

Panajev2001a

GAF's Pleasant Genius
Not the case anymore.

The 4080 FE is still one of the best cards you can buy. For the 7900 XTX you will need something higher than the reference model due to the shitty cooler flaw, so you are talking $1,100?

That is a $100 difference, so no, not worth it. Honestly speaking, even at a $200 difference I'd still find the 4080 the better card (not saying the price is justified; it's expensive). For $200 I get access to stuff I don't have on an AMD card, like DLSS, Frame Generation, and way better ray tracing. I'd honestly pay that difference, but that is just me and my personal opinion. Otherwise, I'd feel like I spent premium money on a card missing some important things just because I cheaped out a little, and I would honestly lose sleep thinking about it.
I get that you like your nVIDIA cards ;), but you said you were DISAPPOINTED it could not even beat the 4080… somehow that does not compute. Your answer just brings in some nVIDIA PR material (DLSS has a lot of equivalent features in the open-source FSR stack AMD has been championing, including frame generation, which is not perfect on nVIDIA at the moment either, but promising, sure) and personal preference, but it does not address why you expected a card that was ~$200 cheaper to easily beat the more expensive card…
 

//DEVIL//

Member
I get that you like your nVIDIA cards ;), but you said you were DISAPPOINTED it could not even beat the 4080… somehow that does not compute. Your answer just brings in some nVIDIA PR material (DLSS has a lot of equivalent features in the open-source FSR stack AMD has been championing, including frame generation, which is not perfect on nVIDIA at the moment either, but promising, sure) and personal preference, but it does not address why you expected a card that was ~$200 cheaper to easily beat the more expensive card…
Simple: their first announcement of the card and their graphs showed a performance level very close to the 4090, especially the Cyberpunk graph. They said it was a minimum 1.5x, up to 1.7x, performance uplift over their 6950 card, when in reality it's barely a 30% uplift. That's why I was rooting for them. Who doesn't want close-to-4090 performance for $600 cheaper? Fuck ray tracing if I'm getting close performance in raster for $600 cheaper.
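Taking that post at face value, the gap between the pre-launch claim and the estimated real uplift looks like this (the ~30% figure is the poster's estimate, not a measured benchmark):

```python
# Relative performance vs. a 6950 XT baseline normalized to 100.
BASELINE = 100
claimed_low = BASELINE * 1.5    # AMD's "minimum 1.5x" pre-launch claim
claimed_high = BASELINE * 1.7   # AMD's "up to 1.7x" claim
observed = BASELINE * 1.3       # poster's ~30% uplift estimate

gap = (claimed_low - observed) / observed * 100
print(f"claimed {claimed_low:.0f}-{claimed_high:.0f} vs observed ~{observed:.0f}: "
      f"even the low-end claim overstates the uplift by {gap:.0f}%")
```

So even against the most conservative marketing number, the shortfall is around 15% relative to what the card actually delivers.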

I don't favor one company over the other; I always mention this. I do not have loyalty to a company that takes money from me :/ .
 
Last edited:

AetherZX

Member
Not the case anymore.

The 4080 FE is still one of the best cards you can buy. For the 7900 XTX you will need something higher than the reference model due to the shitty cooler flaw, so you are talking $1,100?

That is a $100 difference, so no, not worth it. Honestly speaking, even at a $200 difference I'd still find the 4080 the better card (not saying the price is justified; it's expensive). For $200 I get access to stuff I don't have on an AMD card, like DLSS, Frame Generation, and way better ray tracing. I'd honestly pay that difference, but that is just me and my personal opinion. Otherwise, I'd feel like I spent premium money on a card missing some important things just because I cheaped out a little, and I would honestly lose sleep thinking about it.

Pretty much exactly the reason I caved and bought a 4080 yesterday. XTX AIB cards were within spitting distance price-wise, and unfortunately I care more about the overall capabilities of a GPU than just rasterization.
 

Warnen

Don't pass gaas, it is your Destiny!
Was able to test the 7900 XT more today, and yeah, it's still crashing out after a while. It lasts longer if I OC it, but it gets so loud it makes my PC sound like a PS2.

Think I'm gonna swap it for a 4080 on Wednesday; just better bang for the buck.
 
Last edited: