
AMD: Radeon 7900XTX ($999) & 7900XT ($899) Announced | Available December 13th

Crayon

Member
Which means that the V/f curve isn't where they want it to be. They're doing an awful lot of new shit all at once. MCM, new node, completely rearchitected shader design and associated physical design, so things probably didn't go entirely as planned.
As I've said though, they are still getting spookily close to the 4090 in terms of rasterisation with a significantly smaller GPU, with lower than expected clocks. They realised it's not as fast as they'd originally designed, so they've priced accordingly, which is perfectly fine.

I would hazard a guess that they will probably respin it with some optimisations like they did with RV570 back in the day, or like how Nvidia did with Fermi between the 480 and 580. And then release that for the 7950XT and XTX.
Navi 33 is next up, but it's monolithic, on a node they have a lot of manufacturing experience with - so the physical design should, in theory, not be too bad. And given how small and lightweight it should be, it should be able to fly in terms of clockspeed.
Navi 32 is very far out at this stage, so I imagine a lot of lessons learned from N31 will be taken back into it before it tapes out and launches.

This is all just guesswork, but based on some reasonable assumptions and information that is partially out there already.

Is it a guess that n33 comes before n32 or is that a thing I missed? Because that would be great. 6600xt is a good card and I'd love to see what a 7600xt can do.
 

rnlval

Member
Take it up with AMD, they're the ones that supposedly said RDNA 3 exceeds 3GHz.


Who knows, maybe they have a 7950XT that exceeds that 3GHz mentioned.
2.5 GHz with 96 CUs (12,288 shader processors) yields about 61 TFLOPS.

3.0 GHz with 96 CUs (12,288 shader processors) would land at about 73.7 TFLOPS.

3.3 GHz with 96 CUs (12,288 shader processors) would land at about 81.1 TFLOPS.

Increasing the clock speed would also speed up the BVH/RT cores and the 96 MB Infinity Cache.
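For reference, here's the arithmetic behind those figures as a small Python sketch, assuming the usual 2 FLOPs per shader per clock (FMA) and RDNA 3's dual-issue count of 12,288 stream processors:

```python
# Peak FP32 estimate: shader count * 2 FLOPs per clock (FMA) * clock (GHz) / 1000
def tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000

for clock in (2.5, 3.0, 3.3):
    print(f"{clock:.1f} GHz x 12,288 SPs -> {tflops(12_288, clock):.1f} TFLOPS")
# 2.5 -> 61.4, 3.0 -> 73.7, 3.3 -> 81.1
```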
 

hlm666

Member
What will the AI improvements bring? Did anyone talk about that? Is it for FSR? Physics?
I don't think there was a lot to actually say about it. The AI improvements are just an extra bfloat16 instruction per clock; it doesn't look like dedicated matrix math hardware.

[Image: AMD RDNA 3 Tech Day press deck, slide 70]
 

Crayon

Member
The thing on the slide that says 3GHz would have to be an outright mistake. Maybe the reference is made from the wheeziest chips to get that price down and just the very best ones can get over 3. That would be a big spread but why not. It makes sense in its own way to make the reference the vanilla model and get some use out of those chips at the same time.

Like mentioned a few posts above, you'd think they'd have wanted to show these powerful AIBs. Then again, they chose to finish the show with the price when they really should have led with it, so maybe it's a case of thinking saving the best for last is a good approach when it's not. That announcement would have been better leading with the price, and it would have been better to see 15% faster AIBs right off the bat if they existed.
 

hlm666

Member
I swear I heard them mention Xilinx AI cores.
First I'm hearing about that. You would think if they had them in the GPU they would have been more vocal about it, so with the silence and what the slide deck footnotes say about it, I'm gonna take it at face value that it's just the extra fp16 instruction.
 

SlimySnake

Flashless at the Golden Globes


RT performance seems fine, especially when drivers will add like 10-15% performance

how the fuck can they be so far behind in RT performance with the addition of new RT cores bumping the RT performance by 50%?

The 6000 series didn't have any RT cores, and simply adding more CUs and running them at higher clocks, i.e. more TFLOPS, got you more RT performance. The 7000 series has 2.5x more TFLOPS... 1.7x better performance, actual dedicated RT cores, and they still top out at 1.5x? wtf is going on here?

Might as well just add more CUs and push the TFLOPS instead of wasting die space on something that makes no real difference.
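To make the complaint concrete, here's a quick sketch using just the ratios quoted in this post (illustrative numbers from the post, not benchmark data):

```python
# Illustrative only: ratios taken from the post above, not measured results.
tflops_ratio = 2.5  # claimed compute uplift, 7000 series vs 6000 series
raster_ratio = 1.7  # claimed rasterisation uplift
rt_ratio = 1.5      # claimed ray tracing uplift

print(f"RT uplift per unit of compute uplift: {rt_ratio / tflops_ratio:.2f}x")  # 0.60x
print(f"RT uplift relative to raster uplift:  {rt_ratio / raster_ratio:.2f}x")  # 0.88x
```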
 

Irobot82

Member
First I'm hearing about that. You would think if they had them in the GPU they would have been more vocal about it, so with the silence and what the slide deck footnotes say about it, I'm gonna take it at face value that it's just the extra fp16 instruction.
Xilinx A.I. engine uses adaptive machine learning to improve the readability of small texts and menus.

That's what they said during the presentation.
 

Haint

Member
Which means that the V/f curve isn't where they want it to be. They're doing an awful lot of new shit all at once. MCM, new node, completely rearchitected shader design and associated physical design, so things probably didn't go entirely as planned.
As I've said though, they are still getting spookily close to the 4090 in terms of rasterisation with a significantly smaller GPU, with lower than expected clocks. They realised it's not as fast as they'd originally designed, so they've priced accordingly, which is perfectly fine.

I would hazard a guess that they will probably respin it with some optimisations like they did with RV570 back in the day, or like how Nvidia did with Fermi between the 480 and 580. And then release that for the 7950XT and XTX.
Navi 33 is next up, but it's monolithic, on a node they have a lot of manufacturing experience with - so the physical design should, in theory, not be too bad. And given how small and lightweight it should be, it should be able to fly in terms of clockspeed.
Navi 32 is very far out at this stage, so I imagine a lot of lessons learned from N31 will be taken back into it before it tapes out and launches.

This is all just guesswork, but based on some reasonable assumptions and information that is partially out there already.

Ok, but I was responding to suggestions that the AIB 7900XTXs launching on Dec 13th (such as the triple 8-pin TUF pictured above) are actually going to hit 3GHz and outperform the 4090.
 

iQuasarLV

Member
Everything else checks out, but wait for reviews/independent benchmarks. AMD did some shenanigans with their graphs this time (different CPUs for different GPUs), so just to be sure, better to wait - it's not cheap impulse-buy territory.

Said all that, even 20% lower raster at 4K vs the 4090 when not CPU constrained will still be a great result/amazing deal if it's actually 1k bucks at retail (same thing here - don't compare MSRP vs MSRP but the actual price you can get those cards for online or in your local area, which might be way different from MSRP depending on where you live, whether you're lucky with availability, or whether you buy early. For example, here in Europe you can't get even the worst AIB 4090 models below 2500 euro currently; hell, dunno if you can get them below 3k euro even ;/).
This is why I love Hardware Unboxed on YouTube. They always post their FPS/$ value charts and you get an awesome idea of just how much GPU you get per $ spent. If you honestly care about where your money goes that chart is divine.
 

iQuasarLV

Member
As another post reply I have questions about the 4000 series and 7000 series:

1. Are we truly going to embrace a market where we are just going to upscale from 1080-1440 images and interpolate frames into scenes just to push frame rate #s?
1a. Is this basically, "Native is dead. Long live fake native?"
1b. Is this what we have to swallow as 8k enters the room in 2023?
2. Are we now in a market where each vendor is going to copy-paste the shittier techniques their competitors are doing just to say they have it too? Both Nvidia and AMD are equally guilty of doing this, don't delude yourself.
3. Is all this frame rate boosting technology all PR spin just to convince the buying public to embrace raytracing when these cards just are not powerful enough to push it?
4. With 8K now beginning to replace 4K as the top echelon of images in 2023, and with these GPUs having to resort to shenanigans to output 60+ fps at 4K, what the hell are we to expect at 8K resolutions? 1440p upscaled 900% (math checks out - see the sketch below) to push that resolution?
5. Honestly do we think DLSS or FSR is going to hide 9x worth of upscaling?
6. How much BS are we going to swallow in the name of high-resolution FPS before we collectively call out these companies on their bullshit and stop buying these ridiculously priced products?
7. Is the MCM design just a way to throw bad silicon that didn't cut the mustard at the userbase instead of eating the cost, a la Intel with the Evo series? (God I hope it is the reverse, where smaller chips = more refined use of space and less loss of silicon per wafer, but I would be remiss not to ask the question.)

I feel as though I am just a jaded old gamer whose 27 years of following this market are starting to catch up with him.
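On the 8K math in question 4, here's a quick sketch of the pixel-count ratios (simple arithmetic on the standard resolutions, nothing more):

```python
# Pixel-count ratios of 8K (7680x4320) versus common resolutions.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

eight_k = 7680 * 4320
for name, (w, h) in resolutions.items():
    print(f"8K has {eight_k / (w * h):.0f}x the pixels of {name}")
# 16x 1080p, 9x 1440p, 4x 4K -- so 1440p -> 8K really is a 9x (roughly "900%") jump
```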
 

PeteBull

Member
As another post reply I have questions about the 4000 series and 7000 series:

1. Are we truly going to embrace a market where we are just going to upscale from 1080-1440 images and interpolate frames into scenes just to push frame rate #s?
1a. Is this basically, "Native is dead. Long live fake native?"
1b. Is this what we have to swallow as 8k enters the room in 2023?
It's all priorities. Someone can easily get a stable 4K60fps at maxed settings in Diablo II: Resurrected as an RTX 3080 Ti user (like myself - only one area in-game, or rather one unique mob/boss, Shenk in early Act 5 just before the waypoint, is terribly optimised/slows down badly and even crashes sometimes) - then that person can do it without any compromises, aka without soaping up the image quality.
Sometimes, of course, the GPU isn't as powerful or the game is more demanding, or someone wants that 4K 120fps or 165fps goal - then they gotta compromise, kinda like on consoles. Even with cross-gen games, like the amazing GoW Ragnarok, you can have 4K30fps with better settings, and a few modes with higher fps/res, up to an 80-120fps mode that is 1440p only with reduced settings.

Oh - and forget about 8K. 8K monitors/TVs are very expensive even for most of us enthusiasts, and GPUs (yes, even the super powerful by today's standards 4090) can't run many new games at native 8K (and what's the point of running Tetris at 16K 480fps anyway ;p). It's 4x the pixel count vs 4K (and obviously 16x full HD). And keep in mind one thing - the cross-gen period ends soon; by 2023 you will have many titles that run on current gen only, and those will have much higher system requirements, CPU/GPU/VRAM wise too.

So tldr - 8K = marketing hype. Play at whatever resolution and framerate your setup is capable of and that gives you the most pleasant experience - if that's 1080p 240fps, so be it; if you want 1440p 144Hz, cool too; and even 4K60fps (like I do) works fine, if someone isn't into the competitive multiplayer genre :)

All those upscaling methods are actually good news. Someone can play at 4K120fps max settings on his 4090 and 13900K (or other top CPU), while another person, not even a casual but with a much lower budget, can run the same game on medium settings at a stable 1080p60fps on an i3 12100F ($105 CPU)/R5 5600 ($120 CPU) and an RX 6600 ($215 GPU) and have relatively similar fun :) Here's a comparison between those GPUs (and others): https://www.techpowerup.com/gpu-specs/radeon-rx-6600.c3696

Edit: https://pcpartpicker.com/list/mZK7gb - just a quick build with a price tag in case someone wants a good entry-level, but by no means weak/outdated, machine. Good time to buy rigs now.
 
how the fuck can they be so far behind in RT performance with the addition of new RT cores bumping the RT performance by 50%?

The 6000 series didn't have any RT cores, and simply adding more CUs and running them at higher clocks, i.e. more TFLOPS, got you more RT performance. The 7000 series has 2.5x more TFLOPS... 1.7x better performance, actual dedicated RT cores, and they still top out at 1.5x? wtf is going on here?

Might as well just add more CUs and push the TFLOPS instead of wasting die space on something that makes no real difference.

Well, they were so far behind before that a 50% increase isn't going to boost you that far. Pulling themselves up to 3080 level performance isn't bad though.
 
Ok, but I was responding to suggestions that the AIB 7900XTXs launching on Dec 13th (such as the triple 8-pin TUF pictured above) are actually going to hit 3GHz and outperform the 4090.
Let's wait and see I guess

how the fuck can they be so far behind in RT performance with the addition of new RT cores bumping the RT performance by 50%?

The 6000 series didn't have any RT cores, and simply adding more CUs and running them at higher clocks, i.e. more TFLOPS, got you more RT performance. The 7000 series has 2.5x more TFLOPS... 1.7x better performance, actual dedicated RT cores, and they still top out at 1.5x? wtf is going on here?

Might as well just add more CUs and push the TFLOPS instead of wasting die space on something that makes no real difference.

The 4090 has 128 SMs with an RT core in each, with a huge amount of transistor budget thrown at it.
The 7900XTX has 96 CUs with an RT core in each, with very few transistors used for it. Even if the RT cores were perfectly equivalent in performance, AMD have 25% fewer of them.

You have to understand, AMD are working with 1/10th the R&D budget of Nvidia and Intel for both CPU and GPU.
The money and secured revenue streams they've built off the back of Ryzen are only going to come to fruition now. RDNA3 was design complete in 2020/21.
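A rough way to frame that unit-count argument (illustrative sketch only; it assumes one RT unit per SM/CU as described above and says nothing about per-unit capability or clocks):

```python
# Illustrative unit-count comparison, per the post above (one RT unit per SM/CU).
nvidia_rt_units = 128  # AD102 / RTX 4090: one RT core per SM
amd_rt_units = 96      # Navi 31 / 7900XTX: one ray accelerator per CU

print(f"AMD ships {1 - amd_rt_units / nvidia_rt_units:.0%} fewer RT units")       # 25% fewer
# Even if each unit were equally capable at equal clocks, peak RT throughput
# would scale roughly with unit count:
print(f"Best-case relative RT throughput: {amd_rt_units / nvidia_rt_units:.0%}")  # 75%
```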
 

SlimySnake

Flashless at the Golden Globes
Let's wait and see I guess



The 4090 has 128 SMs with an RT core in each, with a huge amount of transistor budget thrown at it.
The 7900XTX has 96 CUs with an RT core in each, with very few transistors used for it. Even if the RT cores were perfectly equivalent in performance, AMD have 25% fewer of them.

You have to understand, AMD are working with 1/10th the R&D budget of Nvidia and Intel for both CPU and GPU.
The money and secured revenue streams they've built off the back of Ryzen are only going to come to fruition now. RDNA3 was design complete in 2020/21.
Yeah, but that's why I said they should've looked into adding more cores or pushing more clocks. Why are they being conservative? Money? Just pass the costs on to the consumer like Nvidia did. Who told them to release their flagship card at $999? Nvidia has been releasing $1,200+ cards since 2018.

This is probably some kind of architectural issue. They are probably unable to get a linear performance increase out of more than 96 CUs.
 

poppabk

Cheeks Spread for Digital Only Future
Coming to this thread from the Sony VR reveal is giving me real whiplash.
- $550 for an OLED VR headset, have Sony lost their minds, they are taking the piss with their pricing.
- $500 price difference is nothing.
The PC space has gotten insane since streaming took off and someone had the smart idea of putting glass panels on cases. My entire PC build (5800X, 6900 XT) cost a little over $1k and I thought that was insanely extravagant, and people are looking at a couple hundred dollars like it's a rounding error.
 

Sanepar

Member
Let's wait and see I guess



The 4090 has 128SM's with a RT core in each, with a huge amount of transistor budget thrown at it.
The 7900XTX has 96CUs with an RT core in each, with very few transistors used for it. Even if the RT cores were perfectly equivalent in performance; AMD have 25% fewer of them.

You have to understand, AMD are working with 1/10th the R&D budget of Nvidia and Intel for both CPU and GPU.
The money and secured revenue streams they've built off the back of Ryzen are only going to come into fruition now. RDNA3 was design complete in 2020/21.
I don't understand this RT argument. Even a 3090 Ti can't handle most games with RT at native 4K@60.

What is the point of RT if you have to play at low res or 30 fps?

There is no viable hardware for RT yet. With UE5 games next year and next-gen games, probably even a 4090 will not deliver 60 fps in RT games.

RT for me is a gimmick that tanks perf. The only real benefit of it would be global illumination with RT, but we are miles away from that reality in the majority of games.
 

SolidQ

Member
Some people on hardware forums have calculated that if AMD had made 128 CUs at 3GHz, it would easily beat the RTX 4090 in raster and be almost on par in RT, but it seems AMD is repeating RV770 history.
 

thuGG_pl

Member
I don't understand this RT argument. Even a 3090 Ti can't handle most games with RT at native 4K@60.

What is the point of RT if you have to play at low res or 30 fps?

There is no viable hardware for RT yet. With UE5 games next year and next-gen games, probably even a 4090 will not deliver 60 fps in RT games.

RT for me is a gimmick that tanks perf. The only real benefit of it would be global illumination with RT, but we are miles away from that reality in the majority of games.

Most of the games offering RT also offer DLSS/FSR. And most people use them because they work great and basically give free FPS.
Sure, if you're some kind of PC snob who won't lower yourself to using DLSS (because of reasons), then that's your problem. But many people can actually make use of RT today.
 

OZ9000

Banned
I'm obviously not gonna buy a card that costs 1000 dollars for 1500 euros.

1100 euros is the max I can go and I'm already stretching...

And I need to see a lot of non-manufactured, non-cherry-picked benchmarks in 4K, with and without FSR3, and with UE5 engine demos, before even thinking about spending that amount of money on a 7900XTX.
I've just set a mental limit of £1k to buy a GPU. To be honest I always felt £500-600 was 'too much' to buy a GPU just to play some video games but here we go.
 

Fess

Member
Coming to this thread from the Sony VR reveal is giving me real whiplash.
- $550 for an OLED VR headset, have Sony lost their minds, they are taking the piss with their pricing.
- $500 price difference is nothing.
The PC space has gotten insane since streaming took off and someone had the smart idea of putting glass panels on cases. My entire PC build (5800X, 6900 XT) cost a little over $1k and I thought that was insanely extravagant, and people are looking at a couple hundred dollars like it's a rounding error.
I don't see what streaming taking off has to do with anything. But besides that, sadly this is spot on. I just built a new PC like a month ago; the total when you build a complete high-end PC setup with screen and all is easily 3-5k USD, and even then you don't have the top parts.
 
agreed - almost pointless to buy the XT at $100 less.
Why? People have budgets. High-end graphics cards haven't been in my budget since the 8800 series, but back then prices were way lower. That being said, I had a set amount I was willing to spend. If I could get a lower or mid-grade high-end card to meet my budget number then I would go with that. So I could definitely see people in that position.
I could see the argument that if you are spending near a grand then you aren't hurting and money isn't a problem. I guess, but not everyone goes by that, especially when you have bills to pay.

I buy mid-range GPUs. I have a 3060 Ti. I could have gone for a 3070 for $100 more, but that wasn't in my budget (not counting power draw and annoying 3-slot cards).
I see nothing wrong with having different cards for different segments of the market.

What I don't like is the constant obsession with only showing the high end, while mid-tier and mainstream gaming cards get the shaft. People like myself in that market have to wait months, now usually years, for any cards.
I honestly don't know who has a grand to spend on graphics when inflation is what it is. I know I can't afford these or Nvidia prices.

Prices for all segments are way too high. I recall buying a high-end Voodoo5 for $399, an 8800 GTS for $399, and a 1060 for $250. Back then the high end usually stopped around $599.
 
XTX price is great. Cheaper than 4080. Can’t wait to see some benchmarks.

XT price is disappointing though
NO. Prices are shit. Why do you all accept sky-high prices? Gamer-quality GPUs shouldn't be more than $700 for high end. We let them, both companies, get away with this for too long. You don't need 4K 300fps to enjoy gaming. They need to be taught a lesson and humbled. They used to compete for our dollar and for the mid and low end too; now they have their mining cash cow, and rich kids who live at home and single people who don't have a spouse spending 1000s on a GPU for games that don't even require them.
 
Why? People have budgets. High-end graphics cards haven't been in my budget since the 8800 series, but back then prices were way lower. That being said, I had a set amount I was willing to spend. If I could get a lower or mid-grade high-end card to meet my budget number then I would go with that. So I could definitely see people in that position.
I could see the argument that if you are spending near a grand then you aren't hurting and money isn't a problem. I guess, but not everyone goes by that, especially when you have bills to pay.

I buy mid-range GPUs. I have a 3060 Ti. I could have gone for a 3070 for $100 more, but that wasn't in my budget (not counting power draw and annoying 3-slot cards).
I see nothing wrong with having different cards for different segments of the market.

What I don't like is the constant obsession with only showing the high end, while mid-tier and mainstream gaming cards get the shaft. People like myself in that market have to wait months, now usually years, for any cards.
I honestly don't know who has a grand to spend on graphics when inflation is what it is. I know I can't afford these or Nvidia prices.

Prices for all segments are way too high. I recall buying a high-end Voodoo5 for $399, an 8800 GTS for $399, and a 1060 for $250. Back then the high end usually stopped around $599.
Note that $399 back in 2000, when the Voodoo5 was released, is equivalent to about $689 now after inflation. Nvidia prices are insane, but they're 300%-increase insane, not 600%-increase insane.
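For what it's worth, here's a sketch of that inflation adjustment; the ~1.73 CPI factor is an approximation for 2000 to late 2022, so treat the output as ballpark:

```python
# Ballpark inflation adjustment: price_then * cumulative CPI factor.
# ~1.73 approximates US CPI growth from 2000 to late 2022; the exact factor
# depends on the index and month used.
CPI_FACTOR_2000_TO_2022 = 1.73

voodoo5_msrp_2000 = 399
adjusted = voodoo5_msrp_2000 * CPI_FACTOR_2000_TO_2022
print(f"${voodoo5_msrp_2000} in 2000 is roughly ${adjusted:.0f} today")
# -> roughly $690, in line with the ~$689 figure above
```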

I'm likely getting an XTX if they keep the price promise, but I'm wary of the euro tax...
 

Warnen

Don't pass gaas, it is your Destiny!
My brother is gonna try to get the XTX for his AMD system to replace a 3080 Ti (well, really a 3070 Ti, since the 3080 Ti will go into that system). Wish him luck, the 4090 was a PITA to get.
 

Nvzman

Member
NO. Prices are shit. Why do you all accept sky-high prices? Gamer-quality GPUs shouldn't be more than $700 for high end. We let them, both companies, get away with this for too long. You don't need 4K 300fps to enjoy gaming. They need to be taught a lesson and humbled. They used to compete for our dollar and for the mid and low end too; now they have their mining cash cow, and rich kids who live at home and single people who don't have a spouse spending 1000s on a GPU for games that don't even require them.
Bro 7900 XTX is not a gamer quality GPU. It's AMD's super high end GPU equivalent to a titan or 4090/3090. $1000 for that is pretty fair.
The 7800XT will likely be $300 cheaper minimum and will be the more reasonable gaming-class GPU.
 

Loxus

Member
Some people on hardware forums have calculated that if AMD had made 128 CUs at 3GHz, it would easily beat the RTX 4090 in raster and be almost on par in RT, but it seems AMD is repeating RV770 history.
Yeah, AMD are weird sometimes.


The 7900XTX has 16 CUs per Shader Engine for a total of 96 CUs. If they did 20 CUs per Shader Engine like the 6900XT, it would have been 120 CUs.

Clocking 120 CUs @ 3GHz would yield much better results for the top-end card. It would run at higher wattage and need a bigger cooler, similar to the 4090.
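A quick sketch of the compute side of that hypothetical, reusing the 2 FLOPs per shader per clock and 128 dual-issue-counted stream processors per CU from earlier in the thread:

```python
# Hypothetical config from the post above: 6 shader engines x 20 CUs = 120 CUs.
# Assumes 128 dual-issue-counted stream processors per CU and 2 FLOPs/clock.
def tflops(cus: int, clock_ghz: float, sps_per_cu: int = 128) -> float:
    return cus * sps_per_cu * 2 * clock_ghz / 1000

print(f"Actual 7900XTX, 96 CU @ 2.5 GHz: {tflops(96, 2.5):.1f} TFLOPS")   # ~61.4
print(f"Hypothetical 120 CU @ 3.0 GHz:   {tflops(120, 3.0):.1f} TFLOPS")  # ~92.2
```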
 
I'm one of the few on this forum that stated that setting expectations too high for RDNA3 was a mistake that would only lead to disappointment, at a time when people were hyping 2.5x performance increases for RT and other nonsense.
Don't try to put me in the group of AMD fanboys. I'm neither for AMD nor NVidia, nor Intel. I'm for my wallet. And I'll buy the best bang for buck.

Now, you are correct, RDNA3 vs Ada Lovelace is almost a repeat of the RDNA2 vs Ampere fight.
But that's why AMD has priced RDNA3 cards much lower than NVidia.
They are not priced much lower than Nvidia. I'm fairly certain the 4070 will outperform these cards at a lower price.
 
They are not priced much lower than Nvidia. I'm fairly certain the 4070 will outperform these cards at a lower price.

Comparing the provided benchmarks to the same titles on the 4090, it looks more like a situation where the 4080 is going to have a tough go of it in traditional rasterization. That 12GB 4080 would have been absolutely murdered. RT will still go Nvidia's way though.

Obviously, AMD could be really cherry-picking the results. We'll just have to wait and see how the 4080 and the 7900 series compare.
 

winjer

Member
They are not priced much lower than Nvidia. I'm fairly certain the 4070 will outperform these cards at a lower price.

From the data we have at this point, the old 4080 12GB would perform close to the 3090 Ti, both in rasterization and RT. If the 4070 is to be the old 4080 12GB, then that should be its performance.
But from the data we have for the 7900XTX, it should perform close to the 4090 in rasterization. And close to the 3090 Ti in RT.
 

Crayon

Member
Why? People have budgets. High-end graphics cards haven't been in my budget since the 8800 series, but back then prices were way lower. That being said, I had a set amount I was willing to spend. If I could get a lower or mid-grade high-end card to meet my budget number then I would go with that. So I could definitely see people in that position.
I could see the argument that if you are spending near a grand then you aren't hurting and money isn't a problem. I guess, but not everyone goes by that, especially when you have bills to pay.

I buy mid-range GPUs. I have a 3060 Ti. I could have gone for a 3070 for $100 more, but that wasn't in my budget (not counting power draw and annoying 3-slot cards).
I see nothing wrong with having different cards for different segments of the market.

What I don't like is the constant obsession with only showing the high end, while mid-tier and mainstream gaming cards get the shaft. People like myself in that market have to wait months, now usually years, for any cards.
I honestly don't know who has a grand to spend on graphics when inflation is what it is. I know I can't afford these or Nvidia prices.

Prices for all segments are way too high. I recall buying a high-end Voodoo5 for $399, an 8800 GTS for $399, and a 1060 for $250. Back then the high end usually stopped around $599.

The way the rollout is dragged out now makes it so last-generation stuff has to fill that segment in. 6600s, 6700s, and 6800s are available at good prices - lower prices than the 7000 equivalents are going to come out at. And they perform great. A 6600 XT is under $300 and performs like a 3060/Ti aside from ray tracing. A 7600 might be $100 more and run 50% better than today's 6700.

Also, cards are fast now. You could run that 6700 for a long time, only compromising on the trickle of next-gen games that has yet to pick up pace - and even then being able to play them with FSR2, since they will support it. It's not like it's outdated. It makes enough sense to imminently price-drop last-gen cards and let them usefully service the mid and low range while they clear out. Kinda annoying to wait tho...
 

DonkeyPunchJr

World’s Biggest Weeb
Bro 7900 XTX is not a gamer quality GPU. It's AMD's super high end GPU equivalent to a titan or 4090/3090. $1000 for that is pretty fair.
The 7800XT will likely be $300 cheaper minimum and will be the more reasonable gaming-class GPU.
Yup exactly. $1000 for their flagship “Titan” competitor is not bad. I’m more disappointed that neither they nor Nvidia have anything (yet) that’s priced for mainstream or even high-end. This must be the first GPU gen where they both launched with only the enthusiast tier.
 

Crayon

Member
The interview with the AMD guy was supposed to say something about AIBs overclocking, but I thought it was kind of ambiguous. The only hint was that he started perking up and getting into it when talking about it.
 

poppabk

Cheeks Spread for Digital Only Future
The way the rollout is dragged out now makes it so last-generation stuff has to fill that segment in. 6600s, 6700s, and 6800s are available at good prices - lower prices than the 7000 equivalents are going to come out at. And they perform great. A 6600 XT is under $300 and performs like a 3060/Ti aside from ray tracing. A 7600 might be $100 more and run 50% better than today's 6700.

Also, cards are fast now. You could run that 6700 for a long time, only compromising on the trickle of next-gen games that has yet to pick up pace - and even then being able to play them with FSR2, since they will support it. It's not like it's outdated. It makes enough sense to imminently price-drop last-gen cards and let them usefully service the mid and low range while they clear out. Kinda annoying to wait tho...
I picked up a 6900xt for under $600. Would like the extra power for FS2020 VR but just can't justify the price nor the wait for the lower end cards.
 

PaintTinJr

Member
Yeah, but that's why I said they should've looked into adding more cores or pushing more clocks. Why are they being conservative? Money? Just pass the costs on to the consumer like Nvidia did. Who told them to release their flagship card at $999? Nvidia has been releasing $1,200+ cards since 2018.

This is probably some kind of architectural issue. They are probably unable to get a linear performance increase out of more than 96 CUs.
My theory is that both AMD and the consoles - especially PlayStation - are all in on evolving software RT, and that when these cards are benchmarked with software Lumen, the fidelity gap between hardware Lumen and software Lumen will close significantly. It will still be a loss, but less so when using the 7900XTX's dedicated hardware RT for foreground Lumen too.

My hunch is that only with software RT (signed distance fields) can high-framerate RT games happen, because hardware RT is latency-heavy by comparison, effectively being more passes. So AMD's long-term bet is on software RT evolving, with more CUs, higher clocks and better TDP closing the visual delta. And because it is done in a single shader call (as Epic's UE5 team stated), AMD's higher IPC would make them the go-to card for high-resolution, high-framerate RT - good enough that it becomes a diminishing return to use the higher-latency dedicated hardware RT cores that Nvidia and Intel are betting on.
 

SlimySnake

Flashless at the Golden Globes
My theory is that both AMD and the consoles - especially PlayStation - are all in on evolving software RT, and that when these cards are benchmarked with software Lumen, the fidelity gap between hardware Lumen and software Lumen will close significantly. It will still be a loss, but less so when using the 7900XTX's dedicated hardware RT for foreground Lumen too.

My hunch is that only with software RT (signed distance fields) can high-framerate RT games happen, because hardware RT is latency-heavy by comparison, effectively being more passes. So AMD's long-term bet is on software RT evolving, with more CUs, higher clocks and better TDP closing the visual delta. And because it is done in a single shader call (as Epic's UE5 team stated), AMD's higher IPC would make them the go-to card for high-resolution, high-framerate RT - good enough that it becomes a diminishing return to use the higher-latency dedicated hardware RT cores that Nvidia and Intel are betting on.
The thing is that the 6800 XT performs the same as the 3080 in the Matrix demo. Epic's Lumen seems to be designed around the console implementation and doesn't give a boost to RTX cards like other ray tracing games do.
 

Loxus

Member
The thing is that the 6800 XT performs the same as the 3080 in the Matrix demo. Epic's Lumen seems to be designed around the console implementation and doesn't give a boost to RTX cards like other ray tracing games do.
Maybe it's a case where AMD RT is good but current game engines are made for Nvidia's RT implementation, making AMD's implementation perform poorly.

Maybe you're right and UE5 is much more optimized when running on AMD hardware because of the consoles.
 

HoofHearted

Member
Why? People have budgets. High-end graphics cards haven't been in my budget since the 8800 series, but back then prices were way lower. That being said, I had a set amount I was willing to spend. If I could get a lower or mid-grade high-end card to meet my budget number then I would go with that. So I could definitely see people in that position.
I could see the argument that if you are spending near a grand then you aren't hurting and money isn't a problem. I guess, but not everyone goes by that, especially when you have bills to pay.

I buy mid-range GPUs. I have a 3060 Ti. I could have gone for a 3070 for $100 more, but that wasn't in my budget (not counting power draw and annoying 3-slot cards).
I see nothing wrong with having different cards for different segments of the market.

What I don't like is the constant obsession with only showing the high end, while mid-tier and mainstream gaming cards get the shaft. People like myself in that market have to wait months, now usually years, for any cards.
I honestly don't know who has a grand to spend on graphics when inflation is what it is. I know I can't afford these or Nvidia prices.

Prices for all segments are way too high. I recall buying a high-end Voodoo5 for $399, an 8800 GTS for $399, and a 1060 for $250. Back then the high end usually stopped around $599.
Apologies if it wasn't clear before - my previous comment was purely about the pricing of these two specific cards.

IMHO - *all* cards are priced too high for the market ... NVIDIA in particular has lost their minds in this respect.

I'm *guessing* (and hopeful) that we may soon see a trend back to reasonable price points for GPUs - but time will tell. That's what makes this upcoming release interesting.

Given the choice between the 7900 XT/XTX and the 4080/4090 - unless NVIDIA reduces their prices, and provided the benchmarks show these new AMD cards come close to the 4090 - I (and I expect many others here) will most likely switch to AMD this gen due to the price point.

Unfortunately - it seems people are willing to pay a premium price for GPUs.

That being said - my point about these particular cards is that, for the price difference, you might as well pay the extra $100 for the higher-end card. I'd rather see the 7900 XT at $150-200 less than the XTX. But generally speaking - if I'm already making the decision to invest $900 in a card - then it's not really that much of a reach to invest another $100 for the higher-end card, especially if it yields >10% more performance for the price.

In your example above you're referencing mid-tier cards, so the comparison is quite different. A 3060 Ti is around the same performance as a 3070, and at the lower price point it certainly makes sense to save $100 in that regard (I'd do the same).
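To put that $100 gap in numbers, here's a small break-even sketch using the launch MSRPs from the thread title:

```python
# Break-even sketch for the XT vs XTX decision at launch MSRPs.
xt_price, xtx_price = 899, 999

premium = xtx_price / xt_price - 1
print(f"The XTX costs {premium:.1%} more than the XT")  # 11.1%
# If the XTX ends up more than ~11% faster, it wins on performance-per-dollar
# as well as absolute performance; if not, the XT is the better value.
```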
 
Is it a guess that n33 comes before n32 or is that a thing I missed? Because that would be great. 6600xt is a good card and I'd love to see what a 7600xt can do.
N33 is coming Q1 next year, because it's also AMD's big push into mobile GPUs. Laptops always get refreshed at CES, and they need a die ready for that.
 