
Radeon RX 7700 & RX 7600 Graphics Cards At Computex 2023

winjer

Member

What I can confirm, however, is that individual (!) AMD board partners will show a Radeon RX 7600 as a finished product at Computex, while other partners, who use both AMD and NVIDIA chips, are holding back in wait-and-see mode, to put it politely. The statement that they don't want to produce anything just to keep production lines busy where they see no chance of profit does sting a bit. There is currently no viable price basis for a Radeon RX 7700XT, because the card could even slip into the red: the target group is shifting and the price would no longer fit.

The Navi 33 GCD is expected to feature 2 Shader Engines, each with 2 Shader Arrays (2 per SE / 4 in total). This works out to 16 WGPs, or 32 Compute Units, for a total of 2048 cores, the same core count as the Navi 23 GPU.
  • AMD Navi 33: 2048 Cores, 128-bit Bus, 32 MB Infinity Cache, 204mm2 GPU Die @6nm
  • AMD Navi 23: 2048 Cores, 128-bit Bus, 32 MB Infinity Cache, 237mm2 GPU Die @7nm
The GPU will come packaged with 32 MB of Infinity Cache, the same amount as the Navi 23 GPU, across a 128-bit bus. First introduced on laptops as the Radeon RX 7700 & RX 7600 series, the Navi 33 GPUs will be aiming at the budget segment, with prices in the $250-$350 US range.
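The shader math in the rumor above can be written out explicitly. A quick sketch; the per-WGP and per-CU constants are standard RDNA figures assumed here, not stated in the post:

```python
# Rough sketch of the Navi 33 shader math from the paragraph above.
shader_engines = 2
shader_arrays_per_engine = 2   # 4 shader arrays in total
wgps_per_array = 4             # implied by the 16-WGP total
cus_per_wgp = 2                # RDNA "dual compute unit" pairing (assumed)
cores_per_cu = 64              # stream processors per CU (assumed)

wgps = shader_engines * shader_arrays_per_engine * wgps_per_array
cus = wgps * cus_per_wgp
cores = cus * cores_per_cu
print(wgps, cus, cores)  # 16 32 2048
```

Which lines up with the 2048-core figure claimed for both Navi 33 and Navi 23.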

Where are the 7800 cards? Igor's Lab suggests they might not be profitable in the current market.
One thing has also become clear in the last few months: new graphics cards are no longer sold only by performance class, but primarily by price class. And when the broad middle class breaks away, as described above, only target groups in the high end and in the area between entry level and lower middle class remain. In order to sell cards like the rumored Radeon RX 7800XT profitably, you would have to move into GeForce RTX 4070 territory, which buyers have just proven notably resistant to. However, AMD cannot (and should not) become cheaper, because that would reduce its own profit extremely and virtually starve the board partners.

Computex: 30 May. - 02 Jun. 2023
 

winjer

Member
In other words, 7800XT is the 7900XT..

No. This is how the stack should look:

Possible AMD Radeon RX 7000 Series

| Card         | GPU         | Compute Units | Boost Clock | Memory    | TBP  | Launch Date   |
| RX 7900 XTX  | Navi 31 XTX | 96 CUs        | 2.5 GHz     | 24GB 384b | 355W | December 2022 |
| RX 7900 XT   | Navi 31 XT  | 84 CUs        | 2.4 GHz     | 20GB 320b | 315W | December 2022 |
| RX 7800 (XT) | Navi 31 (?) | ~70 CUs       | ?           | 16GB 256b | ?    | ?             |
| RX 7700 (XT) | Navi 32 (?) | ~64 CUs       | ?           | ?         | ?    | ?             |
| RX 7600 (XT) | Navi 33 (?) | ~32 CUs       | ?           | ?         | ?    | June 2023     |

 

hinch7

Member
Would suck if N32 (7800 XT) is cancelled. But if it lands around 6800 XT to 6900 XT (i.e. 4070) performance at around the same price as a 4070, then it's doomed to fail. Which only leaves N33 on 6nm and AD106/7 on 4N :/

What a crap generation for mid range and lower end buyers. Every release is clearly overpriced and/or underperforming for the cost and tier.

Heck, even the 7900XT and 7900XTX underperformed relative to AMD's own press-release claims, from launch announcement to actual release.
 

Buggy Loop

Member
A lot of talk about margins and profit but

Nvidia can't have cornered AMD's lineup so precisely, and priced their cards so high, that AMD can't go cheaper. There's easy room for $100 less across those ranges.
 

SmokedMeat

Gamer™
Realistically where would a 7800XT fall, when a 6950XT can be had for $650?

It’s a good idea to focus on the midrange and low midrange, though.
 

PeteBull

Member
Realistically where would a 7800XT fall, when a 6950XT can be had for $650?

It’s a good idea to focus on the midrange and low midrange, though.
16 gigs of VRAM on a 7800XT for $500-550 and around 200W TDP would be an amazing deal and a very strong competitor vs both the RTX 4070 and the upcoming RTX 4060/Ti
 

SmokedMeat

Gamer™
16 gigs of VRAM on a 7800XT for $500-550 and around 200W TDP would be an amazing deal and a very strong competitor vs both the RTX 4070 and the upcoming RTX 4060/Ti

I would expect that $500 price for the 7700XT. The 6800XT recently dropped to $550, so I wouldn’t expect a brand new 7800XT to release that low.
 

hinch7

Member
16 gigs of VRAM on a 7800XT for $500-550 and around 200W TDP would be an amazing deal and a very strong competitor vs both the RTX 4070 and the upcoming RTX 4060/Ti
Doubt it. Both N31 SKUs couldn't come close to the efficiency of AD103 and AD104, and by that metric N32 won't touch AD104's efficiency at the same performance level. At which point most people would spend the little extra for the name and get a 4070: better software stack, RT, DLSS 3, etc. Except at that price point, not many want a 4060-tier card disguised as a 4070. And buying a mid-sized jump from a 6800XT at similar cost doesn't make sense.

Makes me think that going MCM was the major downside for RDNA 3's top SKUs. Lots of efficiency lost, and clock speeds lowered to compensate. Plus bugs, etc. Had they stuck with a larger monolithic die clocked past 3GHz, Navi 31/32 might have stood a chance against the RTX 4000 series.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
128 bit bus?

That shit had better be 16GB of VRAM; then Nvidia will really be shitting themselves (not really)... cuz if they're even dreaming of putting 8GB on the 7700 and 7600, they might as well just bin them now.

If they are 8GB......Intel save us.
 

PeteBull

Member
Doubt it. Both N31 SKUs couldn't come close to the efficiency of AD103 and AD104, and by that metric N32 won't touch AD104's efficiency at the same performance level. At which point most people would spend the little extra for the name and get a 4070: better software stack, RT, DLSS 3, etc. Except at that price point, not many want a 4060-tier card disguised as a 4070. And buying a mid-sized jump from a 6800XT at similar cost doesn't make sense.

Makes me think that going MCM was the major downside for RDNA 3's top SKUs. Lots of efficiency lost, and clock speeds lowered to compensate. Plus bugs, etc. Had they stuck with a larger monolithic die clocked past 3GHz, Navi 31/32 might have stood a chance against the RTX 4000 series.
I agree here, just if the leaks are confirmed and AMD launches low-end cards at $250-300 MSRP, there's a huge price gap from there to the $800 MSRP 7900XT that can't be filled with RDNA2 cards alone; the 6950XT especially isn't a great purchase for midrange buyers, coz it's huge and has a nasty TDP.

For ppl who pay their own electricity bill and don't have room for a big case/tons of cooling, that 320W TDP is really a killer. 200-220W TDP, like the RTX 4070 has, is the real sweet spot that works great even if you've got a cheap midrange case and only 2 or 3 case fans. Let's not go into undervolting here; even many enthusiasts don't do it, not to mention the casual midrange GPU buyer.

Let's wait and see how it turns out. It's up to AMD whether they really wanna fight for that market share, or don't give a damn and profit margins are top priority there =/
 

bbeach123

Member
AMD's MSRPs are beyond stupid anyway. Not worth waiting for/buying at launch.

Maybe half a year later, after they've dropped $100-200.
 

PeteBull

Member
AMD's MSRPs are beyond stupid anyway. Not worth waiting for/buying at launch.

Maybe half a year later, after they've dropped $100-200.
They just wanna milk their loyalists to the max, hence the $900 MSRP launch for the 7900XT. Then, after a few months of terrible sales, they saw it and made the right decision that a price drop was needed; the $800 MSRP seems more reasonable, altho ofc still far from a super good deal.

For ppl playing at 1440p, 20 gigs of VRAM vs only 12 gigs on the 4070Ti already makes it the better buy; it's an additional 8 gigs after all, especially important if you wanna keep your card not for 1-2 years but at least 4-5 years after launch.

https://www.techpowerup.com/gpu-specs/geforce-rtx-4070-ti.c3950
+10% performance vs the 4070Ti is nice too, ofc, but won't matter that much, especially if we include DLSS 2/3 support for the Nvidia product.

Not talking about RT, coz at that level RT performance doesn't matter, and I know what I'm talking about as an owner of a 3080Ti: you check out how a game's RT features look, but in the end you always prefer more stable fps/higher resolution.
 

hinch7

Member
I agree here, just if the leaks are confirmed and AMD launches low-end cards at $250-300 MSRP, there's a huge price gap from there to the $800 MSRP 7900XT that can't be filled with RDNA2 cards alone; the 6950XT especially isn't a great purchase for midrange buyers, coz it's huge and has a nasty TDP.

For ppl who pay their own electricity bill and don't have room for a big case/tons of cooling, that 320W TDP is really a killer. 200-220W TDP, like the RTX 4070 has, is the real sweet spot that works great even if you've got a cheap midrange case and only 2 or 3 case fans. Let's not go into undervolting here; even many enthusiasts don't do it, not to mention the casual midrange GPU buyer.

Let's wait and see how it turns out. It's up to AMD whether they really wanna fight for that market share, or don't give a damn and profit margins are top priority there =/
Yeah, if it wasn't for cost, Nvidia would have completely outdone AMD in just about every metric this generation. Reviewers can say AMD is much better value, but that fails to account for the economic situation in EU countries and the UK, where energy prices are sky high, as is the cost of living. If only the 4000 series cards were priced reasonably, those are the cards the masses should be going for. But alas, most of the stack is overpriced by at least $200 or more, which goes for the 7000 series cards too.

In any case, the 6nm Navi 33 cards aren't going to move the needle in any way, shape or form. Hopefully they'll price these cheap, otherwise it's going to be a load of nothing-burger releases, again. Which is kinda expected now lol.

Hopefully AMD can give us something decent this generation with the 7800 XT, if it's still coming, that is.
 

PeteBull

Member
I'm a bit confused here tho: I see ppl bashing the 4070/Ti left and right for the price and price/perf, but on the other hand we aren't expecting/hoping AMD launches something with much better offerings.

It's gotta be either/or: Nvidia is milking the midrange hard and AMD can swoop in with a much better offering and grab tons of market share while making serious bucks, or margins are really bad and that's the reason AMD doesn't wanna/can't compete currently. No 3rd option imo.
 

winjer

Member
Battlemage can't come soon enough. ppl here keep acting like Intel doesn't exist even though they clearly do

Arc GPUs are on the store shelves and few people are buying them.
It got so bad, some stores in Japan are now bundling an Arc A750 with the purchase of an RTX 4090.
 

Buggy Loop

Member
Doubt it. Both N31 SKU couldn't come close to the efficiency to AD103 and AD104. N32 won't touch AD104 for efficiency at the same performance level by that metric. At which point most people would spend the little extra for the name and get a 4070. Better software stack; RT, DLSS 3 etc. Except at that price point, not many want a 4060 tier card disguised as a 4070. And buying a mid jump again from a 6800XT at similar costs doesn't make sense.

Makes me think that going MCM was the major downside for RNDA 3 top SKU's. Losts of efficiency lost and clock speeds lowered to compensate. Plus bugs etc. Should've stuck with a larger monolithic die and clocked past 3ghz and Navi 31/32 might have stood a chance against RTX 4000 series.

The AMD mid range will be monolithic

Yeah it’s a weird gen..
 

64bitmodels

Reverse groomer.
Arc GPUs are on the store shelves and few people are buying them.
It got so bad, some stores in Japan are now bundling an Arc A750 with the purchase of an RTX 4090.
Hopefully Battlemage offers amazing price-to-performance and even better RT and upscaling support, so that can change quickly. Nvidia quality at AMD prices seems to be what they're attempting here, and it seems like a steal.
 

RagnarokIV

Battlebus imprisoning me \m/ >.< \m/
As someone who hasn't looked at AMD since the HD4850, can someone give a quick TL;DR on their naming scheme and pricing? All I know about modern AMD GPUs is that if you wait a year, you're getting at least $100 slashed off the price.
 

64bitmodels

Reverse groomer.
can someone give a quick TL;DR on their naming scheme and pricing?
Virtually the same as Nvidia, except the tier number is the second digit rather than the third. So RX 6600 = Nvidia 60 series, RX 6700 = Nvidia 70 series, so on so forth.

AMD cards are usually weaker though, so you have to compare down a tier. Like compare the 6800 to the 3070(Ti), the 6700 to the 3060(Ti), so on so forth. A direct comparison between the 6600XT and 3060Ti will always end in the 3060Ti winning in raster and RT.

as for pricing, this guy has a great video on current gpus

 

PeteBull

Member
As someone who hasn't looked at AMD since the HD4850, can someone give a quick TL;DR on their naming scheme and pricing? All I know about modern AMD GPUs is that if you wait a year, you're getting at least $100 slashed off the price.
As always, don't believe the MSRPs and graphs given by those companies/their marketing/PR (basically whatever they say); wait for independent benchmarks and compare it all to the street price in your area (or the online price if you're ordering the GPU online).
For more info on a particular GPU, just use this; here you've got basically every GPU launched going back many years, with all the info needed and average performance compared vs other GPUs (ofc game to game it can vary, and by quite a lot; we're talking averages here): https://www.techpowerup.com/gpu-specs/
 

Nvzman

Member
This is all up to Battlemage really to un-fuck the GPU market. As someone who owns an A770 16gb, Arc is actually genuinely good value now, the drivers are in a much better place and I have high hopes for second gen. If Intel has extremely aggressive pricing ($500-$600 4080 competitor, as its rumored their flagship will meet 4080-ish specs), and drivers consistently improve like they have been, I can see that sucking away a shitload of market share from both AMD and Nvidia, which would be the best for us, as it would force competition.
 

Spyxos

Member
Battlemage can't come soon enough. ppl here keep acting like Intel doesn't exist even though they clearly do
An Intel card would make me very nervous. There were rumors that they want to discontinue graphics cards completely. Yes I know it's no longer true, but that would always be in the back of my mind.
 

hinch7

Member
An Intel card would make me very nervous. There were rumors that they want to discontinue graphics cards completely. Yes I know it's no longer true, but that would always be in the back of my mind.
No need to be nervous. Though if you're buying legacy games I'd be wary, as support isn't anywhere near as good as AMD's or Nvidia's. Intel does have a roadmap for Arc: they'll continue from Battlemage in Q1/Q2 2024 to Celestial afterwards (estimates being 2026). Battlemage will be high end; Celestial will be an enthusiast-level SKU targeting flagship performance along the lines of the top-end GPUs from Nvidia and AMD.
[Image: Intel Arc roadmap slide (Battlemage, Celestial)]

Pic from https://videocardz.com/newz/intel-arc-celestial-gpus-to-target-ultra-enthusiast-gpu-market-in-2024

Battlemage is rumored to have twice the cores (64 Xe cores vs 32 on the A770), 3x the L2 cache (48MB vs 16MB), 3GHz clocks, and TSMC's 4nm. That L2 cache and the architectural improvements should make the GPU fly in RT. The source for that rumor is RedGamingTech though, so lol, take it with a megaton of salt. It does come in very late to the party though, as it's supposedly arriving early/mid 2024.
 

FireFly

Member
I'm a bit confused here tho: I see ppl bashing the 4070/Ti left and right for the price and price/perf, but on the other hand we aren't expecting/hoping AMD launches something with much better offerings.

It's gotta be either/or: Nvidia is milking the midrange hard and AMD can swoop in with a much better offering and grab tons of market share while making serious bucks, or margins are really bad and that's the reason AMD doesn't wanna/can't compete currently. No 3rd option imo.
It's not just a question of margins but of RDNA 3 not delivering good results for the transistors invested. IPC is barely any better at all and clock speed improvements are very modest. So you end up with a part that has 2.15X more transistors but is only 37% faster. The 7900 XTX should have been a full performance tier above the 4080 and that difference should have continued throughout the stack.
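The transistors-vs-performance point above can be made concrete with a quick back-of-the-envelope calculation. The 2.15x and 37% figures are the poster's estimates, not official specs:

```python
# Performance-per-transistor implied by the figures in the post above.
# Both inputs are the poster's rough estimates, not verified specs.
transistor_ratio = 2.15   # ~2.15x more transistors than the predecessor
performance_ratio = 1.37  # "only 37% faster"

perf_per_transistor = performance_ratio / transistor_ratio
print(f"relative perf per transistor: {perf_per_transistor:.2f}")  # ~0.64
```

In other words, by these numbers each transistor delivers roughly a third less performance than in the previous generation, which is the sense in which RDNA 3 "doesn't deliver good results for the transistors invested."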
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
That bodes well, because the 6700XT was 12GB and the 6700 was 10GB.

On a side note, I welcome the xt and xtx monikers if they get rid of the "non-xt" naming.
Ohh jeez, I forgot AMD sometimes cuts the bus width on the same chip: the Navi 22 in the 6700 was on a 160-bit bus, while the Navi 22 in the 6700XT was 192-bit.

Okay then, damn, they might cut the bus width and we end up with a 12GB model.
 

tusharngf

Member
"However, AMD cannot (and should not) become cheaper, because that would reduce its own profit extremely and virtually starve the board partners."


[reaction GIF]
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
7700 = Navi32
Pretty sure that's for the 7800XT they are scared of releasing.

Navi 31 - 7900s
Navi 32 - 7800s
Navi 33 - 77/76/75?

Nonetheless, see my post above: AMD could use smaller bus widths on Navi 33 to get to 12GB if they needed to.
Just, if the max is 128-bit, that's likely 16GB for the 7700, and the 7600 on a 96-bit bus could be 12GB.
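The bus-width guesswork in these posts follows from GDDR6 putting one memory chip on each 32-bit channel. A rough sketch with a hypothetical helper, assuming 2 GB (16 Gbit) chips and optional "clamshell" mode (two chips per channel, doubling capacity):

```python
def vram_options_gb(bus_width_bits: int, chip_gb: int = 2) -> tuple[int, int]:
    """Possible GDDR6 VRAM capacities for a given bus width.

    Each GDDR6 chip occupies a 32-bit channel; clamshell mode puts two
    chips on one channel, doubling capacity at the same bus width.
    Assumes 2 GB (16 Gbit) chips, the common density at the time.
    """
    chips = bus_width_bits // 32
    return (chips * chip_gb, chips * chip_gb * 2)  # (normal, clamshell)

print(vram_options_gb(128))  # (8, 16) -> 16GB on 128-bit needs clamshell
print(vram_options_gb(96))   # (6, 12)
print(vram_options_gb(192))  # (12, 24)
```

So a 128-bit card is naturally 8GB, and the hoped-for 16GB would require clamshell (or denser chips); likewise 96-bit gives 6GB normally and 12GB in clamshell.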
 

GreatnessRD

Member
7800/XT won't come around until the 6950 XT is gone. I think it's smart of AMD to focus on the 7700 and 7600, since they kinda backed themselves into a corner with the 7900 XT, lol. Just hope they can learn from the pricing mistake of the 7900 XT and offer a solid deal with decent VRAM for both cards. I ALMOST want to hold out on buying the 6700 10GB to see what the 7700 looks like, but I've put this HTPC build off long enough, so I think I'm going to bite the bullet.
 

SmokedMeat

Gamer™
Battlemage can't come soon enough. ppl here keep acting like Intel doesn't exist even though they clearly do

They had a very lackluster launch that landed like a lead balloon. It's been the post-launch driver gains that have been making them more attractive for the lower midrange/entry level.

I don’t think anyone’s forgotten they’re in the GPU space. We’re just waiting for stronger cards.
 

64bitmodels

Reverse groomer.
This is an unmitigated disaster. AMD can't even compete with RTX 4070 with an RDNA3 GPU.

Seems like RTX 4070 was a great deal after all :messenger_smiling_with_eyes:
Just cuz the other guys are doing badly (supposedly; the cards aren't even out yet) doesn't mean Nvidia's offerings are good either.


Whatever satisfies your AMD hate boner, though. Do you write for UserBenchmark by any chance?
 

SmokedMeat

Gamer™
This is all up to Battlemage really to un-fuck the GPU market. As someone who owns an A770 16gb, Arc is actually genuinely good value now, the drivers are in a much better place and I have high hopes for second gen. If Intel has extremely aggressive pricing ($500-$600 4080 competitor, as its rumored their flagship will meet 4080-ish specs), and drivers consistently improve like they have been, I can see that sucking away a shitload of market share from both AMD and Nvidia, which would be the best for us, as it would force competition.

You’re setting yourself up for disappointment if you want Intel to deliver the same level of performance as Nvidia’s $1200 GPU, for $500-$600.
 

Nvzman

Member
You’re setting yourself up for disappointment if you want Intel to deliver the same level of performance as Nvidia’s $1200 GPU, for $500-$600.
I mean, the A770 16GB already delivers 3070 performance for nearly $200 less in quite a few new games. It's not unrealistic.
Additionally, Battlemage is supposed to be on 4nm, just like Ada.
 

Nvzman

Member
a 200 dollar gap makes sense
a 600-700 dollar gap is being ridiculous
Well, Nvidia's pricing is fucking ridiculous to begin with, lol. It's extremely obviously inflated pricing that's not representative of costs at all. The 7900XTX being nearly $250 cheaper and still overpriced is evidence of this. It's very clearly possible; even then it's just semantics, as even $700 would be great value compared to the absolute bullshit market we have now.
 

SmokedMeat

Gamer™
I mean, the A770 16GB already delivers 3070 performance for nearly $200 less in quite a few new games. It's not unrealistic.
Additionally, Battlemage is supposed to be on 4nm, just like Ada.

In which games? In the benchmarks I've seen, the A770 isn't anywhere near 3070 level.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
Meanwhile they’re giving away $100 Steam cards to get people to buy a 4070, because nobody wants to buy a rebranded 4060ti for $600.

But sure, unmitigated disaster for AMD 😂
At least Nvidia tried. They gave the market a good option at $599.

AMD's product stack is weird.

7900 XTX $1000 MSRP
7900 XT $900 MSRP
-
-
-
-
7600 XT $300? (probably going to be 8 GB too, an amount some here scoff at)

Where is the midrange?
 

Kataploom

Gold Member
128 bit bus?

That shit had better be 16GB of VRAM; then Nvidia will really be shitting themselves (not really)... cuz if they're even dreaming of putting 8GB on the 7700 and 7600, they might as well just bin them now.

If they are 8GB......Intel save us.
Look, I don't think it's worth upgrading to a 7700 XT if I just got a 6700 XT last holidays... But if it had 16GB, which I think it won't, and it's on par with a 6800 XT, damn... why not?
 