
RTX 4070 Review Thread

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Roughly matching the 6950 XT in raster.


But 7800 XT would lose to 4070 in RT and 1440p upscaling and efficiency, making the 4070 better than 7800 XT to me in pretty much every way.

I need more RT performance more than I need 10% more raster.
The 6950XT matches the 4070 in RT... at least in Control, which I consider the baseline; that path tracing in Cyberpunk is next-gen shit.
If we assume the advancements made in RT trickle down to the 7800XT, then it should outshine the 4070 in RT too.
So I'm not really sure what advantages the 4070 would have over it beyond CUDA/OptiX.


Leonidas

AMD's Dogma: ARyzen (No Intel inside)
Ordered, suck it hiveminders.

12GB is the new 8GB. Barely scraping by, and will be forced to drop textures in another year or two.
My old 1070 traded blows with a 980ti. That was back when Nvidia delivered value.
The 4070 barely beats a 3080, and loses on average to AMD’s last gen 6800XT.
8 GB was never a problem for me on the 3070 in the ~two years I had it
Your old 1070 is just old at this point; yes, it beat the 980 Ti, congrats, that's very old news.
I'm aware of the 4070's performance; I read the reviews. It's the fastest raster GPU at 200 watts, and the fastest RT GPU at 200 watts.

This is why Nvidia will continue to dominate. People are too stubborn to open their eyes to even consider alternatives. Jensen has you by the balls.
I'd rather form my own opinions than follow the hivemind "waaah GPU prices too high I'm going to stick with my 8 year old GPU waaaaah"

Saying AMD loses out in RT is amusing. Like it’s some enormous loss, when in reality it’s like 3090ti level. FSR difference isn’t even noticeable during gameplay.
At 1440p it is noticeable; FSR2 at 1440p is abysmal. Good luck running Cyberpunk path tracing on an AMD card.

I’ll gladly take two very minor losses in upscaling tech, over having to turn textures down to medium - and having worse rasterization. I want a card that delivers power without the need for the upscaling crutch.
I never had to turn down settings on my 3070 in nearly two years, and now that card is old news; I'll put it up for sale as soon as the 4070 I just ordered arrives.

Here's hoping AMD gets FSR2 close to DLSS2 at 1440p in the future so that maybe I would consider them.

But that’s just me. I’m loyal to neither company, but I will absolutely go to bat for whomever I feel is doing things better. I will call out who I feel isn’t. Realistically I’m hoping Intel wrecks both in the future, and lights a fire under their asses.
In the meantime 4070 is mediocrity for $600. It’s missionary position in GPU form. No way would I buy that card in 2023.
Me neither, believe it or not. I've used Intel, AMD and Nvidia in my rigs multiple times over the past 10 years.

I just buy the best GPU for the job and today that happens to be the RTX 4070.

The 6950XT matches the 4070 in RT...
Even if the 7800 XT does end up matching the 4070 in RT, it wouldn't be enough, because FSR2 sucks at 1440p (my resolution) whereas DLSS2 looks good at 1440p. You have to turn on upscaling to get a good experience in RT games, and I refuse to use FSR2 at 1440p.

The 7800 XT will get destroyed in Cyberpunk path tracing.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Even if the 7800 XT does end up matching the 4070 in RT, it wouldn't be enough, because FSR2 sucks at 1440p (my resolution) whereas DLSS2 looks good at 1440p. You have to turn on upscaling to get a good experience in RT games, and I refuse to use FSR2 at 1440p.
So if it's all about DLSS vs FSR, then AMD was never an option at all... unless FSR3 really does improve a lot.
You could have stopped right there instead of roundabouting into RT performance (which the 7800XT will surely win), raster (which it will win again), or anything else.

I don't put down AMD cards because they lack CUDA/OptiX; I know that's Nvidia tech. But looking at the card objectively, not subjectively from my perspective, it's hard to ignore they are right up there, and assuming the 7800XT can beat the 6950XT, then it's a better card than the RTX4070.

Arguing which upscaling tech is better is neither here nor there when looking at these cards empirically.
But I can't go AMD because I need CUDA and OptiX for work... so unless AMD's HIP improves a lot, AMD will never be an option for me... but if I was building a purely gaming rig, I for sure would be looking at AMD.
 

CrustyBritches

Gold Member
Leonidas Congrats on the new card, but is a 4070 really a suitable upgrade from a 3070? I once side-graded from an R9 390 to an RX 480, and after selling the 390 I only paid ~$30, and the thermal and power savings were definitely worth it. Just wondering what your thought process is on that.

I haven't pulled the trigger yet on the 4070, but it's probably the card I'll upgrade to. One of my kids needed an upgrade, so I passed down my desktop with the exception of the case and PSU. At the moment I'm on a 3060 laptop, so I'd require a full rebuild except for case and PSU. I'm using a 750W Fatal1ty PSU that has been rock solid for me; I'm just not sure if it can handle the transient spikes of ~600W from cards like the 6950XT, which has me leaning towards the 4070, which looks like it only spikes to ~230W.

IIRC DLSS 3 is able to push through CPU-bound scenarios in shitty ports, so that appeals to me along with the quality of DLSS 2.x.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
So if it's all about DLSS vs FSR, then AMD was never an option at all... unless FSR3 really does improve a lot.
You could have stopped right there instead of roundabouting into RT performance (which the 7800XT will surely win), raster (which it will win again), or anything else.
Yes, FSR2 at 1440p is nearly a deal breaker for me. The only thing that could save the 7800 XT in my eyes is the price... it would need to be cheaper for me to even consider it, and it would have had to be out already...

I don't put down AMD cards because they lack CUDA/OptiX; I know that's Nvidia tech. But looking at the card objectively, not subjectively from my perspective, it's hard to ignore they are right up there, and assuming the 7800XT can beat the 6950XT, then it's a better card than the RTX4070.
Better in raster, sure, but in RT you pretty much have to turn on upscaling, and I've used FSR2 at 1440p; it's not good at that resolution.
If you don't care about RT, AMD has been better for quite some time, nothing really changed there... but I do care about RT... and power consumption is also important to me.

Leonidas Congrats on the new card, but is a 4070 really a suitable upgrade from a 3070? I once side-graded from an R9 390 to an RX 480, and after selling the 390 I only paid ~$30, and the thermal and power savings were definitely worth it. Just wondering what your thought process is on that.

I haven't pulled the trigger yet on the 4070, but it's probably the card I'll upgrade to. One of my kids needed an upgrade, so I passed down my desktop with the exception of the case and PSU. At the moment I'm on a 3060 laptop, so I'd require a full rebuild except for case and PSU. I'm using a 750W Fatal1ty PSU that has been rock solid for me; I'm just not sure if it can handle the transient spikes of ~600W from cards like the 6950XT, which has me leaning towards the 4070, which looks like it only spikes to ~230W.

IIRC DLSS 3 is able to push through CPU-bound scenarios in shitty ports, so that appeals to me along with the quality of DLSS 2.x.
I upgrade to some card every 1-2 years, so maybe I'm not the best person to ask, but these are my reasons for upgrading:

+25% performance
+I'm gaming at 1440p high refresh, a great resolution for the 4070
+slightly less power
+DLSS3
+4 GB more VRAM

All that's probably going to end up costing me an extra $250 after I sell my 3070; for me that's worth it.

If I kept the 3070, I might have run into 8 GB VRAM issues in the near future, something I don't want to deal with.
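To put rough numbers on that upgrade math, here's a quick back-of-the-envelope sketch in Python; the $350 resale price is my assumption, only the $600 price and the ~25% uplift come from the post above:

```python
# Back-of-the-envelope upgrade economics (sketch; resale price is assumed).
msrp_4070 = 600        # RTX 4070 MSRP in USD
resale_3070 = 350      # assumed used-market price for the old 3070
perf_gain_pct = 25     # ~25% performance uplift over the 3070

net_cost = msrp_4070 - resale_3070       # ~$250 out of pocket, as above
cost_per_pct = net_cost / perf_gain_pct  # dollars per % of extra performance

print(f"Net upgrade cost: ${net_cost}")
print(f"Cost per % of performance gained: ${cost_per_pct:.0f}")
```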
 
He ain't wrong. It's not a killer deal like the 1070 was, but it's a better deal than the 3070. Blame rampant money printing from our governments and TSMC's greed.

This is just the 4070 and 4090, of course. The rest of their jank shit can fuck right the fuck off.
 

Marlenus

Member
Dude, you must be joking. In 2012 I bought a GTX680 2GB, and two years later this GPU was extremely VRAM limited. I was forced to lower texture settings in pretty much every new game. In some games, like COD Ghosts, I had PS2-like textures no matter which texture settings I used.



People learn from their mistakes, so in 2016 I bought a GTX1080 with an insane amount of VRAM (back then 8GB was a lot). On the GTX1080 all textures in COD Ghosts load correctly and the game uses up to 4GB of VRAM, so no wonder I had problems on my 2GB GTX680.

The base GTX780 3GB model also had insufficient VRAM, but people who bought the 6GB model were probably set for the whole generation.

Nvidia can keep selling the 4070 12GB because for now 12GB is still good enough, but they should also offer a 16GB model for people like me who don't want to replace GPUs every 2 years. My old GTX1080 can still run games only because it had more than enough VRAM. The RTX4080 has plenty of VRAM and I'm sure it can last the whole generation, but it's just too expensive.

Yes, NV have a habit of only giving enough for right now. That means if you want a card to last, buy the top one that comes out after the consoles launch, or wait until the cross-gen period ends and grab the more mid-range card they release then.

I fully expect the 5060 to be 12GB and offer very good performance; that will be a good card for a while and will probably start to lose steam after the next-gen consoles launch. The 5070 will probably have 16GB, and that will also last until the next-gen cross-gen period ends.
 
The 7800XT is gonna be trading blows with the currently ~$600 6950XT.
I'm guessing part of the reason for the heavy price cut on the 6950XT is so its stock disappears and the 7800XT can take its spot, easy work.

It will have to be sub-$700 though to even make a mark.
With the new, better RT performance it'll not only be a true competitor for the RTX4070; realistically it should outdo the RTX4070 in every single way,
CUDA/OptiX notwithstanding.




RTX3080s were such a good deal either way.
I upgraded to the RTX3080 because the mining craze made people go crazy and they were swapping 3070 FHRs for 3080 LHRs... which, ironically, shortly after I did my swap, LHR was basically defeated completely.

That's what I suspect as well.

The 7800 and 7700 are AMD's best chances, it seems. Especially the 7700, as it looks like Nvidia has no answer for a budget 1440p card in 2023, since everything they have left is 8GB or less (which isn't going to go well at 1440p as more and more current-gen-only games start to release). AMD should really try to price that 7700 XT as aggressively as they possibly can.
 

StereoVsn

Gold Member
AMD cards aren't a great price at release, but once they've had some cuts is when they start looking good. Those 16GB 6800s going for under $500 are looking good.
6950s can be had (in the US) for not much over $600 as well. Also a pretty darn good deal these days.
 
Yes, NV have a habit of only giving enough for right now. That means if you want a card to last, buy the top one that comes out after the consoles launch, or wait until the cross-gen period ends and grab the more mid-range card they release then.

I fully expect the 5060 to be 12GB and offer very good performance; that will be a good card for a while and will probably start to lose steam after the next-gen consoles launch. The 5070 will probably have 16GB, and that will also last until the next-gen cross-gen period ends.
No, I don't want to wait two years. My GTX1080 started really showing its age in some of the latest games, and not even settings tweaks can help when the GPU is VRAM limited. TLOU1 looks like a joke on 8GB GPUs, and there's no way I'm playing this game with such low-quality textures (not even the original TLOU1 has textures as blurry as the remake on medium).

If the RTX4070 cannot be equipped with more than 12GB of VRAM because of its 192-bit bus, then I will just save some more money and go for the RTX4080, because I want to be happy with my purchase (like I was with the GTX1080) instead of regretting it (GTX680). I'm sure the RTX4080 will run all PS5 ports and UE5 games with ease even 5 years from now, thanks to its insane GPU power and 16GB of VRAM. If you really think about it, $1,200 is not that much if you plan to use the GPU for the next 5 years. It's only about $20 a month and you're set for the whole generation.

The 4070 is a good purchase as well, but only for people who are willing to change their GPUs frequently.
 

Crayon

Member
Yeah idk.
6950s can be had (in the US) for not much over $600 as well. Also a pretty darn good deal these days.

Screwed up thinking my next upgrade might have to be a $300 RDNA 2 card when RDNA 4 is coming out, just to get 16GB, lol. Oh well, first-world problems and all.
 

PeteBull

Member
For people not liking/recommending the RTX 4070, it's best to show them Hardware Unboxed's cost-per-frame charts, both at MSRP and at current street price. It's a pretty decent deal vs any other Nvidia card, should be considered the go-to card for people with a $600 budget, and has some fair advantages (some disadvantages too) even vs the so-prized RX 6950 XT.

And one thing you can't accuse Hardware Unboxed of is being Nvidia shills; even they put the 4070 as the top-4 best-value card on the current market, and the top Nvidia card, since AMD cards have worse features and power efficiency.
For all you people not liking the cards, that's perfectly fine, but just tell me this: why didn't we get a $600 RDNA3 GPU from AMD by now? Hell, why did they try to sell us the castrated RX 7900 XT for $900, and only quietly/unofficially lower the MSRP to $800 once they noticed non-existent sales?

Shouldn't AMD be selling much-better-value card(s) by now? And I don't even mean RT/DLSS/FG features, but simply a 6950 XT 16-gig equivalent for $600, or even $550, maybe even $500. If the 4070 is such a bad deal for us, there should be options from AMD by now... and yet nothing.

And no, the 6950 XT isn't good enough; it has a much higher TDP (335W vs 200W for the 4070) and is much hotter/louder vs what a true RDNA3 card at $500-550 should be, since it's a top-end RDNA2 card after all, heavily discounted of course, from an $1,100 launch price to the current $650.

You can make valid arguments that AMD wants to/can milk their customers on the CPU side, but here on the GPU side they are under 11% market share now; hell, Intel is above 6% now and Nvidia above 82%. It should be AMD's biggest priority to get back to at least 35-40% market share ASAP, or game devs will start treating their GPUs as the black sheep, aka worse optimization (less time/budget spent on it, since for such a small % of the market it's not worth it).
 

Verchod

Member
I bought a 3070 at launch, and even though it's still powerful enough, I felt even when it was new that 8GB of VRAM wasn't enough to last several generations. I didn't want to have to replace it any time soon.
Now the 4070 has 12GB, but again it only feels like just enough. Might be great this year, but a few years from now I think it'll suffer.
Seeing the price of this and the 4070 Ti, I've decided to get a second-hand 3090 for equivalent money. Similar performance but a ton of VRAM.
 

PeteBull

Member
Theoretically, how would a 4070 fare for 4K60? With RT, DLSS and all the bells and whistles? On current games, of course.
Badly, of course. Think of it as a 3080 but with 12 gigs of VRAM, although only a 192-bit bus width, so at 1080p it's amazing (even above the 3080), then roughly on par with it at 1440p and worse at 4K.

RT kills performance even on the 4070 Ti, which is at least 15% faster (but same RAM/bus width, so the same rule applies here too: the more demanding the game and the bigger the res, the worse it performs).
Forget about 4K or any kind of ray tracing if you wanna buy a $600 GPU like the 4070, unless you mean older/non-demanding games of course; then it's fine.

DLSS helps a ton here, both DLSS2 and 3 (but for 3 to work you still gotta have decent fps, so again, it won't make a game that runs at 30fps look smooth at 60; you get all kinds of artifacts and it doesn't feel smooth at all).

It's worth remembering another thing: if you're running a game at 4K with quality DLSS, it's actually AI-upscaled from native 1440p, so image quality still looks good, not as good as native 4K but very decent still. Unfortunately, if you're running games with RT turned on, quality DLSS isn't fast enough for 4K, so you usually have to go down to balanced or even performance mode (the latter is native 1080p AI-upscaled to 4K); then of course you get a massive boost in fps vs native 4K (4x fewer pixels to render), but image quality takes a tremendous hit, unfortunately.

It's a feature of the DLSS method: the higher the resolution you run a game at, both native and upscaled, the less image quality you lose, because the AI simply has more data to reconstruct how it all should look.
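To make those internal resolutions concrete, here's a small Python sketch using the commonly published DLSS 2 scale factors; the exact factors can vary per title, so treat this as approximate:

```python
# Internal render resolutions per DLSS mode (published scale factors).
SCALES = {"Quality": 2 / 3, "Balanced": 0.58,
          "Performance": 0.5, "Ultra Performance": 1 / 3}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution the GPU actually renders before AI upscaling."""
    s = SCALES[mode]
    return round(out_w * s), round(out_h * s)

for mode in SCALES:
    w, h = internal_res(3840, 2160, mode)
    ratio = (3840 * 2160) / (w * h)  # how many fewer pixels than native 4K
    print(f"4K {mode}: renders {w}x{h} (~{ratio:.1f}x fewer pixels)")
# Quality -> 2560x1440, Performance -> 1920x1080 (the 4x case above).
```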
 
I bought a 3070 at launch, and even though it's still powerful enough, I felt even when it was new that 8GB of VRAM wasn't enough to last several generations. I didn't want to have to replace it any time soon.
Now the 4070 has 12GB, but again it only feels like just enough. Might be great this year, but a few years from now I think it'll suffer.
Seeing the price of this and the 4070 Ti, I've decided to get a second-hand 3090 for equivalent money. Similar performance but a ton of VRAM.
Pretty much this. Sold my 2080 Ti a couple of months back and chucked some extra cash on it for a 3090. Nice enough bump with plenty of VRAM.

For people not liking/recommending the RTX 4070, it's best to show them Hardware Unboxed's cost-per-frame charts, both at MSRP and at current street price. It's a pretty decent deal vs any other Nvidia card, should be considered the go-to card for people with a $600 budget, and has some fair advantages (some disadvantages too) even vs the so-prized RX 6950 XT.

And one thing you can't accuse Hardware Unboxed of is being Nvidia shills; even they put the 4070 as the top-4 best-value card on the current market, and the top Nvidia card, since AMD cards have worse features and power efficiency.
For all you people not liking the cards, that's perfectly fine, but just tell me this: why didn't we get a $600 RDNA3 GPU from AMD by now? Hell, why did they try to sell us the castrated RX 7900 XT for $900, and only quietly/unofficially lower the MSRP to $800 once they noticed non-existent sales?

Shouldn't AMD be selling much-better-value card(s) by now? And I don't even mean RT/DLSS/FG features, but simply a 6950 XT 16-gig equivalent for $600, or even $550, maybe even $500. If the 4070 is such a bad deal for us, there should be options from AMD by now... and yet nothing.

And no, the 6950 XT isn't good enough; it has a much higher TDP (335W vs 200W for the 4070) and is much hotter/louder vs what a true RDNA3 card at $500-550 should be, since it's a top-end RDNA2 card after all, heavily discounted of course, from an $1,100 launch price to the current $650.

You can make valid arguments that AMD wants to/can milk their customers on the CPU side, but here on the GPU side they are under 11% market share now; hell, Intel is above 6% now and Nvidia above 82%. It should be AMD's biggest priority to get back to at least 35-40% market share ASAP, or game devs will start treating their GPUs as the black sheep, aka worse optimization (less time/budget spent on it, since for such a small % of the market it's not worth it).


Why should AMD bother? For what reason should they prioritise getting back market share? Even if they had a compelling counter to the 4070 right now, the 4070 would still slam-dunk it into oblivion. Folk don't want AMD GPUs really; they just want competitive ones to try and bring Nvidia prices down so they can get cheaper Nvidia GPUs. AMD should stick to CPUs, where they have some success, instead of wasting resources on GPUs; the market has spoken.
 

PeteBull

Member
Pretty much this. Sold my 2080 Ti a couple of months back and chucked some extra cash on it for a 3090. Nice enough bump with plenty of VRAM.



Why should AMD bother? For what reason should they prioritise getting back market share? Even if they had a compelling counter to the 4070 right now, the 4070 would still slam-dunk it into oblivion. Folk don't want AMD GPUs really; they just want competitive ones to try and bring Nvidia prices down so they can get cheaper Nvidia GPUs. AMD should stick to CPUs, where they have some success, instead of wasting resources on GPUs; the market has spoken.
If it goes like that then we will experience a nasty price hike on Nvidia cards; both the 4080 and 4090 are a clear sign of that. So personally I would love for AMD to at least try and compete; if not, then you can't blame Nvidia for wanting to squeeze every dollar out of people with no competition around =/
 
If it goes like that then we will experience a nasty price hike on Nvidia cards; both the 4080 and 4090 are a clear sign of that. So personally I would love for AMD to at least try and compete; if not, then you can't blame Nvidia for wanting to squeeze every dollar out of people with no competition around =/

That's exactly my point. You want a competitive AMD not because you'll consider a card of theirs, but because you see it as something to keep Nvidia prices in check. Not sure how anyone is supposed to regain market share when that attitude is indicative of the GPU market as a whole; hence, why should AMD bother? Honestly, I've got no issue with Nvidia GPU prices myself; the market has spoken and enabled it, so why not? I'd do the same.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
AMD cards aren't a great price at release, but once they've had some cuts is when they start looking good. Those 16GB 6800s going for under $500 are looking good.
Under $500 for a 16GB 6800 sounds mad.
Like, that's a legit steal for anyone building a machine right now.
Even with DLSS looking slightly better, at $470 that's price-performance king shit right there.
Playing at 1440p you are sorted.

 

Xellos

Member
My hope for the 4070 was a $500 dual-slot card with 3080-3080 Ti-type performance, 12+ GB VRAM and <250W power draw. The final product is close enough; 10% more performance or a 10% lower price would have been nice, but it's not too far off. Noise and temperatures seem great across the board. I have an old Fractal Node case with a strict 2-slot (~45mm) limit, so for me this is one of the better GPU options. Wish the 4070 Ti had more 2-slot options than that one Inno3D card.
 

Buggy Loop

Member
For people not liking/recommending the RTX 4070, it's best to show them Hardware Unboxed's cost-per-frame charts, both at MSRP and at current street price. It's a pretty decent deal vs any other Nvidia card, should be considered the go-to card for people with a $600 budget, and has some fair advantages (some disadvantages too) even vs the so-prized RX 6950 XT.

And one thing you can't accuse Hardware Unboxed of is being Nvidia shills; even they put the 4070 as the top-4 best-value card on the current market, and the top Nvidia card, since AMD cards have worse features and power efficiency.
For all you people not liking the cards, that's perfectly fine, but just tell me this: why didn't we get a $600 RDNA3 GPU from AMD by now? Hell, why did they try to sell us the castrated RX 7900 XT for $900, and only quietly/unofficially lower the MSRP to $800 once they noticed non-existent sales?

Shouldn't AMD be selling much-better-value card(s) by now? And I don't even mean RT/DLSS/FG features, but simply a 6950 XT 16-gig equivalent for $600, or even $550, maybe even $500. If the 4070 is such a bad deal for us, there should be options from AMD by now... and yet nothing.

And no, the 6950 XT isn't good enough; it has a much higher TDP (335W vs 200W for the 4070) and is much hotter/louder vs what a true RDNA3 card at $500-550 should be, since it's a top-end RDNA2 card after all, heavily discounted of course, from an $1,100 launch price to the current $650.

You can make valid arguments that AMD wants to/can milk their customers on the CPU side, but here on the GPU side they are under 11% market share now; hell, Intel is above 6% now and Nvidia above 82%. It should be AMD's biggest priority to get back to at least 35-40% market share ASAP, or game devs will start treating their GPUs as the black sheep, aka worse optimization (less time/budget spent on it, since for such a small % of the market it's not worth it).


The power consumption is so fucking different lol: the 6950XT averages 16% faster at 1440p (rasterization, from TechSpot) for 64% more power consumption (+200W!)
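Putting those quoted figures into perf-per-watt terms (a quick sketch; the numbers are the approximate ones above, with the 4070 as the 1.00 baseline):

```python
# Relative performance per watt from the figures above (4070 = 1.00 baseline).
perf_6950xt = 1.16    # ~16% faster in 1440p raster (TechSpot, as quoted)
power_6950xt = 1.64   # ~64% more power draw

efficiency_ratio = perf_6950xt / power_6950xt
print(f"6950 XT perf/W relative to 4070: {efficiency_ratio:.2f}x")  # ~0.71x
# i.e. the 4070 delivers roughly 1.4x the performance per watt here.
```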



and not in the same form factor at all



In Unreal Engine 5 with HW RT on, there's a 2% difference at native. I think this is where we'll see the wave of upcoming games using UE5 set the tone for the rest of this gen. Sprinkle on DLSS (because it's better) and it's a no-brainer for UE5, I think.

But that's not taking into account:
  • FSR vs DLSS
  • Frame gen
  • NVENC AV1 encoder
  • SER and a general RT lead over AMD in ray tracing, and certainly even more so in path-tracing tasks (hellooooo RTX Remix wave of remasters coming!)
  • Better at VR
  • General latencies/Reflex are better on the 4000 series
I mean:
You're on Ampere/RDNA 2 already? Skip the gen (unless you went balls deep with a 4090, you rich fuck).
You're on a way older gen and somehow you want to upgrade, but you find this gen a total shitshow? Buy a dirt-cheap 6700 XT.
You're on a way older gen and you can't wait for RDNA 3 mid-range to see if it shakes up the 4070? Well, buy the 4070, I would say. It's currently in the top tier for bang for the buck in this shitshow of a gen, it's efficient with a small form factor, and thermals and noise are good in reviews.
 

THE DUCK

voted poster of the decade by bots
I just want to jam one of these 30TF bad boys into an MS or Sony pro console and see what happens. $599?
 

twilo99

Member
The amount of performance you get out of that power draw is indeed impressive, almost as good as the XSS ;)

Nvidia have done a great job on power consumption for the 40-series cards, and I had a feeling Samsung held them back on that front with the last generation of cards.
 

Marlenus

Member
No, I don't want to wait two years. My GTX1080 started really showing its age in some of the latest games, and not even settings tweaks can help when the GPU is VRAM limited. TLOU1 looks like a joke on 8GB GPUs, and there's no way I'm playing this game with such low-quality textures (not even the original TLOU1 has textures as blurry as the remake on medium).

If the RTX4070 cannot be equipped with more than 12GB of VRAM because of its 192-bit bus, then I will just save some more money and go for the RTX4080, because I want to be happy with my purchase (like I was with the GTX1080) instead of regretting it (GTX680). I'm sure the RTX4080 will run all PS5 ports and UE5 games with ease even 5 years from now, thanks to its insane GPU power and 16GB of VRAM. If you really think about it, $1,200 is not that much if you plan to use the GPU for the next 5 years. It's only about $20 a month and you're set for the whole generation.

The 4070 is a good purchase as well, but only for people who are willing to change their GPUs frequently.

Your 1080 was a great buy. If you don't want to wait 2 years for NV to stop skimping on VRAM, then just go 4090. The performance delta vs the 4080 will give it far better legs, as will the 24GB of VRAM.
 

MacReady13

Member
Starting to sell off some of my consoles to build a gaming PC. Will go with the 4070. I realise the 12 gigs won't last forever, but it's fine for now, and I'll probably upgrade the GPU in a couple of years once the 5000-series cards arrive. It will easily blow away the current-gen consoles, which is all I'm after.
 

PeteBull

Member
I just want to jam one of these 30tf bad boys into a ms or Sony pro console and see what happens. $599?
There is a GAF thread about it; the conclusions from there are: gotta wait till 2024-2025 for a smaller node and RDNA4, then you can fit an over-2x-more-powerful PS5 Pro into a 220-250W TDP. Price-wise, 600 to 800 bucks seems realistic.
 
Your 1080 was a great buy. If you don't want to wait 2 years for NV to stop skimping on VRAM then just go 4090. The performance delta Vs 4080 will give it far better legs as will the 24GB of vram.
The RTX4080 is slower, but:
-it's much cheaper
-it draws only 250W on average
-internal case temps are much lower compared to the 4090
-it should be dead silent under full load, like my current GPU.

If I were going to play at 4K, I would probably go for the 4090, but for my needs the RTX4080 is going to be more than enough. It has the GPU power to run every PS5 and UE5 game, and of course the 16GB of VRAM will make a difference for sure.
 

ToTTenTranz

Banned
That would make the 7800XT worse than a 6950XT?

At the very least I expect the 7800XT to match the 6950XT in raster and beat it in Raytracing.

The 7800XT doesn't need to perform better than the 6950XT. In fact, there's a big chance it won't.
RDNA3 CUs don't really provide better IPC than RDNA2, at least not at the moment (this might change if/when AMD's compiler makes successful use of the dual FP32 ALUs).
There will be 60 CUs on a 7800XT, or whatever they'll call a full Navi 32, and there are 80 CUs on a 6950XT.

In the current RDNA3 driver state, for the 7800XT to match the 6950XT in compute performance it would need to clock about 33% higher than the 6950XT (60 CUs × 4/3 = 80 CU-equivalents). The latter averages 2400MHz, so the 7800XT would need to average 3200MHz. Which it probably won't.
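That scaling argument, spelled out in a quick Python sketch (assuming, as above, roughly equal per-CU IPC so that compute throughput scales with CU count × clock; the clocks are the post's round numbers):

```python
# Compute throughput ~ CU count x clock, assuming similar per-CU IPC.
cus_6950xt, clk_6950xt = 80, 2400   # 6950 XT: CUs, average game clock (MHz)
cus_7800xt = 60                     # full Navi 32 (rumored)

# Clock the 7800 XT would need to match the 6950 XT's compute throughput:
required_clk = clk_6950xt * cus_6950xt / cus_7800xt
print(f"Required 7800 XT average clock: {required_clk:.0f} MHz")  # 3200 MHz
```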

As for raytracing performance, until the new instructions get proper use the IPC-per-CU should also be similar to RDNA2 as we see on the 7900XT/XTX, so it might not be better either.


TL;DR: The 7800XT is probably not going to match the 6950XT in performance unless significant driver optimizations are made. It's probably not going to match the RTX 4070 in power efficiency either, because it's using a cheaper / less optimized process node and because sending electrical signals between MCDs and GCD costs more power than having all the wires inside the same chip.

The good news is the chiplet architecture was developed to allow for cheaper solutions. The 7800XT doesn't need to match the 6950XT in performance to become a successful solution.
It just needs to bring a substantial improvement in cost/performance, which is what the RTX40 series have failed to do in all models but the 4090.
 

PeteBull

Member
The RTX4080 is slower, but:
-it's much cheaper
-it draws only 250W on average
-internal case temps are much lower compared to the 4090
-it should be dead silent under full load, like my current GPU.

If I were going to play at 4K, I would probably go for the 4090, but for my needs the RTX4080 is going to be more than enough. It has the GPU power to run every PS5 and UE5 game, and of course the 16GB of VRAM will make a difference for sure.
Just save for a month or two longer and get yourself the full-fat experience, aka the 4090. That $400 seems huge, but 1200 to 1600 is already such a high budget; might as well go for the best GPU with 24 gigs of VRAM. I guarantee you won't regret it long-term.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
I get you're in on the joke; nobody would suspect you'd buy anything AMD makes over Intel, or over anyone who isn't AMD, if you can help it… "hiveminders"… a bit rich ;).
I had a Ryzen 1700X, Ryzen 3700X and Ryzen 5800X in the past...

I also have 4 consoles sitting on my desk (plus Steam Deck) with AMD inside...

I pick the best products for my use case, is that too hard for you to understand?

You don't suspect it because you blindly believe (like some others on this forum) that I'm somehow a fanboy.

I simply buy for my use-case.

No one here called me an AMD fanboy when I was running AMD CPUs, I wonder why...
 

Buggy Loop

Member
I love this guy's channel. I discovered him through small-form-factor PCs, but he's now one of the best tech tubers who doesn't make a surprised face in every fucking video.

 

Vognerful

Member
I love this guy's channel. I discovered him through small-form-factor PCs, but he's now one of the best tech tubers who doesn't make a surprised face in every fucking video.


I actually used his video reviews and builds for the Lian Li A4-H2O. It helped me a lot, as it is a nightmare to have as my first SFF build.
My RTX 4070 arrived today and I am really happy with it. I chose the ASUS Dual, and to my surprise it actually comes with an 8-pin connector instead of a 16-pin like the FE. Going from a 2060 Super to this gave me 2x the performance in RT without even enabling DLSS.

Also, it is really quiet.
 

PeteBull

Member
Looks like the 4070 is having a really hard time selling and is dropping below MSRP, at least in Australia.
 

hinch7

Member
Looks like the 4070 is having a really hard time selling and is dropping below MSRP, at least in Australia.

It's the same in the UK. Plenty of stock of 4070s from the largest PC retailers here, still. You can actually find a Palit card for £556 including delivery.

At near £600, this just isn't going to fly for 3080 performance in 2023. People want more: an actual generational leap and better value. Especially for a '70'-tier card.
 

Buggy Loop

Member
I actually used his video reviews and builds for the Lian Li A4-H2O. It helped me a lot, as it is a nightmare to have as my first SFF build.
My RTX 4070 arrived today and I am really happy with it. I chose the ASUS Dual, and to my surprise it actually comes with an 8-pin connector instead of a 16-pin like the FE. Going from a 2060 Super to this gave me 2x the performance in RT without even enabling DLSS.

Also, it is really quiet.

Have you tried undervolting it? Apparently it's crazy efficient; you can shave off 60 watts easy.
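Strictly speaking, undervolting is done through the voltage/frequency curve in tools like MSI Afterburner, but if you just want the efficiency win, a simple power cap gets you most of the way there. A rough sketch using NVIDIA's NVML Python bindings (assumes the nvidia-ml-py package and admin rights; the 60 W figure is just the number quoted above):

```python
# Cap GPU power via NVML (not true undervolting, but a similar efficiency win).
# Requires: pip install nvidia-ml-py, and admin/root privileges to set limits.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

default_mw = pynvml.nvmlDeviceGetPowerManagementDefaultLimit(gpu)  # milliwatts
print(f"Default power limit: {default_mw / 1000:.0f} W")

# Drop the limit by ~60 W, per the figure above.
pynvml.nvmlDeviceSetPowerManagementLimit(gpu, default_mw - 60_000)
pynvml.nvmlShutdown()
```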
 
It's almost like the average gamer wants to stay in that below-$500 range. Who could have ever guessed.

If the rumored specs of the 4060/4060 Ti are to be believed, expect that thud to be even louder.
 

PeteBull

Member
It's almost like the average gamer wants to stay in that below-$500 range. Who could have ever guessed.

If the rumored specs of the 4060/4060 Ti are to be believed, expect that thud to be even louder.
Yup, the 4060/Ti will be a really bad buy if they launch with only 8 gigs of VRAM, unless there's literally zero price increase vs the 30-series ;/
 