
RTX 4070 Review Thread

Senua

Gold Member
Especially considering what we've had in the past...even makes the wet fart of a 2070 look decent!

 

Gaiff

SBI’s Resident Gaslighter
No, he's fucking right, this shit should be $369, or anything that isn't higher than $599. 3080 performance for 3080 price is fucking stupid, not to mention it's still fucking humongous like all the other 4000 series cards, so micro-ATX users can't purchase it even if we wanted to.
No, he isn't. You ain't getting that performance for $370 in 2023. That was the launch price of the 1070 back in 2016.
Just a few years ago, an xx70-range GPU would be priced at $300-350.
We are being price gouged, hard.
"Just a few years ago" by that do you mean 10 years ago? The last $350 x70 card was the GTX 970 in 2014, almost 10 years ago. Before that, it was the GTX 570, 13 years ago.

This should be priced $450. It would be an absolute steal at $300.

I'll call out NVIDIA's price-gouging as much as the next man but come on, you want a card that's 80% faster than what's inside the consoles for $300? You'd be able to build a rig that's almost twice their performance for like $600.
 

winjer

Member
No, he isn't. You ain't getting that performance for $370 in 2023. That was the launch price of the 1070 back in 2016.

"Just a few years ago" by that do you mean 10 years ago? The last $350 x70 card was the GTX 970 in 2014, almost 10 years ago. Before that, it was the GTX 570, 13 years ago.

This should be priced $450. It would be an absolute steal at $300.

I'll call out NVIDIA's price-gouging as much as the next man but come on, you want a card that's 80% faster than what's inside the consoles for $300? You'd be able to build a rig that's almost twice their performance for like $600.

Even the GTX 1070 was just a bit over $350. At $379 it was still a great card, and it wasn't that long ago, just 2016.
We are now paying double for the same type of card.
 

Gaiff

SBI’s Resident Gaslighter
Even the GTX 1070 was just a bit over $350. At $379 it was still a great card, and it wasn't that long ago, just 2016.
We are now paying double for the same type of card.
That was still 7 years ago. According to the inflation calculator, $370 in 2016 is equivalent to $460-470 today. Now I know that inflation affects different markets differently, but this is just a general figure. As I mentioned before, it should be in the $450 ballpark. Probably a bit lower because this is a weak x70 card relatively speaking, so maybe $420 or so, but $380?
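For anyone who wants to sanity-check that inflation figure, here's a minimal sketch of the adjustment; the CPI index values are rough assumptions on my part, not official figures (the BLS CPI-U series has the exact numbers):

```python
# Inflation adjustment via CPI ratio. Index values below are approximate
# assumptions for illustration, not official BLS figures.
CPI = {2016: 240.0, 2023: 303.0}

def to_year(price, from_year, to_year_):
    """Convert a nominal price into another year's dollars."""
    return price * CPI[to_year_] / CPI[from_year]

print(f"$370 in 2016 ~= ${to_year(370, 2016, 2023):.0f} in 2023")  # ~$467
```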
 

winjer

Member
That was still 7 years ago. According to the inflation calculator, $370 in 2016 is equivalent to $460-470 today. Now I know that inflation affects different markets differently, but this is just a general figure. As I mentioned before, it should be in the $450 ballpark. Probably a bit lower because this is a weak x70 card relatively speaking, so maybe $420 or so, but $380?

Now with those prices I would agree.
 

CrustyBritches

Gold Member
Best bang-for-buck (cost per frame) of all current-gen cards, losing only to the 7900 XT at 4K, but this is a 1440p card. There's probably still an argument to be made for the value of 16GB RX 6000-series cards like the 6800, 6800 XT, and 6950 XT, but it should be noted some of those cards can experience transient spikes around 600W, while the 4070 only hit around 235W spikes. It's hard to tell the value of DLSS 3 at this performance tier, with some games seemingly benefiting from it, while in others the performance increase is minor. It should be OK with its 12GB VRAM, since the 4070 Ti does OK in games like RE4R and TLOU1.

It is effectively a 3080 with 12GB VRAM and DLSS 3, but with lower power consumption. It comes in at $100 below the 3080's $699 MSRP, but it needs to be stated that the 3080 spent much of its time above the $1K mark, and even after the crypto crash stayed around $800. During the crypto surge it wouldn't be unusual to see a 3060 Ti at $850, or a 3060 at $750. This is what skewed the market, along with component shortages. So if you're looking for better bang-for-buck, the RX 6000-series 16GB cards are available. You can get a 6950 XT for $610 right now.
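Cost per frame, for anyone unfamiliar, is just price divided by average framerate. A quick illustrative sketch, with placeholder prices and framerates rather than the review's actual data:

```python
# Cost-per-frame metric: price / average FPS. Lower is better.
# Prices and framerates here are hypothetical placeholders, not review data.
cards = {
    "RTX 4070":   (599, 100),  # (price USD, avg 1440p FPS)
    "RX 6950 XT": (610, 105),
    "RTX 3080":   (700, 100),
}

for name, (price, fps) in sorted(cards.items(), key=lambda kv: kv[1][0] / kv[1][1]):
    print(f"{name}: ${price / fps:.2f} per frame")
```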
 

Hot5pur

Member
Glad I got a 3060 Ti and a 3080 10GB last gen at MSRP for 2 PC builds, 1440p and 4K systems. Would have been better to get the 3080 12GB but meh.

The gen-on-gen improvement in price/performance is not as good this time; perhaps the 30-series was just an exception in recent times.

The 4070 is a fairly practical option given everything else. Don't know about the whole VRAM thing; it seems lazy console ports have problems in part due to the consoles' large VRAM pool, but then again, if you play on high and not ultra you will probably be fine through the launch of the Pro consoles (late 2024 or 2025?).

If I was upgrading today I'd just go for the 4070 for a decent 4K60 system, assuming DLSS and a few settings turned down (especially ray tracing).
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I used to always look forward to xx70s because they were almost always bang-for-buck kings, and could take on last-gen range-toppers.

This xx70 is having trouble beating the base xx80 of last gen?
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Bring on the 4080 20G.

The 4080 is too weak compared to the 4090, and the 4070 Ti just isn't worth it.
 

Killer8

Member
The FPS-per-dollar metric becomes kind of meaningless as time goes on, because nobody ever bothers to adjust for inflation. Makes any discussion about it completely pointless. The real news is how pathetic a percentage improvement this card offers over the previous iteration, though.
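Adjusting the metric is a one-line change, for what it's worth: deflate each card's price to a common year before dividing. A sketch with assumed CPI values and made-up FPS numbers:

```python
# FPS per inflation-adjusted dollar. CPI index values are rough assumptions.
CPI = {2016: 240.0, 2023: 303.0}

def fps_per_real_dollar(fps, price, year, base_year=2023):
    real_price = price * CPI[base_year] / CPI[year]  # price in base-year dollars
    return fps / real_price

# Hypothetical example: a $379 card from 2016 vs a $599 card from 2023.
print(f"{fps_per_real_dollar(40, 379, 2016):.3f}")   # 2016 card
print(f"{fps_per_real_dollar(100, 599, 2023):.3f}")  # 2023 card
```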
 

Irobot82

Member
The 2070 launched at $499 4.5 years ago. And since then silicon prices for Nvidia have likely doubled, while they are shipping with 50% more memory using a proprietary technology only made by Micron.
It's also gone from 12nm to 5nm, and the 70-series die has shrunk, meaning more dies per wafer.

My 1080 is seriously crying when I try to run games.
I guess I'm looking at this or a 7800XT, but I'm getting tired of waiting on AMD to do anything.
 

FireFly

Member
It's also gone from 12nm to 5nm, and the 70-series die has shrunk, meaning more dies per wafer.
True. Using a die calculator, I make it 149 vs 103 "good dies" (assuming same defect rate). So if 5nm wafers are double the price, overall chip costs have increased by ~40%.
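For the curious, die calculators like that generally combine a dies-per-wafer approximation with a Poisson yield model. A sketch with assumed inputs (die sizes from public sources, a guessed defect density; real calculators differ on edge exclusion and packing, so the absolute counts won't exactly reproduce the 149 vs 103 above):

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Standard approximation: wafer area / die area, minus an edge-loss term."""
    r = wafer_diameter_mm / 2
    return (math.pi * r ** 2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def good_dies(die_area_mm2, defect_density_per_cm2=0.1):
    """Poisson yield model: Y = exp(-area * D0), with area in cm^2."""
    yield_fraction = math.exp(-(die_area_mm2 / 100) * defect_density_per_cm2)
    return dies_per_wafer(die_area_mm2) * yield_fraction

# Assumed die sizes: TU106 (2070) ~445 mm^2, AD104 (4070) ~295 mm^2.
print(f"TU106: {good_dies(445):.0f} good dies, AD104: {good_dies(295):.0f}")

# The post's cost arithmetic, using its own 103 vs 149 counts:
# doubled wafer price divided by the yield improvement.
print(f"per-chip cost ratio: {2 * 103 / 149:.2f}")  # ~1.38, i.e. ~40% more
```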
 

marjo

Member
$600 for 12GB VRAM, I think I will pass. Maybe next gen. This is actually an RTX 4060 with the price of a 4070 card.
Agreed. Though I would say this is more like an xx60 card performance-wise, but with the price of an xx70 Ti one. Absolute garbage.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
Even the GTX 1070 was just a bit over $350. At $379 it was still a great card, and it wasn't that long ago, just 2016.
We are now paying double for the same type of card.
7 years ago is a long time ago. You don't see console gamers whining because consoles (with a disc drive) are now $500 instead of the $300 that consoles were going for back in 2016.

$600 isn't anywhere near double what the 1070 was at launch. When the 1070 launched you couldn't get the thing for the $380 MSRP; most cards were over $400 and the FE was $450... that's where the Nvidia fake-MSRP meme originated...

Realistically it's a 30-50% price increase ($400-$450 launch 1070s), but when you factor in inflation, it's less than that.

Last-gen cards are all looking super amazing right now.
Last-gen cards only look good if you don't care about RT performance (RDNA2), don't mind the massive power draw (compared to the RTX 4070) for 4070-like performance, or don't care about DLSS3.

I'd easily take the RTX 4070 over any last-gen card because of its efficiency, RT performance, and DLSS3.

Hard to believe AMD is struggling so hard in the GPU space.
The 6000 series should be much, much higher than it currently is on the Steam Hardware Survey.
Given how AMD is competing, it's not really all that surprising. Ever since RDNA1, AMD has barely undercut Nvidia in price while offering worse features.

The reason the RX 570/580 sold decently (the two highest-ranking AMD cards on the Steam Survey) was that they were regularly selling for $140/$200, well under the equivalent Nvidia cards.

That was before RT, that was before DLSS, that was when 8GB was a massive amount of VRAM. AMD had a lot going for them with the 570/580 (especially at the fire-sale prices near the end). Even with all that, it still wasn't a card I would use due to the power draw, but I could see the appeal.

Sadly, AMD has not had a card with such good value at the lower price brackets since.
 

MikeM

Member
Last-gen cards are all looking super amazing right now.

Hard to believe AMD is struggling so hard in the GPU space.
The 6000 series should be much, much higher than it currently is on the Steam Hardware Survey.

If they can make a CUDA/OptiX competitor I think we might see their cards climb the charts.
Own a 7900xt. People need to get off the Nvidia mindshare. They are enabling this.

Space Force Im Doing My Part GIF
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
Own a 7900xt. People need to get off the Nvidia mindshare. They are enabling this.
For the past 3 generations I've bought Nvidia for the features, power efficiency, and RT. I'd only consider AMD if they were more powerful than Nvidia for less money (to make up for FSR2's abysmal 1440p showing).

The last time I bought an AMD card was before RT was a thing, and I ended up selling it because the power draw was too high even after undervolting.


These types of comparisons will make the next RDNA3 card look DOA too, unless we are expecting AMD to have a $400 MSRP gap between the 7900 XT and 7800 XT.
 

MikeM

Member
For the past 3 generations I've bought Nvidia for the features, power efficiency, and RT. I'd only consider AMD if they were more powerful than Nvidia for less money (to make up for FSR2's abysmal 1440p showing).

The last time I bought an AMD card was before RT was a thing, and I ended up selling it because the power draw was too high even after undervolting.


These types of comparisons will make the next RDNA3 card look DOA too, unless we are expecting AMD to have a $400 MSRP gap between the 7900 XT and 7800 XT.
What card do you run?
 

MikeM

Member
3070, probably going to upgrade to 4070 if I can get an MSRP card.
Cool, enjoy once you get it.

The only card from Nvidia I would remotely consider is the 4080. I would not be happy being limited on anything because of VRAM. Hence the 7900 XT.
 
Performance is awesome at 1440p, but I wonder why Nvidia limits the amount of VRAM that 4070 GPUs can have. Personally, I would rather pay more than worry about VRAM two years later.
 

lmimmfn

Member
People seem to be shitting on the 4070 because it's only close to 3080 performance for $100 less, but they're excluding:
- it was extremely difficult to get a 3080 at launch, and then the prices went insane
- the 3080 only has 10GB VRAM, which is too low and was at the time, and excluded me from being interested coming from a 1080 Ti
- it has better ray-tracing performance vs the 3080
- inflation; it's maybe 20% more expensive for everything vs 2021

It's a solid card and much needed to bring PCs back to respectable costs (but prices will climb to $700 once the artificial €100 paid to AIBs runs out).

If you can get a 16-pin 4070 ASAP then go for it, as those prices will climb by €100.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
Cool, enjoy once you get it.

The only card from Nvidia I would remotely consider is the 4080. I would not be happy being limited on anything because of VRAM. Hence the 7900 XT.
I will, and I never ran into VRAM issues on my current 8 GB card at 1440p, so it's not something I'm too concerned about.

I would not be happy buying a 7900 XT at $900 only for it to drop $100 two months later.
Performance is awesome at 1440p, but I wonder why Nvidia limits the amount of VRAM that 4070 GPUs can have. Personally, I would rather pay more than worry about VRAM two years later.
I never saw 8GB as an issue in my 2 years of owning a 3070. I never ran into the issue in the games I play, though admittedly, I have not played any of the AMD-sponsored titles, which seem to be the only ones having issues.
 

MikeM

Member
I will, and I never ran into VRAM issues on my current 8 GB card at 1440p, so it's not something I'm too concerned about.

I would not be happy buying a 7900 XT at $900 only for it to drop $100 two months later.
Meh, $100 is a drop in the bucket in my overall setup lol
 

Buggy Loop

Member
Bring on the 4080 20G.

The 4080 is too weak compared to the 4090, and the 4070 Ti just isn't worth it.

There’s such a huge gap between the 4090 and 4080, it’s mind-blowing.

The 4080 should have been the 4070, with the 4080 sitting somewhere in that gap range at ~12K cores.

Really holding on until the 5000 series, hopefully we return to a sense of normalcy (ha ha ha… ain’t happening, right?)
 
I will, and I never ran into VRAM issues on my current 8 GB card at 1440p, so it's not something I'm too concerned about.

I would not be happy buying a 7900 XT at $900 only for it to drop $100 two months later.

I never saw 8GB as an issue in my 2 years of owning a 3070. I never ran into the issue in the games I play, though admittedly, I have not played any of the AMD-sponsored titles, which seem to be the only ones having issues.
Good for you, but based on my own experience I'm sure a 16GB 4070 model would last way longer than a 12GB one. I can still run games on my ancient GTX 1080 (a GPU made in 2016!) because it had an insane amount of VRAM when I bought it (back then 8GB of VRAM was a lot; even 4GB was good enough).

If Nvidia isn't willing to sell 4070 models with 16GB, I'm going to buy my first AMD GPU, because I would rather have more VRAM than DLSS3 (you can't even cap performance with DLSS3 because of the insane input lag penalty, and fps fluctuation isn't good even on a VRR monitor).
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
Meh, $100 is a drop in the bucket in my overall setup lol
That's only around 5% of the total cost of my desktop tower components but I'd still be annoyed about that since it happened so soon.

I'd also be annoyed that the XTX can easily be had today for only $50 more than the XT's original MSRP and offers a lot more performance.

These days I try to get the most for my money, and I feel I get more with Nvidia in the long run (as Nvidia mid-range cards tend to hold their value better, in my experience).
Good for you, but based on my own experience I'm sure a 16GB 4070 model would last way longer than a 12GB one. I can still run games on my ancient GTX 1080 (a GPU made in 2016!) because it had an insane amount of VRAM when I bought it (back then 8GB of VRAM was a lot; even 4GB was good enough).

If Nvidia isn't willing to sell 4070 models with 16GB, I'm going to buy my first AMD GPU, because I would rather have more VRAM than DLSS3 (you can't even cap performance with DLSS3 because of the insane input lag penalty, and fps fluctuation isn't good even on a VRR monitor).
As a 1440p gamer I just couldn't go AMD unless they drastically improved FSR2 at 1440p so that it became on par with DLSS2, and that's before we get into the fact that I enjoy turning on RT (another mark against AMD).

You must have to turn down loads of settings for the GTX 1080, and RT is unusable. I could never keep a GPU for so long...
 

SatansReverence

Hipster Princess
An XX60 series card, marketed as an XX70 series, priced as an XX80 card.

That's a bold strategy, Cotton.

Who am I kidding, of course people will buy this no matter how stupid it is.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
NVIDIA finally in decline.
AMD is getting better and better, happy about that.
It's too bad AMD has no next-gen card to compete with the 4070 anytime soon. I wonder what's taking them so long; it's already been 4 months since the RX 7900 series launch, yet AMD has only released cards at $899 MSRP and above...
 

Buggy Loop

Member
If Nvidia isn't willing to sell 4070 models with 16GB, I'm going to buy my first AMD GPU, because I would rather have more VRAM than DLSS3 (you can't even cap performance with DLSS3 because of the insane input lag penalty, and fps fluctuation isn't good even on a VRR monitor).

Tom Cruise What GIF


You'd want to trade away... not only playable framerates when the game chokes the CPU or is just RT-heavy, but also Reflex and DLSS... when DLSS Quality + frame gen has a much lower latency than native (Reflex off)?



I guess it can depend on the game, but here, a whopping 10ms delay from what I guess is native + Reflex:

[chart: plague-latency-tests.png]


Have you seen AMD latency?

[chart: fortnite-latency-4070-ti-perf.png]


"We experimented with Radeon Anti-Lag here as well, which did seem to reduces latency on the Radeons by about 10 - 20ms, but we couldn't get reliable / repeatable frame rates with Anti-Lag enabled."

"Normally, all other things being equal, higher framerates result in lower latency, but that is not the case here. The GeForce RTX 4070 Ti offers significantly better latency characteristics versus the Radeons, though it obviously trails the higher-end RTX 4080."

But sure. Do that for 4GB. Which one are you gonna buy? The 7900 XT? 7900 XTX? Those are higher price tiers. Or an RDNA 2 card?
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
Especially considering what we've had in the past...even makes the wet fart of a 2070 look decent!


Not really a fair comparison, since the x80 used to actually be the mid-tier (104) chip from the 900 series all the way up to the 20-series.

GP104 (1070) was 29% faster than full-fat GM204 (980)
TU106 (2070) was 16% faster than full-fat GP104 (1080)
GA104 (3070) was 26% faster than close-to-full TU104 (2080)
AD104 (4070) is 26% faster than close-to-full GA104 (3070)

Looking at it this way nothing has changed.

It's only changed because of the naming. If Nvidia had kept the 900-to-20-series conventions, the 3070 would have been a 3080, and the 3060 Ti would have been a 3070. The 4070 Ti would have been a 4080 (which Nvidia tried, but vocal PC gamers flipped out; yet AMD got away with much worse: the 6900/6950 XT went from competing with the RTX 3080 Ti/3090 down to the 7900 XT being on par with the RTX 4070 Ti).

And now for some reason, certain people are calling the 4070 a 4060 :messenger_tears_of_joy: (if that's the case, why is AMD going to compete with the 4070 series in a few months with a 7800 XT?)

I'll gladly buy the 4070 knowing it is performing as it should historically, compared to the prior gen 104 chip.
 

SmokedMeat

Gamer™
It's too bad AMD has no next-gen card to compete with the 4070 anytime soon. I wonder what's taking them so long; it's already been 4 months since the RX 7900 series launch, yet AMD has only released cards at $899 MSRP and above...

The 7900 XT is $799. For an extra $200 it curb-stomps the 4070 and will probably match the 5070 under cheapo Jensen. Hell, I lucked out and scored mine for $599.

Considering the 6800 XT manages to match and even pull slightly ahead of the 4070, a 7800 XT should bitchslap it. Hell, the 6800 XT will likely age better than the 4070 and its 12GB of VRAM.

That’ll be two Nvidia XX70 generations beaten by one AMD generation.

It’s a shame Nvidia hasn’t bothered to deliver good value since the 10XX series.
 
That's only around 5% of the total cost of my desktop tower components but I'd still be annoyed about that since it happened so soon.

I'd also be annoyed that the XTX can easily be had today for only $50 more than the XT's original MSRP and offers a lot more performance.

These days I try to get the most for my money, and I feel I get more with Nvidia in the long run (as Nvidia mid-range cards tend to hold their value better, in my experience).

As a 1440p gamer I just couldn't go AMD unless they drastically improved FSR2 at 1440p so that it became on par with DLSS2, and that's before we get into the fact that I enjoy turning on RT (another mark against AMD).

You must have to turn down loads of settings for the GTX 1080, and RT is unusable. I could never keep a GPU for so long...
My GTX 1080 (10.7TF with OC) can still run the vast majority of games from my Steam library with playable framerates at 1440p, and in the worst case I can always use FSR2 or lock performance to something like 40fps (40fps is already very playable on my VRR monitor, especially on a gamepad). For example, I played the RE4 remake lately, and I only had to use FSR2 to play at 60fps (1440p, high settings); I really can't complain.

My GTX 1080 started showing its age, but only now that developers have started targeting the PS5 / XSX consoles. I can always lower some settings to reduce GPU requirements; however, if a game requires more than 8GB of VRAM, there's nothing I can do about it. I saw how games like Hogwarts Legacy and TLOU look and run on 8GB GPUs, and there's no way I'm going to play with massive stutters or extremely low texture quality.

It's finally time to upgrade my GTX 1080. Something like the RTX 4070 has the GPU power to run PS5 ports till the end of this generation, but I don't want to worry about VRAM two years from now. Look dude, you bought a 3070 just two years ago and now some games already run and look like crap on that GPU, and not because the GPU is too slow, but because VRAM is the limiting factor.
 

Leonidas

AMD's Dogma: ARyzen (No Intel inside)
The 7900 XT is $799. For an extra $200 it curb-stomps the 4070
If I was spending $799 on a GPU I'd get the 4070 Ti, since it has better RT than the 7900 XT.

At 33% more money I'd hope the 7900 XT is better than the 4070, but sadly the 7900 XT loses in demanding RT games like Cyberpunk. And FSR is still useless at 1440p compared to DLSS.

I'd never consider such a card. I'll keep buying the xx70 every time and getting a nice bump as long as prices remain reasonable.

Considering the 6800 XT manages to match and even pull slightly ahead of the 4070, a 7800 XT should bitchslap it.
But at what price? Even if it's $599 and 15% faster in raster, it's still probably going to lose to the 4070 in RT and probably use more power, while still having abysmal FSR2 1440p image quality (unless FSR2 improves soon).

My GTX 1080 (10.7TF with OC) can still run the vast majority of games from my Steam library with playable framerates at 1440p, and in the worst case I can always use FSR2 or lock performance to something like 40fps (40fps is already very playable on my VRR monitor, especially on a gamepad).
My Steam Deck plays the vast majority of my games at playable framerates. FSR2 image quality is abysmal at 1440p. 40 FPS is an abysmal framerate for desktop gaming. I use a high-refresh monitor; I like running at around 100 FPS at least and would prefer maxing out my monitor approaching 200 FPS.

It's finally time to upgrade my GTX 1080. Something like the RTX 4070 has the GPU power to run PS5 ports till the end of this generation, but I don't want to worry about VRAM two years from now. Look dude, you bought a 3070 just two years ago and now some games already run and look like crap on that GPU, and not because the GPU is too slow, but because VRAM is the limiting factor.
I usually buy a GPU every year. The 3070 at nearly 2 years is the longest I've ever held on to a GPU; I'm overdue for an upgrade.

I'll probably have a 5070 in 2025. VRAM limits are the last thing on my mind. Nvidia will give the x70 the VRAM it needs when it matters.
 

MikeM

Member
That's only around 5% of the total cost of my desktop tower components but I'd still be annoyed about that since it happened so soon.

I'd also be annoyed that the XTX can easily be had today for only $50 more than the XT's original MSRP and offers a lot more performance.

These days I try to get the most for my money, and I feel I get more with Nvidia in the long run (as Nvidia mid-range cards tend to hold their value better, in my experience).

As a 1440p gamer I just couldn't go AMD unless they drastically improved FSR2 at 1440p so that it became on par with DLSS2, and that's before we get into the fact that I enjoy turning on RT (another mark against AMD).

You must have to turn down loads of settings for the GTX 1080, and RT is unusable. I could never keep a GPU for so long...
I’m assuming you live in the US with that pricing you are quoting. Worldwide pricing can vary by a lot more than that.
 
Tom Cruise What GIF


You'd want to trade away... not only playable framerates when the game chokes the CPU or is just RT-heavy, but also Reflex and DLSS... when DLSS Quality + frame gen has a much lower latency than native (Reflex off)?



I guess it can depend on the game, but here, a whopping 10ms delay from what I guess is native + Reflex:

[chart: plague-latency-tests.png]


Have you seen AMD latency?

[chart: fortnite-latency-4070-ti-perf.png]


"We experimented with Radeon Anti-Lag here as well, which did seem to reduces latency on the Radeons by about 10 - 20ms, but we couldn't get reliable / repeatable frame rates with Anti-Lag enabled."

"Normally, all other things being equal, higher framerates result in lower latency, but that is not the case here. The GeForce RTX 4070 Ti offers significantly better latency characteristics versus the Radeons, though it obviously trails the higher-end RTX 4080."

But sure. Do that for 4GB. Which one are you gonna buy? The 7900 XT? 7900 XTX? Those are higher price tiers. Or an RDNA 2 card?
DLSS3 has a very small input lag penalty (at least compared to TVs' motion interpolation), but you have to run your games with unlocked fps. If you lock your fps (for example with RivaTuner) while DLSS3 is enabled, you will get an additional 100ms penalty. Personally, I always play with an fps lock because even on my VRR monitor I can feel when the fps is fluctuating.

You say that the 4GB VRAM difference is not big enough, but have a look at this: even the 6700 XT destroys the RTX 3070 thanks to the extra 4GB of VRAM.

 