
Digital Foundry The PS5 GPU in PC Form? Radeon RX 6700 In-Depth - Console Equivalent PC Performance?

yamaci17

Member
yes, I believe I linked the tweet of that Avatar tech director who said the same thing. Now tell me that comparing a 6.0 GHz CPU with a 3.5 GHz console CPU wouldn't make a difference, even at 50 fps instead of 100.
i don't know what your point is here, I never said otherwise. you must be mistaking others' opinions for mine and aren't even making an attempt to understand what I'm trying to convey. for me it's a ps5 spec issue and poor hardware planning. they should've waited until the end of 2021 and pushed zen 3 at at least 4 ghz. it's not rich's fault that you can get an HFR experience on a cheap 13400f or 7600x. he knows this. you probably know this too. a 6 ghz 13900k over a ryzen 7600x will barely make a difference for a 4070 super. so while a 6 ghz 13900k is not a real world scenario for the 4070 super, a ryzen 7600 is. and if you agree with me that both CPUs will get a 4070 super to those framerates comfortably, what is the point of arguing that the 4070 super would not get those framerates in real use scenarios with more realistic CPUs, when more realistic real use case CPUs such as the 7600 and 13400f will indeed get you 80-90% of the way there.

cyberpunk-2077-1280-720.png


the 7600x goes for what, 220 bucks or something? and it is 95% of the way there at 720p on a 4090

so why is it such a big deal, really? do you realistically think owners of 4070 supers will pair their GPUs with something lesser than a 7600x? so even the lowest-bound real use case scenario gets you there. so for me, it simply does not matter. it is a PS5 spec issue.

it is the ps4 jaguar cpu versus PC desktop CPUs all over again. a run-of-the-mill 100-buck i5 ran circles around the ps4 back then. you would literally have to go out of your way to find a CPU that could run as slow as the PS4.

again though, I hope he uses the cheap 200-buck 7600x just to prove a point going forward. then there shouldn't be any ground for you to stand on, hopefully (as you yourself admitted that you wouldn't have any issue with the review if he used a 13400f. but you also know now that the 7600x and 13400f get you 80-95% of the way there depending on the scenario. so i still do not get the obsession with the 600-buck part and the 6 GHz part).

i personally do not see these reviews as pure GPU to GPU comparisons (which you seem to). we can't have pure GPU to GPU comparisons if the PS5 keeps getting bottlenecked anyway. LOL.

and where did you see rich claiming it is gpu bound or a pure gpu difference? he often makes remarks about the PS5 potentially being CPU bottlenecked here and there. you must be purposefully avoiding his remarks on that front if you make these types of comments. he's aware, and he's not necessarily saying the 4070 or 6700 itself is "this much faster than ps5"; he often makes remarks like "ps5 seems to have a cpu limitation here" instead. he's not being disingenuous, you people are just refusing to see certain parts of the reviews and are unable to see the whole picture.
the remarks where he "wonders" about performance are there to let users know it is an odd case where the ps5 might be cpu bottlenecked. he's not dumb, he's just trying to make a point to the casual userbase, which you are not part of; as a result, you take these comments as either dumb or hysterical and end up in this position of opposition towards him. those comments are not targeted at people like you but at people who genuinely have no clue what is going on.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
i don't know what your point is here, I never said otherwise.

This is what you said:

it is all about IPC/single thread performance which new generation CPUs have plenty of.

I agree with this. But I am pointing out the fallacy of saying single threaded performance is king and then conveniently ignoring the flawed methodology that led Rich to pair a 6.0 GHz CPU when comparing a "GPU" that's paired up with a 3.5 GHz CPU.

I just want you to see that you and I are on the same page here in regards to single threaded performance.
 

Senua

Gold Member
again though, I hope he uses the cheap 200-buck 7600x just to prove a point going forward.
I think if it was John or Alex they'd probably do that, but I doubt Rich even cares, and he probably views forums like this as white noise. He does manage to trigger Gaf a lot, almost as much as Alex. Though triggering Gaf is as easy as even mentioning something being more performant/powerful than Sony's machine. I guess he'll never know of the power he possesses.

Season 3 Laughing GIF by The Simpsons
 

yamaci17

Member
This is what you said:



I agree with this. But I am pointing out the fallacy of saying single threaded performance is king and then conveniently ignoring the flawed methodology that led Rich to pair a 6.0 GHz CPU when comparing a "GPU" that's paired up with a 3.5 GHz CPU.

I just want you to see that you and I are on the same page here in regards to single threaded performance.
we are, but in the end, I don't remember a time where Rich said the 4070 is this much faster than the PS5 due to GPU power differences in performance modes. he makes those "i wonder" remarks to signal CPU bottlenecks, usually. He can't say for sure, because none of us actually can, since none of us have performance profilers that can look into the renderer. Sure, you and I here can say "definitely a CPU bottleneck", but all Rich can do, as a professional, is make light statements like "it looks like a cpu limitation". he cannot use harsh words, he cannot criticise or slam the PS5. that would be unprofessional and, from certain points of view, probably rude, because it would imply the PS5 is a badly designed console with an unbalanced CPU and GPU (which is true). they even treaded lightly in PS4 reviews most of the time. they're the ones who spoke the lightest about how horrible the Jaguar cores were. it is just their general stance towards the whole thing. when they see an outlier like this, they don't want to make harsh remarks as to why it happens. we're on the same page that it would also happen with a mediocre cheaper CPU, at least, yes? the ps5 is also paired with a mediocre cheap CPU of its time (the zen 2 ones). it is just sad how this turned out. games shouldn't be this CPU bottlenecked, but 30 fps being a baseline that most of the player base accepts will sadly have this outcome

and in the end you and I are aware that in certain games, even a ryzen 3600 outperforms the ps5 massively. and that is a CPU that I would never pair a 4070 super with imo. but as I said... there are also games like spiderman, the last of us and ratchet where the PS5 outperforms the 3600 class of CPU. so it's a mixed bag situation.

and again, I myself cannot say whether avatar running at 720p and dropping to 50 fps is really a gpu or a cpu bottleneck.



how much more powerful is the 6700xt than the ps5? despite being at 1080p fsr quality (720p internal, the mythical 0.9 million pixels?) it is dangerously close to the 60 fps target (70-80)



and 6650xt does dip below 60 fps at 720p internal resolution, FULLY gpu bound.

so there's always a "what if" factor he has to keep in mind.
 
Last edited:

Zathalus

Member
Possibly but that'd be pathetic. Look at how it runs with old-ass CPUs and slow memory. Doesn't seem particularly CPU-intensive. On a 4090 by the way.

D6JMqGm.png

yes, I believe I linked the tweet of that Avatar tech director who said the same thing. Now tell me that comparing a 6.0 GHz CPU with a 3.5 GHz console CPU wouldn't make a difference, even at 50 fps instead of 100.
When a 2600 has 1% lows over 70 then no, I doubt the CPU is the cause of the drop. The likely cause is DRS not dropping low enough. The 13900k does 150-220fps in this game. The Zen 2 CPU is slower, but not nearly that much slower.
 

SlimySnake

Flashless at the Golden Globes
we are, but in the end, I don't remember a time where Rich said the 4070 is this much faster than the PS5 due to GPU power differences in performance modes. he makes those "i wonder" remarks to signal CPU bottlenecks, usually. He can't say for sure, because none of us actually can, since none of us have performance profilers that can look into the renderer. Sure, you and I here can say "definitely a CPU bottleneck", but all Rich can do, as a professional, is make light statements like "it looks like a cpu limitation". he cannot use harsh words, he cannot criticise or slam the PS5. that would be unprofessional and, from certain points of view, probably rude, because it would imply the PS5 is a badly designed console with an unbalanced CPU and GPU (which is true). they even treaded lightly in PS4 reviews most of the time. they're the ones who spoke the lightest about how horrible the Jaguar cores were. it is just their general stance towards the whole thing. when they see an outlier like this, they don't want to make harsh remarks as to why it happens. we're on the same page that it would also happen with a mediocre cheaper CPU, at least, yes? the ps5 is also paired with a mediocre cheap CPU of its time (the zen 2 ones). it is just sad how this turned out. games shouldn't be this CPU bottlenecked, but 30 fps being a baseline that most of the player base accepts will sadly have this outcome
I disagree. it's his job to be harsh. he's literally the only console tech youtuber besides nx gamer. he gets first dibs on hardware reveals. He has direct access to mark cerny. His opinions WILL have an impact on the design of the next PS hardware and if he simply doesn't do his job and point out the flaws in the design, then the mistakes will be repeated. If not for the PS5, then for the PS5 Pro, which will feature a much more powerful GPU that, when paired with a 3.5 GHz L3-cache-starved CPU, would see any framerate increases they are aiming for effectively nullified.

I said this a few pages back. I hope they use the zen 4 CPUs with the higher clocks in the PS5 Pro because we have seen just how big of a difference those higher clocks made in Starfield. And these are zen 2 and zen 3 CPUs that are way higher clocked than the PS5. The 2600x is probably the closest to the PS5 CPU and even that is running at 4.0 GHz. Is it really a surprise that the XSX didn't get a 60 fps mode? If the 2600x averages 40 fps then the XSX would probably average 35 fps.

moz1aMb.png

HSYVNTi.png


4hKogzU.png
 

yamaci17

Member
When a 2600 has 1% lows over 70 then no, I doubt the CPU is the cause of the drop. The likely cause is DRS not dropping low enough. The 13900k does 150-220fps in this game. The Zen 2 CPU is slower, but not nearly that much slower.
when i get that game i will test that scene with my 3.7 ghz zen + cpu lol
 

MarkMe2525

Gold Member
To the few that are wondering what the purpose of such comparisons is, it's for entertainment. You just have to remember, there is no spoon. If you keep that in mind, you will realize it's not the spoon that bends, but yourself.

Idk why this made me think of the Matrix, you are just going to have to give me a pass on this.
 

Gaiff

SBI’s Resident Gaslighter
I disagree. it's his job to be harsh. he's literally the only console tech youtuber besides nx gamer. he gets first dibs on hardware reveals. He has direct access to mark cerny. His opinions WILL have an impact on the design of the next PS hardware and if he simply doesn't do his job and point out the flaws in the design, then the mistakes will be repeated.
That's exactly why he can't be harsh or overly critical and has to handle this with kid gloves. We can call him out all we want, but it's ultimately not our money that's on the line, and he has to walk a thin line and strike the right balance between criticism and alienating his sponsors and industry contacts. You can't just ask him to shit on their partners like GN does.

What matters isn't how harsh he is, it's how truthful and accurate his results are.
 
Last edited:

64bitmodels

Reverse groomer.
Someone remind me why folks are not able to buy APUs from AMD that are closer in graphical specs to those of the current gen consoles?
Legit. Budget builds would see an insane resurgence if AMD just gave us these supercharged power efficient APUs. They'd make insane money from this.

Instead we're getting excited for 1060 equivalent performance while they're probably cooking a 3070 equivalent apu for the ps5 pro. Make it make sense.
 

winjer

Member
I said this a few pages back. I hope they use the zen 4 CPUs with the higher clocks in the PS5 Pro because we have seen just how big of a difference those higher clocks made in Starfield. And these are zen 2 and zen 3 CPUs that are way higher clocked than the PS5. The 2600x is probably the closest to the PS5 CPU and even that is running at 4.0 GHz. Is it really a surprise that the XSX didn't get a 60 fps mode? If the 2600x averages 40 fps then the XSX would probably average 35 fps.

moz1aMb.png

HSYVNTi.png


4hKogzU.png

One issue with the benchmarks you posted: they are from the release build, which wasn't well optimized for Zen.
One of the patches fixed that and performance jumped by around 20%.

Starfield-p.webp
 

64bitmodels

Reverse groomer.
The difference in price isn't great, and you know the games will probably work perfectly
Will they work? Yeah, definitely.

Perfectly? lol

Either way, most pc ports now are doing fine. It's not as terrible a situation as it was last year
 
Last edited:

64bitmodels

Reverse groomer.
And let's assume you can somehow get the OS for free.
Ignoring linux, which is FOSS, there are 16 billion different ways to get rid of the activate windows watermark. With how trivial it is to do, windows may as well be free. The price of the os is a fucking lemon, irrelevant to the conversation
 

64bitmodels

Reverse groomer.
ones a few years old will not cope very well. Even though those same older GPUs did cope with current gen games a few years earlier. It's part of the PC tax, it's just a lot more expensive to keep up to date than it is with a console.
Gtx 1080ti. Still trucking along fine 7 years later. People are still using 2000 series gpus pretty easily. You're so unserious about this bro
 

yamaci17

Member
"I said this a few pages back. I hope they use the zen 4 CPUs with the higher clocks in the PS5 Pro because we have seen just how big of difference those higher clocks made in Starfield. " SlimySnake SlimySnake

they won't btw. the best they can offer is a 4 ghz zen 2 cpu lol. sony and microsoft are afraid of cpu upgrades because if a developer decides to target 30 fps for the zen 4 4.5 ghz cpu, the zen 2 one will probably drop to the 15-20s and that would be catastrophic (and downright agonizing for base console owners). this happened between the ps4 and ps4 pro and i'm sure both sony and microsoft probably regretted doing that small cpu frequency boost upgrade. all it did was give developers an extra excuse to target a 30 fps frametime with 2.2 ghz instead of 1.6 ghz, which if anything ended up with a less stable 30 fps on the base ps4.
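just to put rough numbers on that fear (all of this is assumption, not measurement; the clock and IPC figures below are ballpark guesses):

# if a game were tuned to just hit 30 fps on a hypothetical 4.5 GHz zen 4 console CPU,
# roughly where does the same CPU-side work land on the 3.5 GHz zen 2 in the base PS5?
target_fps = 30
clock_ratio = 3.5 / 4.5           # ~0.78
ipc_ratio = 1 / 1.30              # assume zen 4 has ~30% higher IPC than zen 2 (rough guess)
base_console_fps = target_fps * clock_ratio * ipc_ratio
print(f"~{base_console_fps:.0f} fps")   # ~18 fps, right in the 15-20 fps range feared above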
anyways, you can see below how new atlantis runs on my poor 3.7 ghz zen+ cpu with 16 mb of cache (the xbox sx has a 3.6 ghz zen 2 cpu with 8 mb of cache. zen 2 has lower cache latency so it needs less cache than zen+, but on desktop it actually also had more cache than zen+, which is why it ended up being a decent performance bump over it. they reduced cache latencies and increased cache amount at the same time, resulting in a noticeable upgrade. so if anything, the xbox sx in this scene would or should have no trouble hitting 60 fps at this point in time. not to mention the console cpu has more bandwidth to play with, considering starfield is not a gpu bandwidth heavy game)



(also take note that I'm at freaking 1440p dlss performance and the gpu is still super saturated. which means the xbox sx will have to drop below 720p internally to get 60 fps here. FYI)

i will test akila when i get there. but im sure things have improved there too. this region used to run at 36-40 fps on my end at launch. it is much better now.

you can see that even at 1440p dlss performance, I often come close to being GPU bound with my 3070. but 1440p dlss ultra performance finally drops the GPU to the 80% usage mark, where you can say it is a clear cpu bottleneck.
then I used fsr quality + fsr 3 and boom, 80+ fps. it really looks and works well in this game, i'd say go give it a try if you haven't so far. fsr 3 genuinely surprised me, especially with a gamepad. as long as you have a 40-50 fps baseline framerate, latency is super acceptable with a gamepad. for mouse and keyboard, a 60 fps minimum is a must. and consider that I was able to get 60+ fps in the majority of combat; all of it now is being generated to 100+ fps, which looks fantastic on my 144 hz vrr screen.
 
Last edited:

64bitmodels

Reverse groomer.
Yes. You can get shovelware accessories on pc of any kind.
They aren't shovelware though. Third party controllers have been excellent for years now so long as you stay away from the $20-and-under stuff. And the option to use Xbox and switch controllers too, that's nice.

not to mention that how good a controller is is subjective. I wouldn't spring for the dualsense as my first choice because of the dated symmetrical layout & buttons which don't correspond with the abxy glyphs shown on pc.
 

Elysium44

Banned
Gtx 1080ti. Still trucking along fine 7 years later. People are still using 2000 series gpus pretty easily. You're so unserious about this bro

You know we were talking about your general claim that hardware good enough to run console games decently at the start of the gen (like a GTX 1060) would still be able to do so equally well at the end of the gen, which is already being proven false and we're only halfway through.
 

SlimySnake

Flashless at the Golden Globes
Another game, another native 4k 30 fps quality mode and this time an abysmal 720p internal 60 fps performance mode.

Timestamped:


pixel counts 8.2 million vs 0.9 million.

I wonder why these consoles are not GPU bound when pushing native 4k pixels and only GPU bound when pushing 720p.
 

Gaiff

SBI’s Resident Gaslighter
Another game, another native 4k 30 fps quality mode and this time an abysmal 720p internal 60 fps performance mode.

Timestamped:


pixel counts 8.2 million vs 0.9 million.

I wonder why these consoles are not GPU bound when pushing native 4k pixels and only GPU bound when pushing 720p.

We already have a thread for this one. Didn't you post in it?
 

SlimySnake

Flashless at the Golden Globes
We already have a thread for this one. Didn't you post in it?
Don't remember. They just posted this on the DF clips channel.

This belongs here because people here don't believe these console games are CPU bound at 60 fps.
 
Last edited:
Don't remember. They just posted this on the DF clips channel.

This belongs here because people here don't believe these console games are CPU bound at 60 fps.
Who is this "we"? I certainly believe a large majority of the performance modes are cpu bound, even if not by enough to stop 60 fps. they certainly are cpu bound when targeting 120, and even in some 60 fps modes
 
let's agree; but why should I care? a cheap i5 13400f will have similar performance in those scenes. you're purposefully avoiding the points I made, while trying to portray things as extreme by constantly making remarks about the "600 bucks cpu"

the CPU you people keep obsessing over won't make any meaningful difference over something like a 13400f/14400f, which is much, much cheaper.

this 200 bucks cpu roflstomps this game



and it's the minimum cpu you should pair a 4070-class and above GPU with. and that's the bottom line. i also wish they'd tested with a 13400f myself, BUT JUST SO that you wouldn't be able to keep lamenting about the mythical "600+ bucks CPU" that some of you try to portray as the only way to get these GPUs working at high framerates

ps5 bottlenecked or not, a 200-buck run-of-the-mill midrange i5 CPU is able to keep up with a GPU of 7800xt/4070 caliber in this game, and that is a pretty cheap CPU by itself. 13900k or 13400f won't matter for a 4070/7800xt class of GPU. 10400f to 13400f will matter, but 13400f to 13900k won't really matter here. it is all about IPC/single thread performance which new generation CPUs have plenty of.

why should I be mad about DF using a 600-buck 13900k CPU when I can get 80-90% of the same performance for 200 bucks with an i5????
sure they could do it but it would be a chore. Rich deep down knows that these tests would have almost the same outcome were he to use a 13400f. so why should he even bother, lmao. anyone who pairs a 4070 or above with a CPU worse than a 13400f is doing themselves a disservice anyways. that 13400f will also easily destroy the much more expensive 12900k or 11900k. it is all about single thread performance, and that is really easy to get, unlike flagship CPUs that cost a lot.

13900k's price doesn't even make sense.

Tbf it's because we are aiming to compare the gpu and not the cpu. do you really think a 13900k is not helping this card at least somewhat, especially compared to the ps5 performance modes?
 

yamaci17

Member
Another game, another native 4k 30 fps quality mode and this time an abysmal 720p internal 60 fps performance mode.

Timestamped:


pixel counts 8.2 million vs 0.9 million.

I wonder why these consoles are not GPU bound when pushing native 4k pixels and only GPU bound when pushing 720p.

you really have to stop with this pixel counts = performance correlation thing. games do not scale like you expect them to, MORE SO with upscaling.

8.2 million pixels, native 4k, rtx 3070, 28 FPS, fully gpu bound

zr935Ct.jpeg


4k dlss ultra performance, per your logic 0.9 million pixels, and... 58 frames per second. fully GPU bound

FRR34wk.jpeg


don't you see what is apparent here, really? I can literally play this game at native 1800p-1900p and get 30 FPS all the time. But rendering internally AT 720P does not get me to 60 FPS.

and in case you think I'm magically somehow CPU limited here, here's the same scene at 1080p dlss quality, way above 70 FPS, so it was not CPU limited above (i mean stats don't lie anyways)

same 0.9 million pixels. as you can see it is shooting above 70 fps with 10% gpu headroom.
SLhi7gr.png


4k dlss ultra performance = 56 fps
1080p dlss quality = 75-80 fps


why can't you accept that GPU bound performance does not scale like you hope, wish or expect it to?

upscaling to 4k is heavy, upscaling to 1440p is semi heavy.

but BEYOND that, even at native resolutions, games do not scale like you expect them to.


take a look here, see how rtx 3080 renders 27 FPS at 4K, and only 76 FPS at 1080p.

That's a massive 4x pixel reduction, yet only a 2.8x performance gain. now imagine the same scenario with 4K DLSS performance. you will be looking at maybe, at best, a 2x performance gain. you can literally observe similar GPU bound performance relations on PC, yet you somehow won't even entertain the possibility that it might still be GPU bound there because you're too stubborn with your "0.9 million pixels" argument.

I've repeated time and time again to you that 4k dlss/fsr performance, and even ultra performance in most cases, is more costly than rendering a game at native 1440p. one is working with 2 million pixels and the other with 3.6 million pixels. how does that make sense then??? how can you explain 4k dlss/fsr performance being more costly than native 1440p on PC? i just want an honest, clear response.
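to put rough numbers on that 3080 example above (a crude linear model I'm assuming purely for illustration, not how a real renderer actually works), fit frame_time = fixed_ms + per_mpix_ms * internal_megapixels to the two quoted data points (27 fps at native 4k, 76 fps at native 1080p):

MPIX_4K, MPIX_1080, MPIX_720 = 8.29, 2.07, 0.92         # millions of internal pixels

t_4k, t_1080 = 1000 / 27, 1000 / 76                      # frame times in ms
per_mpix_ms = (t_4k - t_1080) / (MPIX_4K - MPIX_1080)    # ~3.8 ms per megapixel
fixed_ms = t_1080 - per_mpix_ms * MPIX_1080              # ~5.2 ms that never scales with resolution

naive_fps = 27 * MPIX_4K / MPIX_720                      # the "fps scales with pixel count" fantasy: ~243 fps
model_fps = 1000 / (fixed_ms + per_mpix_ms * MPIX_720)   # ~115 fps, BEFORE any upscaling cost on top

print(f"fixed ~{fixed_ms:.1f} ms, per-Mpix ~{per_mpix_ms:.1f} ms")
print(f"720p internal: naive {naive_fps:.0f} fps vs model {model_fps:.0f} fps")

even this toy model, with zero upscaling overhead, shows why a 9x pixel cut never gives a 9x framerate. add the DLSS/FSR pass on top (which costs the same no matter how low the internal resolution goes) and the gap gets even bigger.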
 
Last edited:

SlimySnake

Flashless at the Golden Globes
you really have to stop with this pixel counts = performance correlation thing. games do not scale like you expect them to, MORE SO with upscaling.

8.2 million pixels, native 4k, rtx 3070, 28 FPS, fully gpu bound

zr935Ct.jpeg


4k dlss ultra performance, per your logic 0.9 million pixels, and... 58 frames per second. fully GPU bound

FRR34wk.jpeg


don't you see what is apparent here, really? I can literally play this game at native 1800p-1900p and get 30 FPS all the time. But rendering internally AT 720P does not get me to 60 FPS.

and in case you think I'm magically somehow CPU limited here, here's the same scene at 1080p dlss quality, way above 70 FPS, so it was not CPU limited above (i mean stats don't lie anyways)

BoKrRVp.png



why can't you accept that GPU bound performance does not scale like you hope, wish or expect it to?
I think your CPU stinks. We have discussed this many times. Your 3070 Ti is being bottlenecked just like the PS5 GPU when trying to run games at 60 fps. it's working overtime. upgrade your CPU to the 7800x3d and then you will see just how badly it was holding back your 3070 Ti.

Games do scale like that unless they are poorly optimized, like, say, Star Wars. But those are rare exceptions. I've been gaming on PC long enough to see this in action time after time. Game after game.

Regardless, if you don't think going from 8.2 million to 0.9 million just to double frames is a CPU issue then there's no point discussing this any further. It's plainly obvious to me, but I respect your opinion and let's just agree to disagree.
 

yamaci17

Member
I think your CPU stinks. We have discussed this many times. Your 3070 Ti is being bottlenecked just like the PS5 GPU when trying to run games at 60 fps. it's working overtime. upgrade your CPU to the 7800x3d and then you will see just how badly it was holding back your 3070 Ti.

Games do scale like that unless they are poorly optimized, like, say, Star Wars. But those are rare exceptions. I've been gaming on PC long enough to see this in action time after time. Game after game.

Regardless, if you don't think going from 8.2 million to 0.9 million just to double frames is a CPU issue then there's no point discussing this any further. It's plainly obvious to me, but I respect your opinion and let's just agree to disagree.
I've literally shown my CPU being able to push 70+ FPS there. it is amazing how ignorant you are in the face of solid proof. you've lost me here, and shown your true colors. I could find a different CPU, produce the same results, and you would still deny it.

if anything stinks here, it is you and your ignorance. congratulations, you've gone to extremes to deny facts. you've literally denied the GPU working at its full capacity there, at 99% usage and 230 watts, at 4k dlss ultra performance. this is the most hideous and disgusting thing I've ever seen on this forum. this even goes beyond the usual console fanboys.

to think I thought you were a decent person at all. I was hugely wrong.
 
Last edited:

yamaci17

Member
Come on let's not take this shit personally
they can't deal with the fact that the game couldn't hit 60 FPS at 4k dlss ultra performance coming from 28 FPS at native 4K. I could do the same test with a 7800x and he would still be in heavy denial. i will absolutely make it personal, because if you're going to deny shit without having any solid proof of your own, you're just wasting people's time with baseless arguments and parroting about 0.9 million pixels all the time.

when all he can do in the face of solid proof is say "your cpu stinks, your screenshots don't matter, it would be better with x", it is just a bad faith argument and moving the goalposts. he knows i'm not in a position to upgrade my CPU, so he opportunistically uses that to drive his points further. as a result it became personal.

anyone with a newer CPU and a 3070 is free to try 4k dlss ultra performance in paradiso and show him what this is about
 
Last edited:

SlimySnake

Flashless at the Golden Globes
I've literally shown my CPU being able to push 70+ FPS there. it is amazing how ignorant you are in the face of solid proof. you've lost me here, and shown your true colors. I could find a different CPU, produce the same results, and you would still deny it.

if anything stinks here, it is you and your ignorance. congratulations, you've gone to extremes to deny facts. you've literally denied the GPU working at its full capacity there, at 99% usage and 230 watts, at 4k dlss ultra performance. this is the most hideous and disgusting thing I've ever seen on this forum. this even goes beyond the usual console fanboys.

to think I thought you were a decent person at all. I was hugely wrong.
You have ignored dozens of pieces of evidence I've posted.

This one literally shows that you could double your framerate if you switched your CPU to the 7800X3D. triple it if you used the same CPU Richard used. But you chose to ignore it and I still said I respect your opinion. if that's not decent i don't know what is.

You on the other hand went scorched earth lmao.

4hKogzU.png
 

yamaci17

Member
You have ignored dozens of pieces of evidence I've posted.

This one literally shows that you could double your framerate if you switched your CPU to the 7800X3D. triple it if you used the same CPU Richard used. But you chose to ignore it and I still said I respect your opinion. if that's not decent i don't know what is.

You on the other hand went scorched earth lmao.

4hKogzU.png
the burden of proof is on you. find me a GPU benchmark which includes native 4k and 4k dlss ultra performance with a 3070 and show us how it is meant to scale in terms of performance. please, be my guest. I did my part, you do yours.
 

SlimySnake

Flashless at the Golden Globes
the burden of proof is on you. find me a GPU benchmark which includes native 4k and 4k dlss ultra performance with a 3070 and show us how it is meant to scale in terms of performance. please, be my guest. I did my part, you do yours.
i thought i did exactly that. But that's precisely why i said let's respectfully agree to disagree, because neither of us is getting through to the other.
 

yamaci17

Member
here's a test with a 3060ti at 4k and 4k dlss performance with a 13600k at 5.5 ghz



4k native 32 fps (muh 8.2 million pixels)
rpqC8tg.png


4k dlss perf (muh 2 million pixels)
nL4RGP8.png


oh my! 4x pixel reduction but only 1.76x better performance! how did this happen!

it must be the 13600k 5.5 ghz. gotta test it with 7900x 3d, amirite?

explain this now SlimySnake why couldn't going from 4k to 1080p push the 3060ti to 60+ fps???? you're free to find your own dlss ultra performance benchmarks. i couldn't find any. even if i did, i'm sure you would "slime" your way out of that somehow as well. maybe blame the game. move the goalposts. say the 13600k is problematic. say it is starfield and not avatar. moving goalposts is your area of expertise, from the looks of it

explain to me how going from 8.2 million pixels to 2 million pixels can't make that 30 fps a surefire 60+ FPS. surely your logic is so sound that you will find an explanation. game is broken, amirite?
 
Last edited:

Zathalus

Member
Tbf it's because we are aiming to compare the gpu and not the cpu. do you really think a 13900k is not helping this card at least somewhat, especially compared to the ps5 performance modes?
If the GPU is pushing every single frame it can, then putting in a more powerful CPU will net you exactly 0 extra fps. If the game is being limited by the CPU then absolutely a faster CPU would help, but all the games tested in the original video can do over 60 fps with an ancient Ryzen 5 2600, so I highly doubt the CPU of the PS5 matters here.

If the test included actual games that are heavy on the CPU then I would absolutely agree the test is useless; Flight Simulator drops on the XSX because of the CPU. But, as mentioned before, none of the games tested hit the CPU particularly hard.
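A simple way to frame it (this is just a mental model, not a measurement): whichever side is slower sets the framerate, so a faster CPU only matters when the CPU side is the lower of the two.

def observed_fps(cpu_capable_fps, gpu_capable_fps):
    # the slower of the two sides caps the output
    return min(cpu_capable_fps, gpu_capable_fps)

print(observed_fps(cpu_capable_fps=70, gpu_capable_fps=58))   # 58 -> GPU bound, a 13900K changes nothing
print(observed_fps(cpu_capable_fps=40, gpu_capable_fps=58))   # 40 -> now the CPU is the ceiling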
 

yamaci17

Member
i thought i did exactly that. But that's precisely why i said let's respectfully agree to disagree, because neither of us is getting through to the other.
no you didn't.

you denied that the 3070 should've gotten more FPS at 4k dlss ultra performance. show me a benchmark where,

a 3070 gets 28-32 fps at 4k and more than 90 fps with 4k dlss ultra performance in Paradiso at high settings. (because a 9x pixel reduction should warrant a 3x performance improvement, right??)

or explain how the 3060ti, going from 8.2 million pixels at 30 fps, cannot get over 60 fps with only 2 million pixels in the example I've given above.

spoiler: you can't.
at least not with your logic. you made this personal when you made that snarky comment that "muh cpu stinks". you started the disrespect, not me. upscaling being super heavy on the GPU is such a hard concept for you to accept that you find it more appropriate to attack me with snarky remarks instead. you've always denied, for some weird reason, how heavy upscalers are on the GPU. but this is an altogether new low for you.

you're ignoring plain stats where it is clearly GPU bound at 4k dlss ultra performance in my testing. Don't expect me to take you seriously going forward.
 
Last edited:

yamaci17

Member
here's one last test at lowered GPU clocks to make it AS GPU BOUND AS POSSIBLE. in this location, my CPU is able to render 70+ frames when the GPU isn't the limit

4k native (8.2 million pixels)
8reZ4TJ.jpeg


4k dlss ultra performance (0.9 million pixels)
ud6G6GT.jpeg


only 2.3x more GPU bound performance for a reduction from 8.2 million to 0.9 million pixels

"Regardless, if you don't think going from 8.2 million to 0.9 million just to double frames is a CPU issue then there's no point discussing this any further. It's plainly obvious to me, but I respect your opinion and let's just agree to disagree."

if anyone has the audacity to call the above test CPU bound, I don't know what to tell you.

a 9x pixel count reduction and only 2.3x more performance in a CONTROLLED FULLY GPU BOUND SCENARIO.

deal with it. or move on. don't waste any more of my time.
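using the same linear frame-time assumption as my earlier sketch (purely illustrative), you can back out what a 2.3x gain for a 9x pixel cut implies about the frame:

# if a ~9x pixel cut (8.2 -> 0.9 Mpix) only yields 2.3x more fps in a fully GPU bound scene,
# how much of the frame is resolution-independent work? solve f + p_hi = speedup * (f + p_lo)
# with the per-megapixel cost normalised to 1:
p_hi, p_lo, speedup = 8.2, 0.9, 2.3
f = (p_hi - speedup * p_lo) / (speedup - 1)      # ~4.7 "megapixel-equivalents" of fixed work
share_at_720p = f / (f + p_lo)
print(f"~{share_at_720p:.0%} of the 720p-internal frame time never scales with resolution")
# ~84% -> dropping the internal resolution further buys almost nothing, which is exactly what the screenshots show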
 
Last edited:

Bojji

Member
lol can we at least agree that the PS5 is CPU bottlenecked in this scenario?

3uyAvGS.jpg


Leave it to Richard to pair a $550 card with a $600 CPU. Im sure gamers everywhere are running this setup.

We can't agree; in this scene the game is at the lowest DRS resolution on PS5 and still drops frames, so that looks like a GPU limit.

Avatar is not very CPU heavy:

5eU0c9w.jpg


Another game, another native 4k 30 fps quality mode and this time an abysmal 720p internal 60 fps performance mode.

Timestamped:


pixel counts 8.2 million vs 0.9 million.

I wonder why these consoles are not GPU bound when pushing native 4k pixels and only GPU bound when pushing 720p.


I have no idea why S&B is 720p, it's super bizarre to me, but I'm 100% sure that it isn't CPU related; with FSR2 it's probably as heavy as 900p native. One thing that comes to my mind is that without DRS they wanted this game to hold a locked 60 all the time and adjusted the resolution to the lowest possible value to achieve that. It's strange because Anvil games have had DRS so far.

Many developers did something like that; DRS allows these consoles to have much better IQ.

Richard on that CPU-GPU limit in the 7900 GRE review:

JQ47duJ.jpg
42FyJ4c.jpg


 
Last edited:

yamaci17

Member
The 2700X is a weak CPU to be using with the 3070; it can't even push a 2060 to its limits.
you're moving the goalposts. i did the test at a specific point where I'm super GPU bound and the GPU is being fully fed. i could do the very same test I did above with a 5800x3d and the results would come out similar. when you're GPU limited, you're GPU limited.

GPU limited means getting 99% GPU usage, and that is really all there is to it.

the idiotic tests he keeps sharing show the 2600x running the game at 40 FPS. yet i'm not conducting the test there. I'm conducting the test in a place where I'm fully GPU bound, and as evidence, I've also shown I was able to get 70 frames per second CPU bound in that location.

IF my cpu is able to render 70 FPS in that very same location, what does 4k dlss ultra performance pushing 58 FPS on the 3070 have to do with my CPU? being at 58 FPS is way below what my CPU is capable of in that specific scene I'm benchmarking. how is this so unclear that you people make such remarks?? I'm literally showing i'm able to get 70 FPS at 1080p with 2 million pixels and unable to get 60 FPS with 0.9 million pixels at 4k dlss ultra performance. this is plain refusal to accept how heavy upscaling is.
 
Last edited:

icerock

Member
Absolutely retarded comparison. he doesn't state what CPU he's using but he's definitely not using that AMD 6700U chip from china that he knows for a FACT is the PS5 CPU. Then he wonders why the 60 fps and 120 fps modes are performing well. Well, why do you think, sherlock?

God i hate DF. Every time i stan for these fuckers they make me look like a fucking fool.

Oh and he didn't even downclock the 6700 to match the PS5 tflops. Not even for one test just to see how well the PS5 GPU compares to its PC counterpart with more dedicated VRAM and infinity cache.

To think he had 3 years to come up with the testing scenarios and completely blew it.

He's using a top-of-the-line CPU in the i9-13900K, with ridiculously fast 32GB of DDR5 RAM at 6000 MT/s.

Absolutely pointless comparison. Reminds me of the time Richard got an exclusive interview with Cerny before the PS5 launch where he asked how variable frequency works and had it explained by Cerny himself. He then decided to overclock RDNA1 cards and compare them to PS5 specs (the PS5 obviously uses RDNA2 and is designed to run at significantly higher clock speeds), only to arrive at the conclusion that the PS5 would be hamstrung due to "workload whereby either GPU has to downclock significantly or the CPU"

They are so transparent with their coverage, it's hilarious. Fully expect them to do comparison videos pitting the PS5 Pro against $1000 5xxx series GPUs from nVIDIA later this Fall.
 

yamaci17

Member
Bojji
4k fsr performance (1080p internal) is often heavier than native 1440p
4k fsr ultra performance is often as heavy as 1200-1300p despite what the internal resolution suggests

1440p fsr performance is often as heavy as native 900p (if you were referring to this one, yes)

but 4k dlss ultra performance is leaps and bounds heavier than 900p, that's for sure, it is not even a contest.

this is sadly a concept that certain people are unable to grasp, and refuse to acknowledge and comprehend. this is mostly because they think resolution is all there is to performance. they refuse to take the many things that do not scale with resolution at all into the mix and then complain when a supposed 720p does not perform like they think it should.
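for reference, here's the arithmetic behind those internal resolutions, using the published per-axis scale factors for the DLSS/FSR2 modes (quality ~0.67x, balanced ~0.58x, performance 0.5x, ultra performance ~0.33x). the upscale pass itself then runs at the OUTPUT resolution, which is exactly the cost that doesn't show up in the internal pixel count:

MODES = {"quality": 1 / 1.5, "balanced": 0.58, "performance": 0.5, "ultra performance": 1 / 3}

def internal_res(out_w, out_h, mode):
    # per-axis scale factor applied to the output resolution
    s = MODES[mode]
    return round(out_w * s), round(out_h * s)

for (out_w, out_h), mode in [((3840, 2160), "performance"),         # 4k performance
                             ((3840, 2160), "ultra performance"),   # 4k ultra performance
                             ((2560, 1440), "performance")]:        # 1440p performance
    w, h = internal_res(out_w, out_h, mode)
    print(f"{out_w}x{out_h} {mode}: {w}x{h} internal (~{w * h / 1e6:.1f} Mpix)")

# 3840x2160 performance: 1920x1080 internal (~2.1 Mpix)
# 3840x2160 ultra performance: 1280x720 internal (~0.9 Mpix)
# 2560x1440 performance: 1280x720 internal (~0.9 Mpix)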
 
Last edited:

Senua

Gold Member
He's using a top-of-the-line CPU in the i9-13900K, with ridiculously fast 32GB of DDR5 RAM at 6000 MT/s.

Absolutely pointless comparison. Reminds me of the time Richard got an exclusive interview with Cerny before the PS5 launch where he asked how variable frequency works and had it explained by Cerny himself. He then decided to overclock RDNA1 cards and compare them to PS5 specs (the PS5 obviously uses RDNA2 and is designed to run at significantly higher clock speeds), only to arrive at the conclusion that the PS5 would be hamstrung due to "workload whereby either GPU has to downclock significantly or the CPU"

They are so transparent with their coverage, it's hilarious. Fully expect them to do comparison videos pitting the PS5 Pro against $1000 5xxx series GPUs from nVIDIA later this Fall.
What are you suggesting? They're anti Sony?
 

Bojji

Member
Bojji
4k fsr performance (1080p internal) is often heavier than native 1440p
4k fsr ultra performance is often as heavy as 1200-1300p despite what the internal resolution suggests

1440p fsr performance is often as heavy as native 900p (if you were referring to this one, yes)

but 4k dlss ultra performance is leaps and bounds heavier than 900p, that's for sure, it is not even a contest.

this is sadly a concept that certain people are unable to grasp, and refuse to acknowledge and comprehend. this is mostly because they think resolution is all there is to performance. they refuse to take the many things that do not scale with resolution at all into the mix and then complain when a supposed 720p does not perform like they think it should.

Yep, reconstruction comes with a cost. Still, there is no logical reason for S&B to be 4K 30 in quality mode and internal 720p reconstructed to 1440p in performance mode; there is a lot of performance left between those two (in theory).

What are you suggesting? They're anti Sony?

For sure:

phil-spencer-digitalfoundry.gif


/s
 

Gaiff

SBI’s Resident Gaslighter
This discussion has gone on well past the point of productivity. It's getting to the point where we're showing a 2700X not being bottlenecked in areas where other people argue the PS5 is. While there might be contention over this matter, it doesn't change the final conclusion: the PS5 is either poorly utilized or poorly constructed. You know there's been a fuck-up when we have to run around testing archaic CPUs just to bring them down to the level of the PS5.

This reminds me of when we had to get a cheap-ass Intel G3450 to drop CPU performance to console levels and even then, they still mollywhopped the PS4/X1. At this point, it's on Sony or Microsoft for outfitting their machines with anemic CPUs. We can't just keep digging to the bottom of the barrel to find a CPU that's equal to them under the guise of fair comparisons, bottlenecking the PC GPU in the process.
 

yamaci17

Member
Tbf it's because we are aiming to compare the gpu and not the cpu. do you really think a 13900k is not helping this card at least somewhat, especially compared to the ps5 performance modes?
problem is actually more complex than that

here's why

at 4k 30 fps modes, ps5 probably uses lower CPU clocks/lower CPU power to fully boost GPU clock to 2.2 GHz
at 60 fps modes, it is highly likely that the CPU needs to be boosted to its maximum clock, and my guess is that the GPU clock takes a hit as a result (probably to 1.8 ghz and below). so technically, IT IS quite possible that the PS5 in performance and quality mode framerate targets does not have access to the SAME GPU clock budget. think of it like the GPU downgrading itself when you try to target 60 FPS and have to "allocate" more power to the CPU.

this is just one theory of mine, I can't say for certain. but it's likely. why? because the PS5 does benefit from reducing the resolution, even below 900p. if so, then it means it is... GPU bound. if it is gpu bound even at such low resolutions, then it means a better CPU wouldn't actually help, at least in those cases. but the reason it gets more and more GPU bound at lower resolutions is most likely the dynamic boost thing. the PS5 has a limited TDP resource and it has to share it between the CPU and GPU.

Imagine this: if your game is able to hit 50 or 60 FPS with a 3.6 GHz Zen 2 CPU, it means you can reduce the CPU clock down to 1.8 GHz and still retain 30 FPS. This means that at 30 FPS, you can cut CPU TDP as much as possible and push GPU clocks as close to peak as possible in the 4K mode.

in performance mode, you need all the power you can deliver to go to the CPU to make sure it stays at its 3.6 GHz peak boost clock. that probably lowers the ceiling for GPU clocks as a result.

this is a theory / perspective that actually no one has brought up in this thread yet, but it is potentially a thing. we know that smart shift exists, but we can't see what kind of gpu and cpu clocks games are running at in performance and quality modes. if CPU clocks get increased and GPU clocks get decreased in performance mode, you will see weird oddities where a big GPU bound performance delta appears between these modes. this is coupled with the fact that upscaling is HEAVY. when you combine both facts, it is probable that the PS5, while targeting 60 FPS, is having a tough time in general; whether it is cpu limited or not, that's another topic.
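here's the idea in code form, purely speculative (every number is made up for illustration; sony has never published the real power budgets or the clock/power curve):

TOTAL_BUDGET_W = 200                      # assumed shared CPU+GPU SoC budget, not a real figure

def gpu_clock_ghz(gpu_w):
    # toy assumption: power scales roughly with frequency cubed, pinned to 2.23 GHz at 170 W
    return min(2.23, 2.23 * (gpu_w / 170) ** (1 / 3))

for mode, cpu_w in [("quality mode, 30 fps target (CPU loafing)", 30),
                    ("performance mode, 60 fps target (CPU near max)", 55)]:
    gpu_w = TOTAL_BUDGET_W - cpu_w
    print(f"{mode}: GPU gets ~{gpu_w} W -> ~{gpu_clock_ghz(gpu_w):.2f} GHz")

# quality mode: GPU gets ~170 W -> ~2.23 GHz
# performance mode: GPU gets ~145 W -> ~2.11 GHz
the exact numbers don't matter; the point is just that the same silicon can have a smaller GPU clock budget in the 60 fps mode than in the 30 fps mode.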

this is why you cannot think of the ps5 as a simple cpu+gpu pairing.

if there were a way to see what kind of GPU clocks the PS5 is using in performance and quality modes in these games, it would answer a lot of questions about what is being observed here.

and this is why I find comments like "he didn't even match the GPU clocks to 2.2 ghz" funny. that assumes the ps5 is running at 2.2 ghz gpu clocks in performance mode. why be certain? why is smart shift a thing to begin with? can it sustain 3.6 ghz cpu clocks and 2.2 ghz gpu clocks at the same time while the CPU is fully used? I really don't think so. See how much CPU avatar is using. it must be using a lot of CPU on PS5 in the 60 FPS mode. so that means a lot of TDP is being used by the CPU. Then it is unlikely that the PS5 runs its GPU at 2.2 ghz in performance mode in avatar and skull and bones (another Ubisoft title that i'm pretty sure uses 70% of a 3600 to hit 60 fps on PC, just go check). maybe we can extrapolate how much of a GPU clock reduction the PS5 gets. but even then, it would be speculation.

it didn't matter up until this point, because even a 2 GHz zen 2 CPU was enough to push 60 fps in games like spiderman and god of war ragnarok (considering you can hit 30 fps with a 1.6 ghz jaguar in these titles). so even at 30 and 60 fps, games like gow ragnarok and spiderman probably had 2.2 ghz gpu clock budgets to work with. and now that new gen games are BARELY getting 60 fps on the very same CPU while using all of it (unlike crossgen games like gow ragnarok or hfw), we probably see the GPU budget being challenged by the CPU budget more aggressively in some of the new titles.
 
Last edited:

Loxus

Member
problem is actually more complex than that

here's why

at 4k 30 fps modes, ps5 probably uses lower CPU clocks/lower CPU power to fully boost GPU clock to 2.2 GHz
at 60 fps modes, it is highly likely that the CPU needs to be boosted to its maximum clock, and my guess is that the GPU clock takes a hit as a result (probably to 1.8 ghz and below). so technically, IT IS quite possible that the PS5 in performance and quality mode framerate targets does not have access to the SAME GPU budgets. think of it like the GPU downgrading itself when you try to target 60 FPS and have to "allocate" more power to the CPU.
This is how I interpreted how the PS5's variable clocks and Smart Shift work.

iwZS813.jpg

The CPU has a Max power budget of 50W.
30W is enough to run at Max clocks and the GPU can only take a maximum of 20W from the CPU.

Variable Clocks can also be for BC mode.
 

yamaci17

Member
This is how I interpreted how the PS5 Variable clocks and Smart Shift works.

iwZS813.jpg

The CPU has a Max power budget of 50W.
30W is enough to run at Max clocks and the GPU can only take a maximum of 20W from the CPU.

all good then. it might play a little role. i still believe most 720p games are gpu bound though. if not, then their devs are ultra dumb for pushing a lower resolution in a cpu bottlenecked case and making a bad name for themselves xD

is there no scenario where the cpu can use 50w though?
 
Last edited:

Bojji

Member
all good then. it might play a little role. i still believe most 720p games are gpu bound though. if not, then their devs are ultra dumb for pushing a lower resolution in a cpu bottlenecked case and making a bad name for themselves xD

is there no scenario where the cpu can use 50w though?

If smartshift were the culprit then the Series X version wouldn't be 720p as well; that console has fixed clocks.
 