It's an ugly game.
It gives me a headache looking at it. Like there's a filter over the image that can't be disabled. World looks a million times better.
Smudging algorithms.
40 FPS being "most of the way".
Guys, whatever you are smoking, I want some.
Sure, but what's the point of a higher resolution with such low-resolution textures?
I remember when this 40fps talk came about because of the Steam Deck and 120hz TVs. I tried it one time with Spiderman on my big screen before I sold my PS5 and was like "Hell naw". It's cope in its rawest form.
Agreed. I tolerate it in some games on the Pro, but you feel the difference when coming back to PC. I totally support anyone on that platform who's like "fuck no". Once you get used to anything above 120fps, it just feels off to go lower. But below 60 on PC? Forget about it.
Why is the contrast so awful?
I think they made the same mistake they made with World: volumetric fog up the ass everywhere, which makes everything look dull and colorless. Plus the color tonemapping is horrible. No contrast. And I don't get it, imo Dogma 2 looks better than this.
Yeah, I don't get it. To me it's the biggest turn off, more so than performance.
Capcom magic. This is the worst looking RE Engine game that has the highest requirements.
I wonder to what extent this was intentional: show a drab, low-contrast environment and then switch to vibrant biomes for visual variety, and then they totally forgot or didn't have time for the vibrant ones. It does seem to look better in other stages... but... yeah... lol
Has anyone tried disabling Control Flow Guard for this game, in the Windows Exploit Protection options? This can help with some games.
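For reference, per-executable exploit mitigations can also be toggled from an elevated PowerShell prompt instead of the Exploit Protection UI. A minimal sketch, wrapped in Python; the executable name below is an assumption, so check the actual binary in the game's install folder before trying anything like this.

```python
# Hedged sketch: toggle Control Flow Guard for a single executable via Windows'
# built-in Set-ProcessMitigation cmdlet. Requires an elevated prompt (Windows 10+).
# "MonsterHunterWilds.exe" is an assumed process name -- verify it locally.
import subprocess

GAME_EXE = "MonsterHunterWilds.exe"  # assumption, check the install folder

def set_cfg(exe: str, enable: bool) -> None:
    """Enable or disable CFG for one executable using PowerShell."""
    action = "-Enable" if enable else "-Disable"
    subprocess.run(
        ["powershell", "-Command",
         f"Set-ProcessMitigation -Name {exe} {action} CFG"],
        check=True,
    )

if __name__ == "__main__":
    set_cfg(GAME_EXE, enable=False)   # try the "fix"
    # set_cfg(GAME_EXE, enable=True)  # revert if it makes no difference
```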
If you are still on an 8GB card in 2025 and expect good results, it's only your fault. The writing was on the wall at least since the beginning of 2023. Consoles have ~13GB available to devs, so no wonder 8GB is not enough in non-cross-gen games. This game has more problems than high VRAM requirements (while still looking like crap), but this is one of the major ones.
It's one of those fixes that some people say helps, but I've never seen any difference from disabling it in the past 10 years. There are a number of these legendary "fixes". I wouldn't waste my time; it's not the problem.
And you know what this engine can do on PC, so it makes it that much more disheartening!
Well, at least the PCMR with their $5k, 2000W GPUs are proud that they can run a PS3-looking game at 100 fps. The only thing that matters.
Probably not best to do it in an online game.
Global illumination depends on the weather. The game starts with cloudy weather, as a sandstorm is not far off, which gives the game a washed-out / grey look. As the main story advances, clear-sky weather starts to appear after the sandstorm event, and that's when the color saturation looks good. So, an artistic choice.
Smoking a blunt filled with cope (and a little cocaine). A primo.
Smoking an opium-laced, 100% marijuana joint. High and don't give a fck.
Like any game, just wait for patches/updates.
I'd be genuinely shocked if they manage to achieve acceptable performance within the next 12 months.
For an open world game, actually I don't. DD2 looked better, but had a whole bunch of CPU issues, right? It just seems like RE can't handle the scale and some major compromise is being made each time. It was CPU for DD2. It seems to be streaming and lighting for this one.
I wouldn't say proud. Why would anyone be proud? But yes, I am glad it runs and looks much better on my PC than it does on your PS5 Pro.
Eh, sorry, but I'm not with you on this one. While it's true that an 8 GB budget is below consoles, the 4060 is the most popular card and the 5060 8 GB will be similarly positioned.
It's not that people are asking for the high quality textures that consoles are running, but games having to run with PS2/PS3-era textures is not good. Games like Avatar, Indiana Jones and Star Wars Outlaws showed that you can scale textures much better for 8 GB GPUs.
Then again, that 13 GB budget is for the whole game on consoles. There are a lot of things that can just be put in normal RAM on PC (at least 2-3 GB of it). I really think most AAA games use around 9-10 GB for GPU data and 3-4 GB for CPU data on consoles. Going by that logic, all you need is to scale 2-2.5 GB worth of GPU data for 8 GB cards. It shouldn't be that hard, especially when they target a 1440p/4K upscaled buffer with that 9-10 GB GPU budget. Targeting 1080p on 8 GB GPUs should already save a lot of VRAM (around 1-1.5 GB compared to 1440p and 2-3 GB compared to 4K). So a game that can run at 4K/30 FPS on consoles with a 10 GB GPU budget should already run fine at 1080p with 8 GB VRAM (and in most games, this has been the case).
Which is ultimately why we keep seeing 8 GB cards being released: it shouldn't be that hard to scale games for 8 GB at a 1080p target with decent looking textures. Many games have achieved this, and will keep doing so. Then again, yeah, I can agree with "don't expect good results", but I cannot agree with PS3-like textures.
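To make the arithmetic above concrete, here is a rough back-of-the-envelope sketch. Every figure is an estimate taken from this thread (the ~13 GB console budget, the assumed CPU-data share, the per-resolution savings) plus an assumed ~1 GB of VRAM reserved by the OS; none of it is measured from a real game.

```python
# Rough back-of-the-envelope sketch of the VRAM budget argument above.
# All numbers are the thread's estimates, not measurements from any specific game.

CONSOLE_TOTAL_GB = 13.0      # memory available to devs on current consoles
CONSOLE_CPU_DATA_GB = 3.5    # assumed portion that would live in system RAM on PC
console_gpu_budget = CONSOLE_TOTAL_GB - CONSOLE_CPU_DATA_GB   # ~9.5 GB of "GPU data"

# Estimated VRAM saved by rendering at a lower output target than the console's
# 1440p/4K upscaled presentation (render targets, upscaler buffers, LODs).
savings_vs_console_target_gb = {"4k": 0.0, "1440p": 1.5, "1080p": 2.5}

def fits(card_vram_gb: float, output_res: str, os_overhead_gb: float = 1.0) -> bool:
    """Does the scaled-down console GPU budget fit in this card's usable VRAM?"""
    usable = card_vram_gb - os_overhead_gb
    needed = console_gpu_budget - savings_vs_console_target_gb[output_res]
    return usable >= needed

for vram in (8, 10, 12):
    for res in ("1080p", "1440p", "4k"):
        print(f"{vram} GB card @ {res}: {'ok' if fits(vram, res) else 'short'}")
```

With these assumptions an 8 GB card only clears the bar at a 1080p target, while 12 GB clears it everywhere, which is roughly the split the two posters are arguing over.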
Many console games target ~1080p with that 13GB budget, so you really have no room here...
Let's assume that devs really can't realistically use more than 10GB as VRAM.
- 8GB cards have less than 7GB usable for games.
- 10GB cards have something like ~9GB.
That's still below consoles. 12GB is THE MINIMUM for this gen to play using console settings. Sure, UE5 games are usually very good with VRAM, but that's the exception, not the norm.
In an ideal world devs would make good textures for 10, 8 and even 6GB cards, because we had good textures ~15 years ago (on 1GB!), BUT current developers are talentless hacks. So always expect the worst from them!
Nvidia are bastards, this one isn't new. They have been screwing xx60 users for years now (with the exception of the 3060 12GB). Every new console gen, buy a card with the same amount of memory the consoles have and never look back.
This has to be one of the most unoptimized AAA games ever. Definitely up there with gems such as Gotham Knights and Skylines 2. The MH developers at Capcom don't seem technically capable of properly utilising the RE Engine. They would be much better off using UE5, even though that engine is pretty heavy as well. At least then there would be far more resources available to help them troubleshoot performance issues. Hopefully they rebuild the game in UE next time round. It's unfortunate, as the devs are in for a difficult few months, which will undoubtedly affect their existing content schedule.
It's coming to something when people are asking for UE5 to save PC gaming.
I definitely don't take any pride in the performance of this. None. But when there are open world games that look better AND run better, you know it's the game's fault and not the PC.
That was more along the lines of my point. Because this engine is usually a spectacle on PC, that it struggles with these types of games sucks ass!
There's a good reason why the PS5 Pro is selling almost as much as the base PS5 now in Japan. Land of the rising sun and framerates.
In Japan right now, PCs and consoles are expensive due to the significant depreciation of the yen.
How do I force the shader compile process?
Dunno how to do it for a particular game, but I believe that updating (or maybe just reinstalling?) your GPU drivers should clear all the shader cache from your PC.
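If reinstalling the driver feels heavy-handed, deleting the driver's shader cache folders usually forces the driver-side shaders to rebuild on the next launch (whether it retriggers this game's own pre-compilation step is a separate question). A hedged sketch, assuming an NVIDIA GPU and the commonly used cache locations below; verify the paths on your own machine before deleting anything.

```python
# Hedged sketch: wipe the NVIDIA driver shader caches so they are rebuilt on the
# next launch. Cache locations vary by driver version and GPU vendor; the paths
# below are common NVIDIA locations and should be checked before deleting.
import os
import shutil
from pathlib import Path

CACHE_DIRS = [
    Path(os.environ.get("LOCALAPPDATA", "")) / "NVIDIA" / "DXCache",
    Path(os.environ.get("LOCALAPPDATA", "")) / "NVIDIA" / "GLCache",
    Path(os.environ.get("PROGRAMDATA", "")) / "NVIDIA Corporation" / "NV_Cache",
]

for cache in CACHE_DIRS:
    if cache.is_dir():
        shutil.rmtree(cache, ignore_errors=True)  # the driver rebuilds this on demand
        print(f"Cleared {cache}")
    else:
        print(f"Not found (skipped): {cache}")
```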
Nope. Internal resolution doesn't matter that much for VRAM usage; I'm speaking from actual experience.
I can run Alan Wake 2 with path tracing on my 3070 at native 1080p without FPS dropping to single digits.
With 4K DLSS Ultra Performance (720p internal), I'm dropping to single digits; only by reducing texture quality can I get much higher framerates with 4K DLSS Ultra Performance than at native 1440p. 4K output is too costly on VRAM, and upscaling doesn't save that much VRAM. In other words, 1080p is so light on VRAM compared to 4K DLSS Ultra Performance that not only can I use higher quality textures, my framerates don't tank.
One thing you don't seem to understand is that running games at 4K output, regardless of the internal resolution, loads the much higher quality textures the engine has to offer, higher LODs and overall higher image fidelity. Running at 1080p output, even at native 1080p, is a huge reduction in image and texture quality by itself.
Or an even more recent example, Indiana Jones: I cannot even run the game with 4K DLSS Ultra Performance and low textures, it tanks to single digits, yet I can run it just fine at native 1440p with the medium texture setting. If these two examples don't make the distinction clear to you, I don't know what will.
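A small illustration of the render-target side of that argument. The buffer count and pixel format below are invented for the example, and in practice the bigger 4K-output cost is the higher-detail textures and LODs the engine streams for a 4K presentation, as described above; treat the numbers as illustrative arithmetic only.

```python
# Illustrative arithmetic only: why output resolution still costs VRAM even when
# the internal render resolution is low. Buffer counts and formats are invented
# for this example -- real games vary widely.

def buffer_mb(width: int, height: int, bytes_per_pixel: int) -> float:
    return width * height * bytes_per_pixel / (1024 ** 2)

# An upscaler still keeps full-output-resolution buffers (final color, history,
# UI and post targets), here modelled as a handful of 8-byte-per-pixel targets.
OUTPUT_BUFFERS = 6
BYTES_PER_PIXEL = 8  # e.g. RGBA16F

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    total = OUTPUT_BUFFERS * buffer_mb(w, h, BYTES_PER_PIXEL)
    print(f"{name} output: ~{total:.0f} MB just for output-resolution targets")
# 4K works out to roughly 4x the 1080p figure, before counting the higher-detail
# textures and geometry the engine streams for a 4K presentation.
```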
I know what you are talking about: 4K reconstruction from 1080p takes more VRAM than native 1080p with 1080p output.
But many console games don't even target 4K with reconstruction, and the VRAM used for the framebuffer is not that big compared to other things.
8GB cards are obsolete and most tech YouTubers agree with that. Buying an 8GB card even for 1080p is a mistake in 2024/2025.
Why Nvidia even still makes 8GB cards is beyond me...
The 3070 was really fucked by Nvidia; it still has decent power, but it's super limited in many new games when it comes to some settings.
I just reinstalled World/Iceborne and it looks miles better than this mess...
I have a 3070 in my gaming laptop and it works just fine as long as you don't use Ultra textures in some games.
My game completely skipped the shader compile process and runs fine for now.
- Shader pre-compilation takes 6 minutes on a 9800X3D and more than 13 minutes on a Ryzen 3600
- Awful textures even with the texture setting on High
- Major frame time spikes when turning the camera on a 4060-powered PC at High settings and 1440p DLSS Balanced
- The same awful textures are present using a 12GB RTX 4070 GPU
- Have to lower texture quality to Medium on 8GB GPUs to avoid stutters and frame time spikes, but they look even worse. Quickly panning the camera still causes frame time spikes, however
- When moving the camera slowly, those frame time spikes don't happen
- The frame time issue remains even with low textures
- This problem is mitigated on an RTX 4070. Alex suspects it's a streaming issue where the GPU decompression load is too much for lower-end GPUs and causes massive frame time spikes
- Does not recommend the game for 8GB GPU owners and advises caution even for upper mid-range GPUs like the 4070
- The game seems broken on Intel GPUs. Arc 770 is in the 15-20fps range with missing textures and other visual bugs
- You can brute-force those issues to a degree on a high-end system
- Right now, he cannot do optimized settings because it results in the game looking visually unacceptable
Another L for PCs
Nah, almost all games target 4K with reconstruction in their quality mode on consoles. Monster Hunter Wilds is the exception, not the norm. I don't remember any other console game not targeting 4K with reconstruction in its quality mode. List me 5 games that have no mode targeting 4K with upscaling. At this point you're arguing in bad faith, and you really don't know or understand what I'm actually talking about, so have a good day.
Also, good luck with "obsolete" 8 GB VRAM on the laptop RTX 5070, the desktop RTX 5060 and the existing 4060s.
No matter what you or tech YouTubers may believe, most games can and do scale decently to 8 GB at a 1080p output, hence why such cards are still being released. So yeah, it is beyond you; what is beyond you is the concept of scalability.
It's not like I'm suggesting anyone get an 8 GB GPU. I don't really care, but it's not like there are many options either. What "obsolete" are you talking about? So everyone should get a 4070 or 5070 as a minimum? What mistake are you talking about? Who in their right mind would spend so much money on a 4070 or 5070 just to play at 1080p? 8 GB has its niche at 1080p and it will stay there for a while, whether you like it or not.
1080p fine with 8GB?
[screenshots]
YOU are fine with it, great. I bet all those uninformed people that will buy 5060 this year will be "fine" too.
Intel released 10GB and 12GB GPUs recently.
Nice one, using maxed-out textures and settings that are often better than the consoles' (in texture quality or in general quality). Not to mention almost all the games you listed provide decent texture quality options that run fine on 8 GB cards even at 1440p. I never talked about using maximum settings or maximum textures. I didn't even talk about console-equivalent textures.
"It's not that people are asking for the high quality textures that consoles are running"
You're going directly to the ignore list.
Don't ask me; it is a bit itchy in the village, but the open world and the hunts are pretty smooth. 4K DLSS Quality, almost everything maxed out except shadows, clouds and distant shadows, around 60-80 frames, no frame gen or Reflex, I let G-Sync smooth things out.
Most of the time those GPUs have enough power to play with high-res textures, just not enough VRAM... And most of those "ultra"/"very high" textures are exactly console quality; anything below means worse textures. Limit yourself to 1080p and then to medium (or lower) textures. What a great experience.
I'm kinda with him. Nobody can say it's perfect, but so far it's fine for most 8GB graphics cards, though I would never recommend getting something like a 5060 8GB at this point, or even a 5070 12GB, since it's a powerful card and frame gen drinks some VRAM too. But for now, all these images could be fixed just by adding DLSS Quality; you lose (or not) a bit of image quality but gain extra VRAM and a good chunk of frames. And in the worst of cases, just change textures or whatever from ultra to very high or high. Yes, consoles in quality mode can run higher textures, but they also run at 30 FPS in those modes.
No idea what he talks about, but none of those images are representative of what settings people actually run with 8 GB cards. No one in their right mind will play Cyberpunk with path tracing on a 4060; even when you have enough VRAM, the performance isn't there. And you can't even run the ultra texture pool in Indiana Jones with 12 GB cards. I've played the game to 100% completion with the medium texture pool option and only came across two poor textures the entire time.
95% of games from N64 to PS5 look better than this. It's tragic how bad it looks. Monster Hunter Tri on the Wii looks superb compared to this abomination.