I blame Mark Cerny for not thinking about system-wide libraries. Sony always goes for a dev-first approach, which is very bad IMO.
In fairness - this predates Cerny's role in system design - Sony's done dev-first all the way back to the original PS1. And a lot of what made the PS2 special was thanks to this approach (though admittedly, many titles also suffered because of it).
60fps patches would be streamlined if they could be forced system-wide, for example.
This is a bit more questionable though. No console to date has done this (none of Microsoft's hardware either), among other reasons because it handicaps one of the main advantages developers have on consoles: a fixed, known target. But ok - if Sony actually opened up the full set of refresh rates that displays support, I'd be willing to concede this could be a win.
There's a LOT of games out there that would run perfectly at 50fps with none of the VRR flicker and dimming issues (and smoother than fake 40fps modes) - but console makers refuse to give it to us.
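To put numbers on the 50fps argument, the per-frame time budgets at common target rates work out like this (a quick arithmetic sketch, nothing platform-specific):

```python
# Per-frame time budget at common target rates
for fps in (60, 50, 40, 30):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 60 fps -> 16.7 ms per frame
# 50 fps -> 20.0 ms per frame
# 40 fps -> 25.0 ms per frame
# 30 fps -> 33.3 ms per frame
```

A game that misses the 16.7ms budget but comfortably fits in 20ms is exactly the case being described: it could lock to a flat 50fps on a 50Hz-capable display instead of juddering at an uneven 50-something.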
Same goes for PSSR version updates.
This can't be system-forced - titles that were tested against one version of PSSR can't be revalidated if the system forces an update. However, I could see a rationale for allowing users to select the PSSR revision themselves - it's very PC and not at all console-like, but that ship sailed when the first Pro console launched, and we have HDR and other system settings now.
Agreed - they need system-wide toggles for capping frame rates and resolutions, while letting the developer control the default settings.
I mean, in a world where most 3rd-party engines can't frame-pace for shit out of the box, I'm really not sure I'd want to add PC-style driver hacks on top of that in consoles.
The one thing I actually liked (as a dev) on PS was having full control over the swap-chain. No system/driver sneaking in extra frames of latency or doing something backstabby on present - even the ability to run my own async-present cycles, etc.
Yeah, maybe it's quaint in the 'PC for everyone' world, but I really liked when console dev was still about caring about these things, and we didn't need a GPU vendor to ship entire 'anti-lag' libraries to do the same thing with more obfuscation and less control.
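To make the pacing point concrete: when you own the present cycle, the core of frame pacing is just advancing a fixed deadline per frame instead of free-running. This is a minimal sketch, and everything in it (the `render`/`present` callbacks, the target rate) is hypothetical, not any console or driver API:

```python
import time

TARGET_HZ = 60            # hypothetical fixed target rate, not a real platform constant
FRAME = 1.0 / TARGET_HZ   # per-frame budget in seconds

def run_frames(n_frames, render, present, clock=time.monotonic, sleep=time.sleep):
    """Pace n_frames against fixed deadlines.

    Deadlines advance by exactly FRAME each iteration, so one late frame
    doesn't push the whole schedule back (no drift accumulation), and
    present() is called on our own cadence rather than the driver's.
    """
    deadline = clock() + FRAME
    for _ in range(n_frames):
        frame = render()
        remaining = deadline - clock()
        if remaining > 0:
            sleep(remaining)      # wait for our own deadline, not an implicit queue
        present(frame)
        deadline += FRAME         # fixed cadence, independent of render jitter
```

The point isn't that this is hard - it's that it only works when nothing between `present()` and the display is buffering frames behind your back, which is exactly the control being described.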