I think you're looking back with rose-tinted glasses.
The 360/PS3 generation was a huge jump forward but very limited by today’s standards. RDR was one of the better-looking games of the generation but could not maintain a steady 30fps at 1080p/i (and I’m not sure it was even true 1080).
The PC version came later, with higher-resolution textures and other graphical improvements, so it compares more favourably to modern games when you play it today. It still had problems running on all but the highest-end PCs of the time.
Of course even low-end PCs can run it without breaking a sweat, because they’ve become much more powerful.
Most Xbox 360 and PS3 games were 720p at 30fps. 720p was mostly fine because 1080p TVs were luxury items back then.
The performance problems in modern games are often not caused by fillrate-vs-resolution bottlenecks though, but by poor engine architecture decisions (triggering shader recompilations in the hot path).
Shader recompilation causes stuttering, not general performance problems. Shader complexity will, though, and that’s a function of render quality.
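To make the distinction concrete, here’s a minimal sketch (my own illustration, with made-up timings) of why compile-on-first-use shows up as frame spikes rather than a uniformly lower frame rate:

```python
# Hypothetical model: each frame pays a steady draw cost, plus a one-off
# compile cost the first time a new shader/pipeline is encountered.
COMPILE_MS = 80.0   # one-time hitch when a pipeline is compiled mid-frame
DRAW_MS = 12.0      # steady per-frame cost (this is what complexity raises)

compiled = set()
frame_cost_ms = []
for shader in ["a", "a", "b", "a", "b", "c", "a"]:
    cost = DRAW_MS
    if shader not in compiled:   # first use -> synchronous compile
        cost += COMPILE_MS       # visible as a stutter on this frame only
        compiled.add(shader)
    frame_cost_ms.append(cost)

print(frame_cost_ms)  # [92.0, 12.0, 92.0, 12.0, 12.0, 92.0, 12.0]
```

The spikes appear only on frames that first use a new shader; raising `DRAW_MS` (shader complexity) instead lowers the frame rate everywhere.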
But I’m confused about why you think fill rate isn’t an issue? If you’re upgrading from 1080p to 4K, your GPU needs at the very least 4x the pixel-pushing power, and even then that only maintains the same detail; you bought a 4K screen for more detail.
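For what it’s worth, the 4x figure is just the pixel-count ratio:

```python
# Pixel counts per frame at each resolution.
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_4k = 3840 * 2160      # 8,294,400
print(pixels_4k / pixels_1080p)  # 4.0 -- four times the pixels to shade
```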
> But I’m confused about why you think fill rate isn’t an issue?
Because this can be easily dealt with via upscaling or buying a more expensive GPU, but fixing shader recompilation in the hot path requires a complete engine redesign.
There aren’t faster GPUs affordable to most consumers, that’s the point. Yes, DLSS is used as a crutch because it’s easier to do AI upscaling than render at a higher resolution.
You don’t need a full engine redesign. UE5 provides tools for PSO bundling and pre-caching, but you need to use them.
Also, good material design and structure help reduce the number of PSOs needed, but again, you need to know how the engine’s material system works.
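As a rough sketch of what “using them” looks like, these are the console variables I’m aware of (assuming UE 5.1+; check the exact names and file locations against the engine docs for your version):

```ini
; Config/DefaultEngine.ini -- hedged example, not a complete setup
[SystemSettings]
; Load a bundled PSO cache shipped with the game so pipelines
; compile during load screens instead of mid-gameplay
r.ShaderPipelineCache.Enabled=1
; UE 5.1+ automatic PSO precaching based on material/mesh usage
r.PSOPrecaching=1
```

Bundled caches also need PSO data gathered from playtests and packaged with the build; the cvars alone don’t eliminate first-use compiles.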
Presumably people do this because they hate money; as you say, it's much harder to make the pixels just slightly more crisp and you'll pay dearly for the privilege.
I might be misremembering, but I seem to recall most games of that era being 540p scaled up to 1080p; 720p would have been an upgrade. But your point still stands.
Remarkably, RDR1 was only released for PC late last year, ~14 years after the original release.
Maybe that is even related to its good performance on consoles back then: Rockstar invested a lot of development time and sacrificed portability for performance. Basically the opposite of what modern games achieve with Unreal 5.