DF: A Plague Tale: Requiem PC vs. Consoles Graphics Comparison

  • Thread starter Deleted member 13

Darth Vader

I find your lack of faith disturbing
Founder
20 Jun 2022
To me that's bashing the developer and what they put out. They clearly have pushed graphics to the next level, and that's with a custom proprietary engine. They should be highly regarded along with the first-party developers that you guys respect.

And there you go again with "you guys".

I respect what they did. Saying a game is poorly optimized is not shitting on them, it's stating a fact, a fact proven by the latest patch.

For example, I criticised Forbidden West heavily for its bugs and graphical glitches. They detracted from my experience. Many people criticised GoW Ragnarok for not pushing the graphical envelope on PS5.

Spare me the lecture.
 

Old Gamer

Veteran
Founder
5 Aug 2022
I respect what they did. Saying a game is poorly optimized is not shitting on them, it's stating a fact, a fact proven by the latest patch.
Well, they did fix a part of the engine, and all platforms run better as a result; that's definitely a good thing.

Having said that, Alex is trying to draw comparisons that don't really apply. He takes the 25% difference in framerates and tries to correlate it to console specs. First, the TFs differ by less than 20%, as he points out, so maybe it's the memory bandwidth. You can do that kind of comparison with certainty across PC configurations running the same game build on the same OS (and even then, you need to be informed about the game engine's behaviour), but not across completely different platforms. If you have to guess why, it means you don't know.
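Just to make the arithmetic concrete: a quick back-of-envelope sketch, using the commonly quoted peak FP32 figures for the two consoles (12.15 TF and 10.28 TF; the 25% figure is the framerate gap claimed in the video, not something I measured):

```python
# Back-of-envelope check: a ~25% framerate gap vs. a <20% compute gap.
# TF figures are the publicly quoted peak FP32 numbers; treat them as nominal.
xsx_tf = 12.15   # Xbox Series X peak teraflops
ps5_tf = 10.28   # PS5 peak teraflops

tf_advantage = xsx_tf / ps5_tf - 1   # relative compute advantage
fps_advantage = 0.25                 # framerate gap claimed in the comparison

print(f"TF advantage:  {tf_advantage:.1%}")   # ~18.2%
print(f"FPS advantage: {fps_advantage:.1%}")  # 25.0%

# If performance scaled purely with peak compute, the framerate gap could
# not exceed the TF gap -- so something else (bandwidth, API overhead,
# engine code paths) must account for the remainder.
assert fps_advantage > tf_advantage
```

Which is exactly the point: the framerate gap is larger than the compute gap, so compute alone can't be the explanation.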

You can't draw that conclusion and pin it on hardware specs alone, because you don't know the software implementation details - the SDKs are very different, with different sets of middleware. Again, based on past practices, developers very often take their DirectX implementation and use an adapter layer such as GNMX to translate from DirectX to what PlayStation uses. Some developers use this as an intermediary step before porting the code to something based on GNM, others don't - and this is not dissing any developer for choosing the latter, as there are plenty of legitimate reasons to cut down on time spent porting to a platform. That alone would very often mean the PlayStation console is fighting a very different battle in terms of CPU time.
How much overhead this adds to your engine's performance on PlayStation depends on the engine implementation, so it's all second-guessing really.

And yes, it goes both ways. A game running better on PS5 (and there are more of those than some people here would have you believe) does not prove PS5 hardware superiority. It just proves that that particular game engine runs better on that combination of hardware and software. There are no "correlations" to look for, and I've certainly never seen DF look for them when that is the case (frequency? fill rate?), only in the opposite scenario, such as with this game or the Control photo mode.

Something I've noticed since 2020 is how some people (including people who should by all means know better) are desperately looking for the teraflops difference to translate into a bigger outcome than it actually does this gen. Any reasonable person should expect the pure rasterization gulf from 10TF to 12TF to be narrower than the one from 4TF to 6TF. In any case, full compute-unit saturation is neither a realistic nor an expected outcome of a well-optimized game engine.
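The scaling point is just ratios; the TF values below are the round illustrative numbers from my post, not any specific console's specs:

```python
# Same 2 TF absolute gap, very different relative advantage.
def uplift(low_tf: float, high_tf: float) -> float:
    """Relative compute advantage of the higher-TF part over the lower one."""
    return high_tf / low_tf - 1

last_gen_style = uplift(4.0, 6.0)    # 4 TF -> 6 TF
this_gen_style = uplift(10.0, 12.0)  # 10 TF -> 12 TF

print(f"4 -> 6 TF:   {last_gen_style:.0%}")   # 50%
print(f"10 -> 12 TF: {this_gen_style:.0%}")   # 20%

# The higher the baseline, the less the same absolute TF gap buys you.
assert this_gen_style < last_gen_style
```

A 2 TF gap on a 4 TF baseline is a 50% advantage; the same 2 TF on a 10 TF baseline is only 20%. That's before you even get to saturation, bandwidth, or software differences.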