In a similar vein, Arkham Knight (and in some cases Arkham City) looked worse in cutscenes if you maxed out the graphics settings. Obviously not if you ran it on a potato, but the games are reasonably well optimized these days*.
*At launch, Arkham Knight was an unoptimized, buggy mess. It has since gotten much better.
I am playing through Rise of the Tomb Raider in 4K and having a similar experience. I think the cutscenes are in 1080p.
Wait, you mean the game's gameplay looks better than its actual cutscenes?
But how? Does the game use FMV for the cutscenes or something?
The cutscenes were rendered using certain graphics settings that you could exceed by maxing out your own. Plus, because it was a pre-rendered video, there must have been some compression involved; you could just tell when you were in a cutscene, since it was grainier and had a smidge of artifacting. Don't quote me on this, but I believe the cutscenes were rendered at something like 1080p, so playing at 4K would make for a very noticeable downgrade (rough pixel math in the sketch below). Note that I did not and still do not have a 4K monitor.
Although thinking about it again, I do vividly remember some in-engine cutscenes in Arkham Knight. I'll have to replay that game sometime to jog my memory.
On PS5, Hogwarts Legacy runs at 60 fps, but the cutscenes play at 30 fps.