It may be true that high-end PCs with the latest expensive graphics cards can run modern games like Cyberpunk 2077 at 240fps with the help of DLSS, but console games regularly run at 30fps in “quality” modes and 60fps in “performance” modes. 30fps is absolutely an acceptable frame rate for most people. If we didn’t have Digital Foundry and the like nitpicking dropped frames and counting pixels to infer dynamic resolution, I think even fewer people would care about high framerates, as long as the game is fun.
The PS5 came out 4 years ago (November 2020), and of its 3140 games, only 4 run at 30 fps.[1]
PS5 and Series X both have an infamous problem with ports from older generations coming over at 30fps, because they were render-locked to that in the previous console generation, and no upgrade work was done during the port.
Beyond that, 30 fps only comes about when players opt into it by enabling raytracing at 4K.
It's a massive reach to describe 30 fps as commonly accepted. And that's without considering the context: it's being used to argue for the usefulness of a cloth simulation neural network that takes 18 ms to render on a 3090, not 2020 consoles.
Moreover, those screenshots aren't showing a game with some clothing simulation added. They're showing a nearly empty render that contains little besides the cloth simulation.
I think steelmanning is good and important for conversation, but it's hard to swallow here: the argument is made blithely while taking too many liberties, and it's likely to shade rather than shed light.
> It's a massive reach to describe 30 fps as commonly accepted
Toy Story was rendered at 24 FPS and I don't know a single person who refused to accept it as-is. In fact, I hear more people complain when cinematics aren't run at a filmic framerate.
The end-to-end latency is horrible at 30fps: move your mouse and you can feel the lag. In theory you can over-render beyond the edges of the screen and shift the rendered image around with the mouse between frames, but almost no one does that.
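The over-render-and-shift idea (similar to the asynchronous reprojection used in VR runtimes) can be sketched roughly as follows. This is a minimal illustration, not any engine's actual API: the frame is rendered with an extra margin on each side, and on every display refresh the presented viewport is cropped at an offset driven by mouse movement accumulated since the frame was rendered.

```python
# Sketch of "over-render and shift": render a frame larger than the
# viewport, then crop it at a mouse-driven offset each refresh so camera
# motion feels responsive even between slow (e.g. 30fps) renders.
# All names and sizes here are illustrative assumptions.

VIEWPORT_W, VIEWPORT_H = 1920, 1080
MARGIN = 128  # extra pixels rendered on each side of the viewport

def reprojected_crop(mouse_dx: int, mouse_dy: int) -> tuple[int, int]:
    """Top-left corner of the viewport-sized crop inside the over-rendered
    (VIEWPORT_W + 2*MARGIN) x (VIEWPORT_H + 2*MARGIN) frame, given mouse
    movement accumulated since that frame was rendered."""
    # Start centered (offset = MARGIN) and shift with the mouse, clamped
    # so the crop never reads outside the over-rendered frame.
    x = MARGIN + max(-MARGIN, min(MARGIN, mouse_dx))
    y = MARGIN + max(-MARGIN, min(MARGIN, mouse_dy))
    return x, y

# No mouse motion since the last render: crop stays centered.
print(reprojected_crop(0, 0))      # (128, 128)
# Fast motion saturates at the margin; a new real frame is needed soon.
print(reprojected_crop(500, -500)) # (256, 0)
```

The clamp is the technique's main limitation: once the mouse moves farther than the over-rendered margin, there are no pixels left to shift into view, which is one reason it is rarely used outside VR.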