They're spending about 14ms on inference and 3ms on everything else, which is technically about 60fps (~17ms/frame), but in any real-world application you're probably not going to be able to blow almost your entire GPU compute budget on shading a few pieces of cloth.
To their point, 30 fps is commonly acceptable, in which case you're spending 7-14ms on this, and 26-19ms on everything else. You can also use frame gen to cheat 30 fps back up to 60 if you're so inclined.
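For concreteness, the budget arithmetic looks roughly like this (a quick sketch in Python; the 14 ms and 3 ms figures are just the numbers quoted above, not benchmarks of anything else):

```python
# Rough frame-budget arithmetic for the figures quoted above.
# The 14 ms / 3 ms numbers come from the thread, not an official benchmark.

def frame_budget_ms(target_fps: float) -> float:
    """Total time available per frame at a given target frame rate."""
    return 1000.0 / target_fps

cloth_inference_ms = 14.0   # neural cloth sim, as quoted above
other_work_ms = 3.0         # everything else in the demo scene

for fps in (60, 30):
    budget = frame_budget_ms(fps)
    leftover = budget - cloth_inference_ms
    print(f"{fps} fps -> {budget:.1f} ms/frame, "
          f"{leftover:.1f} ms left after {cloth_inference_ms} ms of cloth inference")

# 60 fps -> 16.7 ms/frame, 2.7 ms left after 14.0 ms of cloth inference
# 30 fps -> 33.3 ms/frame, 19.3 ms left after 14.0 ms of cloth inference
```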
Interesting to hear that; the consumer market has been well past 30 fps for years, to my knowledge. For example:
- Phones are widely at 120 fps as of a couple years ago.
- Gamers have been targeting 240 fps for years and years, since 2018?
- On the low end, ex. Steam Deck requires you to manually opt in if you're happy with less than 60.
I'm curious which groups you're aware of that commonly accept 30. (note: not a trick question, ex. we know movies are 24 fps, and professional 3D renderers are happy with not-real-time, but we're thinking about 3D rendering for games)
It may be the case that high-end PCs with the latest expensive graphics cards can run modern games like Cyberpunk at 240fps using things like DLSS, but console games regularly run at 30fps in “quality” modes and 60fps in “performance” modes. 30fps is absolutely an acceptable frame rate for most people. If we didn’t have Digital Foundry and the like nitpicking dropped frames and counting pixels for dynamic resolution, I think even fewer people would care about high framerates, as long as the game is fun.
PS5 came out 4 years ago (November 2020) and has 4 games at 30 fps. Of 3140.[1]
PS5 and Series X both have an infamous problem with ports from older generations coming over at 30fps, because they were render-locked to that in the previous console generation, and no upgrade work was done during the port.
Beyond that, 30 fps comes about by opting into it, e.g. by choosing raytracing at 4K.
It's a massive reach to describe 30 fps as commonly accepted. And that's without considering the context: it's being used to argue for the usefulness of a cloth simulation neural network that takes 18 ms to render on a 3090, not 2020 consoles.
Moreover, those screenshots aren't showing a game with some clothing simulation. They're showing a nearly empty render except for some cloth simulation.
I think steelmanning is good and important for conversation, but it's hard to swallow here. It's blithe, and it takes too many liberties that are likely to shade rather than shed light.
> It's a massive reach to describe 30 fps as commonly accepted
Toy Story was rendered at 24 FPS and I don't know a single person who refused to accept it as-is. In fact, I hear more people complain when cinematics aren't run at a filmic framerate.
The end-to-end latency is horrible with 30fps. Move your mouse and you can feel the lag. In theory you can over-render the sides of the screen and move the rendered image around with the mouse but almost no one does that.
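As a sketch of that over-render-and-shift idea (purely illustrative; the margin size, mouse scale factor, and function names are all made up):

```python
import numpy as np

# Sketch of "over-render and shift": the renderer produces a frame with a
# margin of extra pixels on every side, and at scan-out time we crop it
# according to the most recent mouse movement instead of waiting for the
# next full render. All names and constants here are hypothetical.

MARGIN = 64             # extra pixels rendered on each side
PIXELS_PER_COUNT = 0.5  # hypothetical mouse-count -> pixel scale

def crop_with_mouse(overscanned: np.ndarray, dx_counts: float, dy_counts: float,
                    out_w: int, out_h: int) -> np.ndarray:
    """Shift the visible window inside the over-rendered frame by the mouse delta."""
    dx = int(np.clip(dx_counts * PIXELS_PER_COUNT, -MARGIN, MARGIN))
    dy = int(np.clip(dy_counts * PIXELS_PER_COUNT, -MARGIN, MARGIN))
    x0 = MARGIN + dx
    y0 = MARGIN + dy
    return overscanned[y0:y0 + out_h, x0:x0 + out_w]

# e.g. a 1920x1080 target rendered with a 64 px margin on every side:
frame = np.zeros((1080 + 2 * MARGIN, 1920 + 2 * MARGIN, 3), dtype=np.uint8)
view = crop_with_mouse(frame, dx_counts=20, dy_counts=-5, out_w=1920, out_h=1080)
print(view.shape)  # (1080, 1920, 3)
```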
> - Phones are widely at 120 fps as of a couple years ago.
They're only briefly at 120fps during important things like full screen scrolling. Phones are battery powered and passively cooled, so rendering at 120fps would be very wasteful even if you can do it.
The important thing about 120fps is that a lot of common frame rates are factors of it; you can't play a 24fps movie judder-free on a 60fps screen, but you can on a 120fps one.
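Spelling out the divisibility argument (a trivial check, assuming judder-free playback requires the panel rate to be a whole multiple of the content rate):

```python
# Which common content frame rates divide evenly into a 60 Hz vs 120 Hz panel?
# An even division means each source frame is shown for a whole number of
# refreshes; otherwise you get judder (e.g. 3:2 pulldown for 24 fps on 60 Hz).

content_rates = [24, 25, 30, 40, 48, 60]
for hz in (60, 120):
    even = [r for r in content_rates if hz % r == 0]
    print(f"{hz} Hz panel shows these rates without judder: {even}")

# 60 Hz panel shows these rates without judder: [30, 60]
# 120 Hz panel shows these rates without judder: [24, 30, 40, 60]
```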
All fair points. Phones have 120fps displays, but obviously can't run much at that rate: some web pages, basic note taking, etc. But pretty much anything 3D is 30-60fps.
Consoles and handhelds usually aim for 30+.
Real-time rendering, video streaming, etc. are 24-30.
If the nit is my phrasing, let me rephrase: the point is that it's usable in an interactive/real-time application. It's perhaps at the lower bar, but 30 fps is far from a rarity. Though not always ideal, it's certainly acceptable.
30 fps should not be acceptable given a 4090. Remember, most people have cards that are barely 15% the performance of a 4090, or less. The fact that almost all modern titles target the 4090 is a huge problem; only a very small number, like Roblox, actually understand how to optimize.
As a professional I usually use a 3090. I'm sure if I worked in fields that heavily relied on this sort of thing I'd have a 4090 or two no problem, but it's not that safe of a base assumption to make.