Durandal wrote:
Game development is already hitting a ceiling when it comes to increasing graphical fidelity.
That's been a fairly common idea for some time, but new GPU hardware keeps coming anyway. Almost nobody really intends to buy another expensive upgrade, do they?
All the same, I don't think the upgrade cycle is going anywhere. There's also the fact that us olds witnessed huge jumps in gaming's infancy that just aren't realistic anymore; improvements are more subtle these days. I'm spoiled, because I remember when a new game could make your jaw drop. I can't remember the last time a game's visuals blew me away on raw technology alone--art design is king now. On the other hand, it's pretty easy to boot up a nice-looking older game like Prince of Persia: The Sands of Time followed by the latest Assassin's Creed and see how all those incremental visual improvements have stacked up.
Machine learning matters here for more reasons than you mentioned. ML is going to start taking on a lot of the grunt work of asset creation and make artists more productive. In fact, I expect art teams will eventually shrink--the computers will "take yer job". AAA titles will continue to push the envelope, with ML doing a lot of the work on the art. The core of the design team won't need as many assistants; once the concept art is in place, the machine will do much of the rest. The ML army won't ask for a living wage.
There's also "superfluous" detail that will eventually get more attention. For instance: now that I can have a computer design entirely unique trees, down to individual leaves, I need to render them.

So pretty! Diminishing returns or not, devs will do it if it's essentially free--and ML will eventually make extremely complex, uniquely crafted common assets very cheap for large development houses.
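For a concrete, toy version of "the computer designs every tree", here's a Python sketch of seed-driven procedural generation. All the names, structures, and numbers in it are made up for illustration--this has nothing to do with any actual studio tool or ML pipeline. The point is just that one cheap integer seed deterministically expands into one unique asset:

```python
# Toy illustration of "every tree unique": a seeded recursive branch
# generator. One integer seed in, one distinct-but-reproducible tree out.
import random

def grow_branch(rng, depth, length):
    """Recursively grow a branch; returns a nested dict of segments."""
    if depth == 0 or length < 0.05:
        # Terminal twig: attach a random number of leaves.
        return {"length": length, "leaves": rng.randint(2, 6), "children": []}
    children = [
        grow_branch(rng, depth - 1, length * rng.uniform(0.5, 0.8))
        for _ in range(rng.randint(2, 4))
    ]
    return {"length": length, "leaves": 0, "children": children}

def generate_tree(seed, depth=5):
    """Seeding the RNG makes each unique tree fully reproducible."""
    return grow_branch(random.Random(seed), depth, length=1.0)

def count_leaves(node):
    return node["leaves"] + sum(count_leaves(c) for c in node["children"])

# A forest where no two trees are identical, yet nothing is stored but seeds.
for seed in range(3):
    print(f"tree {seed}: {count_leaves(generate_tree(seed))} leaves")
```

Storing a seed instead of a mesh is exactly why this kind of variety is nearly free: the expensive part becomes rendering all of it, not authoring it.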
Ray traced lighting is going to be the standard as well. When developers are finally able to use it globally, the difference will be very noticeable. Once that becomes standard, legacy GPUs will be out in the cold. Of all the recent hype, ray tracing is the part I like best: it makes a real difference.
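Since "ray traced lighting" gets thrown around a lot, here's a toy Python sketch of the core trick--a single hard-coded sphere, ground plane, and point light, with every name invented for illustration; this is not any engine's API. For each visible point you fire a second ray at the light, and whether that shadow ray is blocked decides if the point is lit:

```python
# Minimal ray-traced direct lighting: primary ray finds the surface,
# shadow ray toward the light decides lit vs. shadowed.
import math

def norm(v):
    l = math.sqrt(sum(x * x for x in v))
    return [x / l for x in v]

def sub(a, b): return [x - y for x, y in zip(a, b)]
def dot(a, b): return sum(x * y for x, y in zip(a, b))

SPHERE_C, SPHERE_R = [0.0, 0.0, -3.0], 1.0
LIGHT = [2.0, 3.0, -1.0]

def hit_sphere(o, d):
    """Nearest positive hit distance along unit-length ray d, or None."""
    oc = sub(o, SPHERE_C)
    b = 2 * dot(d, oc)
    c = dot(oc, oc) - SPHERE_R ** 2
    disc = b * b - 4 * c  # a == 1 because d is normalized
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 1e-3 else None

def hit_plane(o, d):
    """Ground plane y = -1."""
    if abs(d[1]) < 1e-6:
        return None
    t = (-1 - o[1]) / d[1]
    return t if t > 1e-3 else None

def shade(o, d):
    ts, tp = hit_sphere(o, d), hit_plane(o, d)
    if ts is not None and (tp is None or ts < tp):
        t, is_sphere = ts, True
    elif tp is not None:
        t, is_sphere = tp, False
    else:
        return " "  # sky
    p = [o[i] + t * d[i] for i in range(3)]
    n = norm(sub(p, SPHERE_C)) if is_sphere else [0.0, 1.0, 0.0]
    to_light = norm(sub(LIGHT, p))
    # Shadow ray: nudged off the surface, fired at the light.
    shadow_origin = [p[i] + 1e-3 * n[i] for i in range(3)]
    if hit_sphere(shadow_origin, to_light) is not None:
        return "."  # occluded: in shadow
    return " .:-=+*#"[max(0, min(7, int(dot(n, to_light) * 8)))]

W, H = 60, 24
for j in range(H):
    print("".join(
        shade([0.0, 0.0, 0.0],
              norm([(i / W) * 2 - 1, (1 - (j / H) * 2) * 0.6, -1.0]))
        for i in range(W)))
```

That shadow-ray step is the part rasterizers have to approximate with shadow maps and other tricks; doing it per pixel, for the whole scene, is exactly the workload that will leave old cards behind.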
Finally, we still haven't solved the problem of delivering sharp images without shimmering, and one of the best ways to fight aliasing is supersampling and downscaling. Rendering at a very high resolution and AI-downscaling outperforms blurry anti-aliasing and multisample techniques, particularly when the viewport is in motion. Whether or not the textures are optimised for the initial hidden super-resolution render, you'll still get better results than anti-aliasing or multisample blurriness.
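To make that concrete, here's a minimal, dependency-free Python sketch of supersampling plus downscale. The test pattern, function names, and the plain box filter are all mine for illustration; a real engine renders a full frame buffer at the higher resolution, and the "AI" part replaces the simple average below with a learned downscale filter:

```python
# SSAA sketch: take k*k samples per pixel, then average them down.
# One sample per pixel aliases on high-frequency content; averaging
# many subpixel samples approximates the correctly filtered image.
import math

def scene(x, y):
    """A zone-plate style pattern whose frequency rises toward the far
    corner--exactly where single-sample rendering starts to shimmer."""
    return 1.0 if math.sin((x * x + y * y) * 150.0) > 0 else 0.0

def render(width, height, samples_per_axis):
    """samples_per_axis=1 is naive rendering; 4 is 16x supersampling."""
    k = samples_per_axis
    rows = []
    for j in range(height):
        row = []
        for i in range(width):
            total = 0.0
            for sj in range(k):          # regular subpixel grid
                for si in range(k):
                    x = (i + (si + 0.5) / k) / width
                    y = (j + (sj + 0.5) / k) / height
                    total += scene(x, y)
            row.append(total / (k * k))  # box-filter downscale
        rows.append(row)
    return rows

def show(img):
    ramp = " .:-=+*#%@"
    for row in img:
        print("".join(ramp[min(9, int(v * 10))] for v in row))

print("1 sample per pixel (aliased):")
show(render(64, 20, 1))
print("\n16 samples per pixel, averaged down (SSAA):")
show(render(64, 20, 4))
```

The single-sample image breaks into noisy garbage where the pattern gets fine; the supersampled one settles into smooth gray, which is the correct band-limited result. Swap the box filter for a trained network and you have the modern "render high, downscale smart" approach.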
Also, games and rendering APIs will eventually use resources simply because they're there--eliminating the ability to dial visual fidelity down far enough to run new games on legacy GPU hardware. I agree diminishing returns have been in play for some time already, but I don't think it matters much. The cycle of incremental improvement will continue. At some point, it's just a question of whether or not new software will still run on your rig.
There's really no escape if you want to keep gaming. You've been able to get by for a long time on a flagship card like the GTX 1080 Ti, but the party is almost over. You'll eventually have to upgrade or buy a console.
Edit: And, just like that, I see Nvidia has announced a milestone in their budding ML game-asset-creation software--specifically designed to feed assets directly into existing development pipelines and workflows. People will create concepts, specify the mood of a scene, and define the direction of the art; the ML will work around the clock putting things together. A lot of mundane asset work will be automated.