"Until 4K 120Hz is possible and affordable and playable on a low-end GPU with good graphical settings, we're not at the good-enough phase."

Definitely not. We still have to use all kinds of ugly tricks to make games run on modern hardware; lighting, for example, relies on a mix of dynamic effects and pre-baked maps (see the sketch below). And higher resolutions (4K) will become mainstream over the next few years.
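Roughly, that mix looks like the following. This is a minimal Python sketch of the idea, not any real engine's code; every name in it is made up for illustration. The static part is just a texture lookup into data computed offline, while the dynamic part is re-evaluated every frame:

```python
# Sketch of the baked + dynamic lighting mix described above.
# Hypothetical names throughout; a simple Lambertian model is assumed.

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def shade_pixel(lightmap_texel, albedo, normal, dynamic_lights):
    """Combine pre-baked and dynamic lighting for one surface point."""
    # Pre-baked contribution: computed offline, a cheap lookup at runtime.
    baked = lightmap_texel  # (r, g, b) irradiance stored in the lightmap

    # Dynamic contribution: evaluated per frame for each active light.
    dynamic = [0.0, 0.0, 0.0]
    for light in dynamic_lights:  # each: {"direction": (x,y,z), "color": (r,g,b)}
        n_dot_l = max(0.0, dot(normal, light["direction"]))
        for i in range(3):
            dynamic[i] += light["color"][i] * n_dot_l

    # Final color: surface albedo modulated by total incoming light.
    return tuple(albedo[i] * (baked[i] + dynamic[i]) for i in range(3))

# Example: one baked texel plus one overhead dynamic light.
print(shade_pixel((0.2, 0.2, 0.25), (1.0, 0.5, 0.5), (0.0, 1.0, 0.0),
                  [{"direction": (0.0, 1.0, 0.0), "color": (0.8, 0.8, 0.7)}]))
```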
"I would probably be dead or too old. Damn, that sucks. I wish Moore's law lasted till I die."

5-10 years to get dual 16K at 120Hz+? 10 years ago most people here had 1600x1200 at 85Hz, 1280x1024 at 60Hz, 1680x1050 at 60Hz, or at the most 1920x1200 at 60Hz. Today we are only just barely getting 2560x1440 at 144Hz and 3840x2160 at 60Hz. Dual 16K is 32 times the pixels of 2160p. If you want it at 120Hz+, and you do, then double that number again for the hardware intensity needed. So we have to go roughly 64 times up from today. Even if we pretend everyone 10 years ago played at 1280x1024, going from that to 2160p is only a 6.3x increase in pixels, at the same refresh rate. (Quick sanity check of the numbers below.)
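To sanity-check that arithmetic, here is a quick throughput comparison in Python, assuming "16K" means 15360x8640 (that definition is an assumption, not from the post):

```python
# Pixels pushed per second at each setup mentioned above.
def px(w, h, hz=60):
    return w * h * hz

today_4k = px(3840, 2160, 60)
dual_16k = 2 * px(15360, 8640, 120)  # assuming 16K = 15360x8640
old_sxga = px(1280, 1024, 60)

print(dual_16k / today_4k)   # 64.0  -> ~64x today's 4K60
print(today_4k / old_sxga)   # ~6.33 -> ~6.3x the 1280x1024 of 10 years ago
```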
Decades.
"I would probably be dead or too old. Damn, that sucks. I wish Moore's law lasted till I die."

I hope you never die then.
Ask yourself this: can games run CGI-level graphics + physics yet? No? Then it's certainly not good enough.
I think most of you are missing the point the OP was asking about. He's asking if GPUs have reached the point of good enough based on current software and the continually slowing pace of its improvements. CPUs reached good enough years ago; the average user has no real need to upgrade anymore. GPUs are sprinting in that direction as well, if they haven't already reached that point. 10 years ago, if you bought a top-end GPU at release, it was superseded within 6 months by a new arch or refresh, and within a couple of years it was a paperweight. The industry as a whole has slowed way down, and the pace of advancement is down to a crawl. It took AMD over a year and a half to replace the 290X with the marginally faster 390X refresh, and then Fury a bit later.
For the average user, who at best is using 1080p, I believe GPUs have reached good enough, and that will continue until the masses eventually move to 2160p.
Pics
Good luck trying to find a game developer to achieve that in a commercially viable game. Until that happens, whatever tech demo showcase #182798172 comes along, it's not going to matter in the real world.
Even if 1080p is a permanent standard, GPUs have an extremely long way to go before real-time path-tracing effects (such as global illumination, accurate glass, etc.) become feasible.
This takes brute force, lots and lots of brute force, potentially in the petaflops range for 60 fps, let alone 120+ fps (rough estimate below).
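A loose back-of-envelope in Python, where the sample count, path depth, and per-ray cost are all rough assumptions for illustration rather than measured figures:

```python
# Very rough estimate of real-time path-tracing cost at 1080p60.
pixels        = 1920 * 1080   # 1080p frame
samples_px    = 1000          # samples per pixel for low-noise output (assumed)
bounces       = 4             # path depth per sample (assumed)
flops_per_ray = 5000          # cost per ray: traversal + shading (assumed)
fps           = 60

flops = pixels * samples_px * bounces * flops_per_ray * fps
print(f"{flops / 1e15:.1f} PFLOPS")  # ~2.5 PFLOPS under these assumptions
```

Even with generous assumptions, the total lands in the petaflops range, which is the point being made above.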
"Are there any games available today that cannot be played at visual settings acceptable to the average gamer at 1080p using a midrange GPU?"

Not to my eye. Once you get into CGI-type stuff, you can pick out flaws everywhere. One of those things you can't un-see.