They've literally only ever gone up.
When more compute is available, developers use it, either to add features or to skip optimizing the code and just let the processor do the heavy lifting.
Now and then some outlier applications do become more computationally efficient. PureRaw is one that comes to mind, but I'm having a hard time thinking of another!
If compute "topped out" for one reason or another, then we'd see developers really get to work figuring out ways to increase both features and speed while keeping computational overhead constant or decreasing it. A great example of that would be the old Atari 2600 game console. The unit was so ubiquitous that even when it was far outclassed by newer hardware, developers were racking their brains to create games that were somewhat competitive with what ran on that newer hardware, just because the user base, and therefore the possible sales base, was so large. It was worth the effort to work out all sorts of clever tricks to max out the hardware.
Due to the limits of the human eye, I think we are seeing a similar effect with TVs and displays. The move from SD (interlaced, no less!) to 720p or 1080p was a massive increase in visual fidelity. While the move from 1080p to 4K is a comparably large jump in raw pixel count (4x), the visible results are much less pronounced for basically two reasons. First, depending on screen size and viewing distance, we are reaching the limits of human vision, especially in the real world where many people aren't even corrected to 20/20. Second, it takes really good cameras, lighting, lenses, operators, and post production to actually make each one of those 4K pixels count.
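To put rough numbers on that, here's a back-of-the-envelope sketch. The 65" screen, 9 ft viewing distance, and the conventional ~1 arcminute (~60 pixels per degree) figure for 20/20 acuity are illustrative assumptions on my part, not measurements:

```python
import math

# Nominal resolutions for common formats: (width, height) in pixels.
FORMATS = {
    "SD (480i)": (720, 480),
    "720p":      (1280, 720),
    "1080p":     (1920, 1080),
    "4K UHD":    (3840, 2160),
}

def pixel_count(fmt):
    w, h = FORMATS[fmt]
    return w * h

def pixels_per_degree(diagonal_inches, fmt, viewing_distance_inches):
    """Horizontal pixels packed into one degree of visual angle
    for a 16:9 panel of the given diagonal, viewed head-on."""
    w_px, _ = FORMATS[fmt]
    # Physical width of a 16:9 panel from its diagonal.
    width_in = diagonal_inches * 16 / math.hypot(16, 9)
    px_per_inch = w_px / width_in
    # Size of one degree of visual angle at the screen plane, in inches.
    inches_per_degree = 2 * viewing_distance_inches * math.tan(math.radians(0.5))
    return px_per_inch * inches_per_degree

# Pixel-count jumps between formats.
for a, b in [("SD (480i)", "1080p"), ("1080p", "4K UHD")]:
    print(f"{a} -> {b}: {pixel_count(b) / pixel_count(a):.1f}x the pixels")

# 20/20 acuity is conventionally taken as ~60 pixels per degree
# (1 arcminute per pixel). At a typical couch distance, 1080p on a
# 65" panel already sits near that limit, and 4K blows well past it.
for fmt in ("1080p", "4K UHD"):
    ppd = pixels_per_degree(65, fmt, 9 * 12)
    print(f'65" {fmt} at 9 ft: {ppd:.0f} pixels/degree (20/20 limit ~ 60)')
```

By that rough math, a 65" 1080p panel at couch distance is already around the acuity limit, so most of 4K's extra pixels are simply too small for the eye to resolve from that far away.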
I have my doubts that 8K will ever become a mainstream consumer format. Professional format? Absolutely: working at a higher resolution gives you far more latitude to crop and reframe while preserving quality when delivering at 4K in post. An 8K frame, for instance, can be cropped down to a quarter of its area and still be a full-resolution 4K image.
I think where we are going to see the big push for higher resolution is with monitors and gaming. What happens if desktop resolution stalls at 4K? Eventually GPUs capable of gaming at 4K "slide down the stack," possibly to the point where even the lower-end cards are capable of decent 4K. Sure, developers will find ways to burn compute cycles on effects that are barely or not at all visible, at the request of GPU manufacturers, but consumers are pretty smart. They'll see the lack of difference with their own eyes, turn those effects off, and happily game away.
Anyway, my point is that the push for more compute is complicated, and driven by market forces as much as or more than by consumer demand.