Question GPU performance out-growing CPU performance; GPUs to cause more CPU "bottlenecks" not less, in future? [DanielOwen]

VirtualLarry

No Lifer
Aug 25, 2001
56,446
10,114
126

He's got some charts and graphs to back up his supposition. BTW, he mentions he's a math teacher.

Talks about DLSS 3.0 and frame-generation too, as a use-case that NVidia implemented based on "excessive" GPU performance in a system.
 
Last edited:

TheELF

Diamond Member
Dec 22, 2012
3,990
744
126
Irrelevant, you do not need 100% of your CPU to drive 100% of your GPU...



DirectX up through 11 used a single core to send all of the data to the GPU; DX12, Vulkan, and so on now spread much less than one core's worth of compute over multiple cores.
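As a rough sketch of that difference (plain C++ with std::thread; recordDrawCalls() is a hypothetical stand-in for real ID3D12GraphicsCommandList / VkCommandBuffer recording, not actual API calls): the old model funnels every draw through one thread, while the newer APIs let the engine record command lists on several worker threads and submit them together.

```cpp
// Conceptual sketch only: recordDrawCalls() stands in for the per-thread
// command list recording that DX12/Vulkan allow; it is NOT real D3D12/Vulkan.
#include <thread>
#include <vector>
#include <cstdio>

struct CommandList { std::vector<int> draws; };

// Hypothetical per-chunk recording work (cheap CPU-side bookkeeping).
CommandList recordDrawCalls(int first, int count) {
    CommandList cl;
    for (int i = 0; i < count; ++i) cl.draws.push_back(first + i);
    return cl;
}

int main() {
    const int totalDraws = 10000, workers = 4;
    std::vector<CommandList> lists(workers);
    std::vector<std::thread> pool;

    // DX12/Vulkan style: each worker records its own slice of the frame.
    for (int w = 0; w < workers; ++w)
        pool.emplace_back([&, w] {
            lists[w] = recordDrawCalls(w * totalDraws / workers,
                                       totalDraws / workers);
        });
    for (auto& t : pool) t.join();

    // Submitting the pre-recorded lists is then one cheap step, instead of
    // a single thread serializing every draw call the way DX11 drivers did.
    int submitted = 0;
    for (auto& cl : lists) submitted += (int)cl.draws.size();
    std::printf("submitted %d draws from %d threads\n", submitted, workers);
}
```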
 

coercitiv

Diamond Member
Jan 24, 2014
6,384
12,803
136
Talks about DLSS 3.0 and frame-generation too, as a use-case that NVidia implemented based on "excessive" GPU performance in a system.
This video should have stayed a shower thought. The data he uses for the GPU graph, the one that scales better than the "CPU gaming perf", is obtained using both GPU & CPU. This alone throws a wrench in the whole argument.

Moreover, some GPU architectures are inherently more CPU limited, because they introduce a "dual" CPU load: it comes not only from higher FPS but also from driver overhead, the famous "software scheduler". Here's a "slower" card beating a "faster" card because of different scaling in a CPU-limited scenario. This shows that one GPU arch is piggybacking on CPU performance - which I don't call out as a bad thing per se, but it should be heavily considered when having thoughts about GPU vs. CPU scaling.
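A back-of-the-envelope illustration of that "dual" CPU load, with completely made-up numbers (plain C++, nothing measured): in a CPU-limited scene the achievable frame rate is capped by game CPU time plus the driver's own per-frame cost, so a card with a heavier CPU-side scheduler hits the wall earlier.

```cpp
// Hypothetical numbers only: shows how per-frame driver CPU cost caps FPS
// in a CPU-limited scenario.
#include <cstdio>

int main() {
    const double gameCpuMs = 5.0;      // assumed game-side CPU work per frame
    const double lightDriverMs = 1.0;  // card A: thin driver / hardware scheduling
    const double heavyDriverMs = 3.0;  // card B: heavier CPU-side "software scheduler"

    double fpsA = 1000.0 / (gameCpuMs + lightDriverMs);
    double fpsB = 1000.0 / (gameCpuMs + heavyDriverMs);

    // Card B may be "faster" in GPU-limited tests, yet card A wins here
    // because less of the CPU budget is spent feeding the GPU.
    std::printf("CPU-limited cap, card A: %.0f fps\n", fpsA);  // ~167 fps
    std::printf("CPU-limited cap, card B: %.0f fps\n", fpsB);  // ~125 fps
}
```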

Want more? Ray tracing. GPUs aren't really ready for it, but we get it anyway because sales. And the work is done entirely on the GPU, right? Nope, CPU workload increases significantly when enabling RT. The PC is currently missing important hardware accelerators that would make the CPU's work a lot easier. The only good news is DirectStorage, which will relieve the CPU of work in some types of games.

Last but not least, the most bitter of reminders: GPU perf/dollar increase has been declared dead by Nvidia. According to them, the days of easy GPU scaling are over. Therefore, guesstimating that GPU gaming performance will keep growing on the same exponential curve for the foreseeable future is extremely optimistic.
 
Last edited:

Mopetar

Diamond Member
Jan 31, 2011
8,004
6,446
136
I watched Daniel for a short time.

I feel he is a crackpot. His videos seem very geared to pull views.

All video creators are trying to gain views. I guess this is a side thing for him, but if he gets big enough it makes him money or lets him turn it into a full-time job.

I've only seen a few of his videos, but at least he backs everything up with his own data and shows enough about how it was collected so that it could be reproduced or that if there are flaws they can be pointed out.

Last but not least, the most bitter of reminders: GPU perf/dollar increase has been declared dead by Nvidia. According to them, the days of easy GPU scaling are over.

I wouldn't believe someone who has a strong financial incentive to tell you that so you'll happily eat the increased prices.

Zen 4 with v-cache is likely going to unlock a lot of additional performance in several titles. Games moving to newer engines in the next few years will also shift a lot of load back towards the GPU.

Eventually we'll see 8K gaming become a thing and suddenly we'll see cards struggling to hit the acceptable frame rates that people have grown accustomed to. Never mind that we're now in an era where you don't need two top cards in SLI/xFire to be able to get 60 FPS in new titles.
 

Mopetar

Diamond Member
Jan 31, 2011
8,004
6,446
136
We both know it's only half a lie, that's why Jensen figured he'd be able to get away with it.

I'd say it's maybe 15% true at most. Calling it a half truth is being far too generous.

Just because something is believable doesn't make it any more or less true. He knows he'll get away with it because it's plausible and customers will prefer believing that to the actual truth.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Zen 4 with v-cache is likely going to unlock a lot of additional performance in several titles.

Zen 4 with V-cache will change the landscape no more than Zen 3 with V-cache did.

This is an endless debate just like the one about iGPUs taking over dGPUs. iGPUs have been "killing" dGPUs for 30 years now. Or that this is the Year of Linux.

Ray tracing alone changes the picture altogether. And if @coercitiv is right, introducing more features to take work away from the CPU results in less pure performance gain, because specialization is faster.

Another endless game. CPU vs GPU. I was in High School when Voodoo came out. Now I am almost 40. The debate has been going on since then.
 

Borealis7

Platinum Member
Oct 19, 2006
2,914
205
106
There is another factor to consider: GPU processing power is increasing each generation, but no new monitor resolutions are being introduced. Thus we will have "spare" processing power on high-end GPUs for some time, until 8K or whatever comes along to drive the need for more processing power. RT will become a "solved" problem just like 1080p rendering is now, compared to how it used to be a few years ago, and I believe 1080p is still the most common resolution in the world.
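For a sense of the headroom involved, here's the quick pixel-count math (simple C++ arithmetic, no benchmark data): doubling the linear resolution quadruples the pixels to shade per frame.

```cpp
// Pixel counts per frame for common resolutions, relative to 1080p.
#include <cstdio>

int main() {
    struct Res { const char* name; int w, h; };
    const Res r[] = { {"1080p", 1920, 1080}, {"1440p", 2560, 1440},
                      {"4K",    3840, 2160}, {"8K",    7680, 4320} };
    const double base = 1920.0 * 1080.0;
    for (const auto& x : r) {
        double px = double(x.w) * x.h;
        std::printf("%-6s %5.1f Mpixels  (%.1fx 1080p)\n",
                    x.name, px / 1e6, px / base);
    }
    // 8K works out to ~33 Mpixels: 16x the pixels of 1080p and 4x 4K.
}
```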
 

VirtualLarry

No Lifer
Aug 25, 2001
56,446
10,114
126
GPU performance is increasing too fast... ahahahahahahhhahaha!
Well, I guess performance/watt has not, then.

Desktop GPUs, if you remember, recently introduced a new 12+4-pin power connector due to the increasing power demands of the newest crop of GPUs. So their TDP and physical size are increasing.

Laptops have more of a fixed, limited TDP window, so it's not surprising that performance under those conditions hasn't been increasing much from generation to generation lately.
 

coercitiv

Diamond Member
Jan 24, 2014
6,384
12,803
136
Laptops have more of a fixed, limited TDP window, so it's not surprising that performance under those conditions hasn't been increasing much from generation to generation lately.
Nah, it's far simpler than that:

3070Ti Mobile is using GA104 @ 392mm2
4070 Mobile is using AD106 @ 190mm2

They're essentially selling a slightly improved shrink that lacks the execution units to be even more energy efficient at this performance target. The selling proposition is entirely centered around Frame Generation.
 
Reactions: Tlh97 and Saylick

blckgrffn

Diamond Member
May 1, 2003
9,197
3,182
136
www.teamjuchems.com
Nah, it's far simpler than that:

3070Ti Mobile is using GA104 @ 392mm2
4070 Mobile is using AD106 @ 190mm2

They're essentially selling a slightly improved shrink that lacks the execution units to be even more energy efficient at this performance target. The selling proposition is entirely centered around Frame Generation.

Sad emoji response needed.

I've been disappointed with the number of laptops that could just have a 680M in them, and they shove some stupid 3050 or similar in there with it. No. I don't want hybrid graphics again, especially with two different vendors.

The days of dGPUs in laptops at or below the 1650 level should be dead. Both Intel and AMD have solutions to that problem that make way too much sense. The extra chip and TDP overhead is silly - if you want a "real" dGPU, the laptop should be sized accordingly.

And yeah, yeah, why should nvidia care, etc. I would think they would care because of how large that segment is/will be.

My $.02.
 
Reactions: Tlh97

leoneazzurro

Golden Member
Jul 26, 2016
1,005
1,598
136
Nah, it's far simpler than that:

3070Ti Mobile is using GA104 @ 392mm2
4070 Mobile is using AD106 @ 190mm2

They're essentially selling a slightly improved shrink that lacks the execution units to be even more energy efficient at this performance target. The selling proposition is entirely centered around Frame Generation.

Which is exactly what I foresaw looking at the specs, but some people believe in miracles instead of physics. The 4080 Mobile (while not deserving the name) is good enough, and the 4090 Mobile is too power-limited and not worth the price increase over the 4080.
 

pj-

Senior member
May 5, 2015
481
249
116
There is another factor to consider: GPU processing power is increasing each generation, but no new monitor resolutions are being introduced. Thus we will have "spare" processing power on high-end GPUs for some time, until 8K or whatever comes along to drive the need for more processing power. RT will become a "solved" problem just like 1080p rendering is now, compared to how it used to be a few years ago, and I believe 1080p is still the most common resolution in the world.

I don't think it's inevitable that monitors and TVs (that people actually buy) will continue increasing in resolution just because it's possible. The resolution wars on phones died out a long time ago, and screens have even regressed back to around 1080p from the silly peak of 4K.

I doubt 8K will ever be common; it just doesn't seem necessary. There's also an upcoming limit on the impact of higher refresh rates, if we haven't already hit it at 240-360 Hz.
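The diminishing-returns math on refresh rate is easy to sanity-check (plain C++, just frame-time arithmetic): each jump in Hz saves fewer absolute milliseconds per frame.

```cpp
// Frame-time budget at various refresh rates and the absolute time saved
// by each step up; illustrates why gains shrink past ~240 Hz.
#include <cstdio>

int main() {
    const int rates[] = { 60, 120, 240, 360, 480 };
    double prev = 0.0;
    for (int hz : rates) {
        double ms = 1000.0 / hz;
        if (prev > 0.0)
            std::printf("%3d Hz: %5.2f ms/frame (saves %.2f ms vs previous step)\n",
                        hz, ms, prev - ms);
        else
            std::printf("%3d Hz: %5.2f ms/frame\n", hz, ms);
        prev = ms;
    }
    // Going 240 -> 360 Hz only trims ~1.4 ms, versus ~8.3 ms going 60 -> 120 Hz.
}
```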
 

TheELF

Diamond Member
Dec 22, 2012
3,990
744
126
I doubt 8K will ever be common; it just doesn't seem necessary. There's also an upcoming limit on the impact of higher refresh rates, if we haven't already hit it at 240-360 Hz.
At some point it will become so cheap that it will make no economic sense for manufacturers to make anything lower anymore.
That's far away in the future, but it is included in 'ever'.
Flat screens weren't necessary compared to CRTs, they were just better; 8K won't be necessary either, it will also just be better.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,692
136
Or that this is the Year of Linux.

Just a quick OT comment on that.

It came and went, and (almost) nobody noticed. Just not in the form most expected.

One word: Android. Let's face it, for most people their phone has become their personal computer. A good chunk of those run Android/Linux. Linux also powers a great deal of infrastructure. For example, even my router runs the Linux kernel.
 

Mopetar

Diamond Member
Jan 31, 2011
8,004
6,446
136
Just a quick OT comment on that.

It came and went, and (almost) nobody noticed. Just not in the form most expected.

One word; Android. Lets face it, for most people, their phone has become their personal computer. A good chunk of those run Android/Linux. Linux also powers a great deal of infrastructure. F.x. even my router runs the Linux kernel.

The original was the Year of the Linux Desktop, with the idea being that Linux would take off in the consumer market. Even when the notion first came into greater Internet consciousness, Linux was already pretty popular in the server market, and it was the geeks using it in that capacity who wondered when it would supplant Microsoft in the consumer space.

Phones are generally good enough to be a desktop replacement for anyone who just wants to consume media or do a few light tasks on top of what they could accomplish even with a dumb phone, but I can't see being able to do even 10% of my job with only a smart phone.

The year of the Linux desktop may someday occur, but I think that will only happen if desktops become so unpopular that the only people who use them are Linux users.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
The year of the Linux desktop may someday occur, but I think that will only happen if desktops become so unpopular that the only people who use them are Linux users.

Tech people are, in a way, the dumbest people in the world. They are so brilliant in many ways, but their brilliance turns them into aliens.

No one wants a hammer that only works with certain types of nails. If it doesn't work for one application and that's the one relevant for a particular user, you throw it away. Price, weight, grip, does not matter.

Windows is compatible with everything. No caveats. Yeah, maybe Linux should have been that OS. But it's not. 95% or so of people use Windows on their PCs, and for a reason. The Linux zealots are those aliens.

When I used Ubuntu, my Victron solar charge controller app said something about how it wasn't guaranteed to work. It did work, but such things really suck. There are many unmentioned applications that just work on Windows. If I feel nostalgic and want to install SimCity 4, it just works. The original Age of Empires demo? Sure. The Windows team should be commended for that. And they are, by the market.

The ticket system for the SkyTrain in the Vancouver region uses the embedded version of Windows, by the way. So many modern cash registers and big screens all run on Windows. Windows' presence in the consumer space is much, much larger than you think. That's all thanks to familiarity and rock-solid compatibility that spans over three decades!

Also, you need to take out the command console, or at least warn users not to use it until they are ready. It's not foolproof. Android is locked down to the point that you lose most of what Linux is. I sometimes get pissed off at the number of commands I need to learn for my tasks.

Forget desktops, you need laptop users too. This battle will never result in Linux winning. PCs will exist alongside smartphones, Windows will exist alongside Linux, and Linux will keep that 5%. Only macro conditions will change that, such as, say, Microsoft going bankrupt and this war turning into a world war.
 
Last edited:
Reactions: Joe NYC

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,794
21,529
146
The discussion becoming about Linux is hilarious. In this case, why not? After all, the topic of the discussion seems silly. Also, sometimes shooting the messenger is appropriate, fallacy or not.

Since getting an Nvidia handler, Daniel-San has been trying to come up with clever ways to pimp their tech without alienating the AMD and Intel users. "Hey guyz! Frame generation is actually a good thing! Your CPU is too slow to keep up with these Nvidia Monster GPUs."

Or maybe devs could leverage more threads, and optimize the games better? Instead of releasing broken trash that seems intentionally designed to showcase Nvidia's latest tech.
 

blckgrffn

Diamond Member
May 1, 2003
9,197
3,182
136
www.teamjuchems.com
Or maybe devs could leverage more threads, and optimize the games better? Instead of releasing broken trash that seems intentionally designed to showcase Nvidia's latest tech.

That costs money in the form of time and effort, which is the worst resource sinkhole of all. It's been interesting following some of the crews I backed on Kickstarter and therefore had more transparency into (the good ones), like the one who made Pillars of Eternity and Wastelands: they had to boot up new titles before others had even launched, to avoid laying off staff who would otherwise be idle, since game creation is a multi-stage pipeline.

It's likely the core engine experts who did the heavy lifting in setting up the content framework and requirements at the beginning are well off to other projects by the time a game launches, and the core tech, with all its drawbacks, was set nearly in stone years ahead of time. Then the lipstick-on-a-pig performance uplifts from DLSS & FSR look miiiiiighty attractive, as they help your unoptimized mess run acceptably on a wider range of hardware. And you ship it.

I am pretty sure I brought this up as a con of DLSS back in the day, and FSR too: that over time whatever performance lift they brought to optimized games would get swallowed by future titles that simply took them for granted and used them to run acceptably. The real benefit? Games can be made more cheaply? Yay.

"Progress."
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,794
21,529
146
That costs money in the form of time and effort, which is the worst resource sinkhole of all. It's been interesting following some of the crews I backed on Kickstarter and therefore had more transparency into (the good ones), like the one who made Pillars of Eternity and Wastelands: they had to boot up new titles before others had even launched, to avoid laying off staff who would otherwise be idle, since game creation is a multi-stage pipeline.

It's likely the core engine experts who did the heavy lifting in setting up the content framework and requirements at the beginning are well off to other projects by the time a game launches, and the core tech, with all its drawbacks, was set nearly in stone years ahead of time. Then the lipstick-on-a-pig performance uplifts from DLSS & FSR look miiiiiighty attractive, as they help your unoptimized mess run acceptably on a wider range of hardware. And you ship it.

I am pretty sure I brought this up as a con of DLSS back in the day, and FSR too: that over time whatever performance lift they brought to optimized games would get swallowed by future titles that simply took them for granted and used them to run acceptably. The real benefit? Games can be made more cheaply? Yay.

"Progress."
Yeah yeah yeah, the old shovelware effect. Likely compounded by the other now-infamous one: the console effect. But neither DLSS 2.x nor FSR helps smooth things out in Hogsmeade the way FG does when maxed out with RT. At least in the benchmark vids I have watched, anyway. I don't own the game. Isn't that the kind of thing Danny boy is talking about? I should shut up since I won't even watch the vid, LOL!
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,262
5,257
136
I watched Daniel for a short time.

I feel he is a crackpot. His videos seem very geared to pull views.

Not sure which channel this is. I just see a black screen, because it's one of the channels I blocktubed (love this extension). I usually block them for posting nonsense clickbait (like MLID).
 
Last edited:

Mopetar

Diamond Member
Jan 31, 2011
8,004
6,446
136
Not sure which channel this is. I just see a black screen, because it's one of the channels I blocktubed (love this extension). I usually block them for posting nonsense clickbait (like MLID).

I don't know if he also does rumor videos, but the only stuff of his I've seen has been card comparisons showing off in-game benchmark results for various games/settings and a few "what should I buy?" style videos discussing what's a good value product at the moment.

I wouldn't put him anywhere close to MLID.
 