I hope Vega launches soon, my R9 290 is starting to artifact
Time to walk back the overclock?
Please hold on <3
For f***'s sake I hope AMD gets the voltage dialed in right this time. There's no need for another RX 580 situation, where most chips can be undervolted by about 0.1 V or more, with a corresponding 20-30 W (or larger) power drop, and can therefore run at top boost clocks all the time with efficiency nicely going up... save for the balls-to-the-wall 1450 MHz out-of-the-box models that actually need the voltage. I understand this is done to maximize usable chips at a given frequency, but come on. It hurts them more than it helps when it's review time.
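Back-of-the-envelope on why ~0.1 V is worth that much power (the voltages and the core-power figure below are assumed ballpark numbers for illustration, not measurements): dynamic power scales roughly with V² at a fixed clock, so a small undervolt buys an outsized saving.

```python
# Dynamic (switching) power scales roughly with V^2 at a fixed clock, so a
# small undervolt buys a disproportionate power saving.
# All numbers below are illustrative assumptions, not measurements.

v_stock = 1.15        # assumed stock core voltage (V)
v_undervolt = 1.05    # assumed stable undervolt (V), ~0.1 V lower
p_core_stock = 150.0  # assumed GPU-core share of board power at stock (W)

# P_dynamic ~ C * V^2 * f  ->  at the same clock, power scales with (V_new / V_old)^2
scale = (v_undervolt / v_stock) ** 2
p_core_undervolted = p_core_stock * scale

print(f"Scaling factor:     {scale:.2f}")                                # ~0.83
print(f"Estimated core pwr: {p_core_undervolted:.0f} W")                 # ~125 W
print(f"Estimated saving:   {p_core_stock - p_core_undervolted:.0f} W")  # ~25 W
```

Leakage drops with voltage too, so the real-world saving can be a bit bigger than the V² term alone suggests.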
Yeah AMD has been terrible about that. Hawaii, Fiji and Polaris are all overvolted out of the box and it really kills their efficiency.
Time to walk back the overclock?
Quake Champions + Vega will definitely move some product... if the price and performance are right. I'm really hoping they pull a Ryzen on NVIDIA.
$349 for 1080FE performance would be sweet, $499 for 1080Ti perf.
(let me dream)
The difference is that Nvidia didn't sit on its *** like Intel.
I don't agree with Nvidia prices, but I do have to say that Nvidia isn't slacking.
Intel wasn't exactly sitting on its petard. They've more or less hit the wall in thermal design and, well, "no one told them it wasn't easy!" to move from 22, to 14, and to 10nm. At the very least, they are far closer to physical limits in their designs than nVidia or AMD will be any time soon on the GPU uArch side.
Suggesting Intel has 'hit the wall' in thermal design is ludicrous; please educate yourself on this subject before posting nonsense.
Intel's mainstream parts (which are far more popular/affordable than their HEDT chips) still use terrible thermal paste to transfer heat between the die and IHS. They could use solder, which would greatly improve the thermals (and thus performance) of their mainstream parts (look at the benefits of delidding 7700Ks, for example).
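Rough numbers on why the TIM matters (the thermal resistances and package power below are assumed ballpark values, not Intel specs): the temperature drop across the die-to-IHS interface is roughly power × interface thermal resistance, so cutting that resistance with solder or liquid metal shaves a lot off core temps.

```python
# The temperature drop across the die-to-IHS interface is approximately
# P * R_interface. The resistances and package power below are assumed
# ballpark figures, not published specs.

p_die = 90.0     # assumed package power under load (W), e.g. an overclocked quad core
r_paste = 0.25   # assumed die-to-IHS thermal resistance with stock paste (°C/W)
r_solder = 0.08  # assumed resistance with solder or liquid metal (°C/W)

dt_paste = p_die * r_paste    # ~22.5 °C dropped across the paste
dt_solder = p_die * r_solder  # ~7 °C dropped across solder
improvement = dt_paste - dt_solder

print(f"Interface delta-T with paste:  {dt_paste:.1f} °C")
print(f"Interface delta-T with solder: {dt_solder:.1f} °C")
print(f"Approx. core temp improvement: {improvement:.1f} °C")  # ~15 °C
```

That lands in roughly the range delidders report for the 7700K, which is why the stock paste gets so much grief.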
Yes, but go from 4 cores up to 8 cores, and see where they get even with soldering. That's a lot of extra heat.
Intel's mainstream parts feature a maximum of 4 cores, not 8. Read the post you're quoting.
The difference is that Nvidia didn't sit on its *** like Intel.
I don't agree with Nvidia prices, but I do have to say that Nvidia isn't slacking.
We can definitely conflate "sitting on its ass" with "milking the customer". To us consumers these basically amount to the same thing. nVidia has most definitely sat on its ass for a few generations by intentionally not releasing its fastest parts to market. I know the insta-line people spew is "why should they, given the lack of competition?"... but their internal incentives for doing this are a moot point when we know for a fact they do it in the open marketplace. Case in point: they released the basically unattainable $1200 Titan X when everyone and their mother knew a $700 (still very profitable) 1080 Ti was planned all along. This is them de facto sitting on their asses; it isn't born from complacency, it's born from corporate profit motives... same result.
So, I'm going to put out a bold prediction: Vega reviews tomorrow!
There was that tweet from Ryan Smith posted earlier.
nVidia released a new driver today so any online troll/fanboy/youtuber/viral marketer has the latest performance improvements for popular benchmarks available for them.
http://fudzilla.com/news/graphics/43481-nvidia-releases-geforce-381-89-game-ready-driver
A super moderator posting (probably) a silly question. Almost as if taunting us...
So there. Maybe there's a pattern here or maybe I'm just off my meds...
Intel wasn't exactly sitting on its petard.
Well, I sure hope Intel wasn't sitting on a French 16th century door breaching charge. That sounds rather dangerous. Not that I quite understand why it would own one in the first place, though.
On the CPU side, look at how little things have improved since Sandy Bridge.
Then look at the GPU side, and see how things have improved since Fermi.
For one, you can just add more cores to make GPUs faster, which means every node shrink really is a doubling of performance at the same power. GPU architecture is fundamentally different from CPU; you can't just make that comparison without considering the differences.
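Very rough sketch of the "just add more cores" point (the density and per-transistor power factors, and the example shader count and board power, are assumed ballpark numbers, not vendor figures):

```python
# If a full node shrink roughly doubles density and cuts power per transistor,
# a GPU can spend the whole budget on more shaders at a similar board power.
# The scaling factors and example figures below are assumed ballpark values.

density_gain = 2.0               # assumed transistor density vs the old node
rel_power_per_transistor = 0.55  # assumed power per transistor at the same clock

shaders = 2304       # example shader count on the old node
board_power = 180.0  # example board power on the old node (W)

new_shaders = int(shaders * density_gain)                          # same die area
new_power = board_power * density_gain * rel_power_per_transistor  # ~1.1x power

print(f"Shaders:     {shaders} -> {new_shaders} (~2x throughput at the same clock)")
print(f"Board power: {board_power:.0f} W -> ~{new_power:.0f} W")
```

Trim clocks and voltage slightly and you land back near the original board power; that's a lever CPUs don't have, because single-threaded work doesn't scale with extra cores.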
Embarrassing if true.