Even SB and IB CPUs benefit massively from increased memory bandwidth in newer games.
http://www.eurogamer.net/articles/digitalfoundry-2016-is-it-finally-time-to-upgrade-your-core-i5-2500k
You gain the same performance from raising memory speeds as you do from overclocking the processor in several cases...
It drew over 75 watts from the motherboard slot because it was drawing over 160 watts in total.
You aren't even making any sense anyway. I'm very interested in numbers, numbers that are real and measured. Since when has what Afterburner reports been taken as real, measured numbers...
Right, so I'm BSing because I won't buy a 480 to test myself, when there are tons of reviews on the internet showing Polaris 10's power draw to be north of 160w, to the point that AMD had to publicly address it. But apparently that's not proof, and some YouTube videos of Afterburner readings...
Why would I do that when all of the sites that have measured 480s from the wall have it drawing north of 160w? To the point that AMD had to release drivers to correct it? Suddenly an Afterburner TDP reading is proof to the contrary? What is wrong with you people?
http://www.anandtech.com/show/9266/amd-hbm-deep-dive/4
Here's AnandTech's own take on it. There is not even that much power usage to save: if HBM2 used zero power, it would drop Polaris's TDP by about 30w. 8% more than HBM1 = not a significant amount of power savings.
Like I said, roughly 15-17...
110w? lol. The 480 pulls north of 160w at stock speeds. HBM2 would save, at most, 15w off of that. A 1080 is around 1.8x faster than a 480 and a GP102 is 2.3x or so faster. You do the math.
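To spell that math out, here's a quick back-of-envelope script using the numbers thrown around above (~160w at the wall for a stock 480, ~1.8x for a 1080, ~2.3x for GP102, ~15w saved by HBM2). It assumes Vega gets no perf/watt improvement over Polaris beyond the memory swap, so treat it as an illustration of the extrapolation, not measured data:

```python
# Back-of-envelope extrapolation using the figures quoted in this thread.
# Assumptions: Vega has Polaris-level perf/watt, and HBM2 saves ~15 W of
# memory power. None of this is measured data for an actual Vega card.

rx480_power_w = 160.0   # stock RX 480, measured at the wall per the reviews cited
hbm2_savings_w = 15.0   # assumed saving from swapping GDDR5 for HBM2

targets = [("GTX 1080 class", 1.8), ("GP102 class", 2.3)]  # perf ratios vs the 480

for name, perf_ratio in targets:
    # Naive scaling: power grows linearly with the performance target
    est_power_w = rx480_power_w * perf_ratio - hbm2_savings_w
    print(f"{name}: ~{est_power_w:.0f} W estimated board power")
```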
If AMD pulls magic out of a hat, they'll have a GP102 competitor with about a 300w TDP. Vega was...
No, I think the small vega is going to compete with the 1070, and the big vega with the 1080. AMD doesn't have the TDP headroom to make a GP102 competitor.
The banding present on 10-bit displays is ultimately due to white-balance and gamma accuracy. The white-balance charts posted on RTINGS, for example, are just 10-point readings. While this seems like it makes sense, since most sets only have 10-point WB controls, it doesn't mean that a television's...
This is exactly what was expected, was it not? AMD releases Vega 10 with a 250w TDP and HBM2, just to compete with Nvidia's small GPU.
The sad part is that the HBM2 is probably there just to keep TDP down as much as possible.
But we're nowhere near the upper part of the CIE range with DCI-P3, which is why the impact is very noticeable.
I think you need to watch some UHD content. The Grand Tour on Amazon, for instance; you will immediately notice the wider gamut.
That's not true at all; gradients can easily be seen in 8-bit content if you go much higher than 100 nits. HDR content is mastered, or generally displayed, at 1000+ nits, which is why the 10-bit gradient is important. Content mastered and displayed with a 1000-nit ceiling would have all sorts...
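To put rough numbers on why bit depth matters more at higher brightness, here's a crude illustration. It assumes a simple linear ramp from black to peak; real HDR10 actually uses the PQ (SMPTE ST 2084) curve, so the exact figures are illustrative only, but the step-size argument is the same either way:

```python
# Crude illustration: average luminance jump between adjacent code values
# on a linear ramp. Real HDR10 uses the PQ (ST 2084) curve, which spends
# its codes more cleverly, but the 8-bit vs 10-bit gap is the point here.

def avg_step_nits(peak_nits: float, bits: int) -> float:
    """Average nits per code-value step on a linear 0-to-peak ramp."""
    return peak_nits / (2 ** bits - 1)

for peak in (100, 1000):
    for bits in (8, 10):
        step = avg_step_nits(peak, bits)
        print(f"{peak:>4} nit peak, {bits:>2}-bit: ~{step:.2f} nits per step")
```

At a 100-nit ceiling the 8-bit steps are small enough to hide; push the same 8-bit signal to a 1000-nit ceiling and each step is roughly four times the size of a 10-bit step, which is where the visible banding comes from.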
You aren't watching YouTube videos in HDR. Only the Chromecast Ultra or a Samsung TV can display YouTube HDR videos for the time being. I also have not found an Avera TV that supports HDR10 or Dolby Vision, which means that whatever it's calling "HDR" isn't the real thing.
I'm sorry, but you're smoking the finest crack available for consumption if you think you don't need AA at 1440p.
"Every title I play can do 4k at 60 fps. The problem with benchmarks is that they test games made in the last 4 years, I just play old games instead. "
:rolleyes:
I bought one of Overlord's Korean 1440p monitors about two and a half years ago for $450. It does 120Hz with no problems, and the uniformity on it is absolutely perfect. While FreeSync/G-Sync and/or backlight strobing and 4K are features I'd like to have (they don't work together from what I know...
Don't buy a Spyder. The meter you want is the i1 Display Pro, or some variation of it.
Any colorimeter, including the Spyder or i1 Display Pro, can be used with ColorHCFR to take readings. If you want to take things to another level, if you use an HTPC on your television, you can have...
What do you have your brightness set at? I've found that I get eyestrain unless I have mine turned down significantly. Lower refresh rates caused eyestrain on CRTs, but LCDs hold a persistent image, so the refresh rate isn't generally going to matter in terms of eyestrain.
This mentality makes no sense. If you bought a 970, you are not the type of consumer that will buy a new GPU a year later. Price points change yearly and there is no reason to expect a significantly faster GPU for the same price a year and a half from the original launch. There was a modest...
I blame it on AMD fanboys. They all so desperately want AMD GPUs to be good, so they argue about how power efficiency doesn't matter and how AMD GPUs are "gonna be good" eventually, when DX12 finally takes off a year from now. It's irritating.
AMD can't compete with a GP102. They're going to have to use HBM2 just to compete with the 1080, and only because it would otherwise have too high of a TDP. Polaris is only in Maxwell territory perf/watt-wise. That is not good.
That has almost nothing to do with HBM. The 390 is GCN 1.1 while Fury was GCN 1.2. HBM saved maybe 15-20 watts at most. Also, the Fury had good perf/watt partially because the shaders were choked by the front end, keeping them way under-utilized.
It will probably hit the same clock speeds as a 1080, but they set the stock/boost clocks lower to keep the TDP down. Clock for clock, I bet it's 1.4x faster than a 1080, and I bet it is perfectly capable of hitting the same clock speeds, assuming the VBIOS allows a high enough power limit.
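For what it's worth, the clock-for-clock claim works out like this (the 1.4x per-clock figure is my guess from above, and the Vega clocks are purely hypothetical placeholders; only the 1080's 1733MHz reference boost is a real number):

```python
# Sketch of the clock-for-clock scaling argument. The 1.4x per-clock figure
# is a guess from the post above; the Vega clocks are hypothetical.

gtx1080_boost_mhz = 1733.0    # reference GTX 1080 boost clock
perf_per_clock = 1.4          # guessed Vega advantage at equal clocks

for vega_mhz in (1200.0, 1500.0, 1733.0):   # hypothetical stock / OC clocks
    rel_perf = (vega_mhz / gtx1080_boost_mhz) * perf_per_clock
    print(f"Vega at {vega_mhz:.0f} MHz: ~{rel_perf:.2f}x a reference GTX 1080")
```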
I've noticed DPC latency problems since at least the 7xx series. The problem people are saying exists with the 1xxx series is no different than what I've seen the entire time I've been using Nvidia cards since Kepler. I've had SLI the whole time, so I don't know if that was a factor. I've never had...