As the owner of a Samsung D8000 plasma (2010 model) and a Panasonic GT60 (2013 model), I'll share what I have observed.
The older Samsung is more susceptible to image retention and burn-in than the GT60.
What I have noticed with both of them, especially the older samsung, is that image retention is only...
I don't think it has much to do with sizes.
I think it has more to do with the fact that OLEDs degrade with usage (they lose brightness over time).
Well, that happens to pretty much all light sources, including the LEDs in an LCD's backlight, so what's the problem?
The problem is that in an OLED...
I like the idea of a 40" 4K monitor with a quality VA panel (similar PPI to 27" 1440p monitors), though I would prefer one with variable refresh rate :(
Aside from resolution, a Panasonic plasma will blow any LCD monitor right out of the water in terms of picture quality...
Many of you are forgetting that high-res textures eat up a lot of VRAM yet have almost no impact on performance (if you have enough VRAM).
I've encountered two games that require more than 2 GB of VRAM @1440p:
Skyrim with high-res textures
and Total War: Rome 2
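To put that claim in numbers, here's a back-of-the-envelope sketch. The 4096x4096 RGBA8 texture is an assumed example (real games often use compressed formats that shrink this considerably), but it shows why a high-res texture pack fills VRAM fast:

```python
# Rough VRAM footprint of one uncompressed texture.
# Assumptions: RGBA8 (4 bytes per pixel) and a full mipmap chain,
# which adds roughly one third on top of the base level.

def texture_vram_mb(width, height, bytes_per_pixel=4, mipmaps=True):
    base = width * height * bytes_per_pixel
    total = base * 4 / 3 if mipmaps else base
    return total / (1024 ** 2)

# One 4096x4096 RGBA8 texture with mips is ~85 MB, so a few dozen
# of them fill 2 GB of VRAM without adding any meaningful GPU work.
print(round(texture_vram_mb(4096, 4096)))  # -> 85
```

The framerate barely moves because sampling a bigger texture costs roughly the same per pixel as sampling a small one, as long as it fits in VRAM.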
I'd be surprised if we don't see any...
GURU3D measured 87°C with an infrared camera at the back of the PCB, through a hole in the backplate.
It is very likely that the VRMs themselves are over 100°C. And those measurements were made at stock clocks without any additional voltage.
That is not a good sign for those who want to OC.
IMO...
MSI was the only AIB that I know of that did not voltage lock ANY of their 7950/7970 cards.
Plus their custom cards are usually among the best.
Though buying blindly is always a bad idea since every AIB makes mistakes.
It looks like the GTX 780 Lightning heatsink, which has a different orientation of heatpipes/fins compared to the old Twin Frozr heatsinks.
That is a good design choice, since the Hawaii die is an elongated rectangle, so the design should result in better distribution of heat among multiple...
There are many kinds of VA panels out there, but your assumption is generally false. The highlighted Dell monitor uses an AMVA panel, and the same goes for the two BenQ GW monitors.
VA panels may not be perfect for colour-critical work (professional video/photo editing).
But IMO the most...
High-res textures have very little impact on framerate, but they have a huge impact on IQ and they will eat up your VRAM.
this reminds me of the alleged 640k bill gates quote :thumbsup:
From my own testing, Rome 2 can use over 3 GB of VRAM @1440p. Skyrim can easily pass 2 GB with high-res...
So if it's 30% faster than a quiet-mode 290X, that would make it ~45% faster than a stock GTX 780 @1440p:
As much as I would love for that to happen it just does not sound realistic
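A quick sanity check on those two numbers (the 30% and 45% figures are the quoted claims, not measurements of mine):

```python
# If card X is 30% faster than a quiet-mode 290X, and that is supposed
# to make X ~45% faster than a stock GTX 780, then the implied gap
# between the 290X (quiet) and the 780 follows by simple division:
speedup_vs_290x = 1.30
speedup_vs_780 = 1.45
implied_290x_vs_780 = speedup_vs_780 / speedup_vs_290x
print(round(implied_290x_vs_780, 3))  # -> 1.115, i.e. 290X quiet ~11.5% ahead of a stock 780
```

So the claim only holds if a quiet-mode 290X is already ~11.5% faster than a stock 780 @1440p, which is roughly in line with review numbers.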
What every single reviewer fails to do is state the average core clocks for each run.
Not just for the 290X but for all cards.
Nobody wants to know the meaningless boost clocks.
These guys review GPUs for a living and they don't bother stating the actual core clocks!? That is just beyond...
290X @ 822mhz:
290X @ 856mhz:
290X @ 852mhz:
Look at those clocks! I think aftermarket solutions will do wonders for this card.
SOURCE
and unlike my 7950 it likes voltage:
That's a lot of watts though.
Yeah that is what I said :)
But the HD7870 has exactly 2x the specs of the HD7770 and the same clocks, except for a tiny 6.7% higher memory clock (1200 vs 1125).
And according to the graph I posted above, the scaling is almost perfect:
91/46 = 1.978 <--- almost 2x the performance
So the 290(non X) should have exactly 2x the specs of 270X(HD7870) (or 4x HD7770)
1280-->2560 shaders
32---->64rops
256--->512bit
The 290 @1050/1400MHz should be at 200% on this graph (assuming perfect scaling):
Of course those will not be stock clocks and scaling will not be this good, but if...
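The arithmetic behind the posts above, as a sketch (the scores 46 and 91 are read off the graph I posted; the 290 specs are the doubling assumption, not confirmed figures):

```python
# HD7770 -> HD7870 is an exact doubling of shaders/ROPs/bus width at
# near-identical clocks, and the graph shows near-perfect scaling:
hd7770_score, hd7870_score = 46, 91   # relative scores from the graph
scaling = hd7870_score / hd7770_score
print(round(scaling, 3))              # -> 1.978, almost exactly 2x

# Applying the same logic to a hypothetical 290 (non-X) with exactly
# 2x the 270X's units (each pair is 270X value, assumed 290 value):
r290_specs = {"shaders": (1280, 2560), "rops": (32, 64), "bus_bits": (256, 512)}
assert all(new == 2 * old for old, new in r290_specs.values())
```

If Hawaii scales like Pitcairn did over Cape Verde, the 200% estimate is the perfect-scaling ceiling; real scaling will land somewhat below it.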
hmmm I have never heard of the PF series. How does it compare to the ST/GT/VT/ZT series?
I've got a 50" GT60 and I love it. In terms of picture quality it completely destroys my Catleap. I haven't tested it for gaming yet, but according to hdtvtest.co.uk it is supposed to be pretty fast.
ANSI...
assuming seronx is right:
87.5% more shaders
100% more rops
33.3% more bandwidth @6ghz vs 55.5% more bandwidth @7ghz
Even though the 7970 has a lot of bandwidth, that thing might be unbalanced, so a 55.5% increase in bandwidth seems like a smart choice.
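A quick check of those percentages. I'm assuming a 7970 GHz Edition baseline (2048 shaders, 32 ROPs, 384-bit bus at 6 GHz effective memory) — that baseline is my guess, since it's the set of figures the percentages line up with:

```python
# Baseline: Tahiti (7970 GHz Edition). New chip: seronx's figures,
# assumed to be 3840 shaders, 64 ROPs, 512-bit, at 6 or 7 GHz memory.
base = {"shaders": 2048, "rops": 32, "bw": 384 * 6}
new6 = {"shaders": 3840, "rops": 64, "bw": 512 * 6}
new7 = dict(new6, bw=512 * 7)

pct = lambda old, new: round((new / old - 1) * 100, 1)
print(pct(base["shaders"], new6["shaders"]))  # -> 87.5
print(pct(base["rops"], new6["rops"]))        # -> 100.0
print(pct(base["bw"], new6["bw"]))            # -> 33.3
print(pct(base["bw"], new7["bw"]))            # -> 55.6
```

(The last figure rounds to 55.6 rather than the 55.5 quoted above; same number, different rounding.)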
I was using a frame limiter (RadeonPro) in Skyrim, since the game engine gets buggy at high frame rates. I wasn't using vsync due to input lag.
While using the frame limiter I noticed some horrible, very consistent tearing (always at the same part of the screen, if I remember correctly). I tried...
If the next high-end 20nm chip from AMD (or Nvidia) is ~400mm², it should have 30-45% more transistors than GK110.
If such a chip were only 10% faster than GK110, which has horrible performance per transistor (when you look at gaming), it would be a disaster.
Hell, a 20nm 300mm^2 chip designed...
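Where the 30-45% range comes from, as a sketch. Assumptions: GK110 is ~7.1B transistors on ~561mm² at 28nm, and 20nm gives roughly 1.85-2.0x the density (a planar-node rule of thumb, not a foundry figure):

```python
# Transistor budget of a hypothetical ~400mm^2 20nm chip relative to
# GK110 (~7.1B transistors, ~561mm^2 at 28nm), for two density guesses.
gk110_transistors, gk110_area_mm2 = 7.1e9, 561.0
chip_area_mm2 = 400.0

for density_gain in (1.85, 2.0):
    transistors = gk110_transistors * (chip_area_mm2 / gk110_area_mm2) * density_gain
    extra_pct = (transistors / gk110_transistors - 1) * 100
    print(f"{density_gain}x density -> {extra_pct:.0f}% more transistors")
# -> 1.85x density -> 32% more transistors
# -> 2.0x density -> 43% more transistors
```

So even a chip a good deal smaller than GK110 should carry a substantially larger transistor budget at 20nm.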
Is there anyone who knows which Gigabyte 7950s are voltage locked?
These Gigabyte models are really cheap where I live, so I am probably gonna buy a second one for mining.
I can choose between a 900MHz one and a 1000MHz one. The 1000MHz one is locked, right?
Also I have heard you need to...
Thanks, that worked.
However, my 7950 does not behave like your 7970. With aggression set to 13 and gpu_thread_concurrency set to 20000 I get ~350 khash/s, but with aggression set to 19 I get ~650 khash/s. That's at 1130/1800 (core/mem).
I guess I will just mine btc during the day and ltc at night ;)
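For reference, the fragment of my reaper litecoin.conf showing only the two settings discussed above; the rest of the file (pool host/port/credentials and the other tuning keys) is omitted here:

```
aggression 19
gpu_thread_concurrency 20000
```

With this card, aggression made a much bigger difference than thread concurrency, so it's worth sweeping that value first.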
Are you guys experiencing serious lag on the desktop while mining litecoins with reaper? I can't even watch videos while mining LTC :colbert:
EDIT:
@bluesqueare07
660 (non-Ti) SLI with 100% scaling would only be ~15% faster than a heavily OC'd 7950. Two 660s would be 40% more expensive...
40 ROPs and 320-bit!? Ouch. That should make the Titan ~20% faster.
If these specs are true then this card will be under 10% faster than the 7970 GHz Edition :confused: that's baaad
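Where the "~20%" in my post above comes from, comparing the rumoured specs against Titan's 48 ROPs and 384-bit bus at equal clocks (the rumoured figures are unconfirmed):

```python
# Titan (GK110): 48 ROPs, 384-bit bus. Rumoured card: 40 ROPs, 320-bit.
# At equal core and memory clocks, both ratios give the Titan a 20% edge:
print(48 / 40)    # -> 1.2, i.e. 20% more ROP throughput on Titan
print(384 / 320)  # -> 1.2, i.e. 20% more memory bandwidth on Titan
```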
Yeah, there is no way my 8+6-pin card could do 1.25V on the stock TF3 heatsink without reaching dangerous temps. 1150mV was about as far as I could go with the TF3 heatsink. ASIC quality is 88.9% and my card does not respond well to extra voltage. The only way for me to reach higher clocks is to lower...
I am pretty sure that does not work anymore. Setting unofficial overclocking to 1 will cause weird flickering.
Following this guide worked for me (steps 9-10 are enough): http://forums.overclockers.co.uk/showthread.php?t=18431335
It may look weak without fins of any kind, but it has always been the best performer according to infrared thermography:
http://www.hardware.fr/articles/853-17/thermographie-infrarouge-cartes-graphiques.html