Why does that computer look like a trash can?
Sucks to be you.
No AA at all in the games with the highest frame rates, and no AF?
0x AF? In 2015? Is this for real?
No, sucks to be AMD. He'll just buy one of Nvidia's multiple options that support HDMI 2.0.
AMD doesn't seem to have gotten the message that, as a minority player, they don't get to dictate to the market what it does and does not need.
We use a TV for gaming at my place.
I've been using one for a while, but it's not that great for fast-moving games. Not terrible, but a monitor is much better. I'm sure the newer sets have improved, but so have monitors. I don't think many people will game on a 4K set, but if AMD doesn't support HDMI 2.0 it would still be an odd omission.
What? Why would I not use the 4K TV for 60Hz big-screen gaming? Are you kidding me?
Oh cry me a river.
1) TVs are not for gaming. They are terrible for gaming.
2) So since you're not gonna be using the TV for gaming, especially not 4K gaming, you're not gonna need 60Hz at 4K, which is practically the only difference between HDMI 2.0 and HDMI 1.4.
Bonus: HDMI 2.0 does not allow for FreeSync.
HDMI 2.0 is outdated and obsolete.
Nvidia's volunteer PR machine is out in full force.
Let the Fiji bashing begin!
Where?
Seems pretty chilled out in here to me?
It is all in your head.
Why the outrage?
Did you ever see their competitor provide the exact settings behind their performance claims?
In 2015?
The cards are awesome (unless the R300s are rebrands), and someone (not you) is already trying to find a reason to call them flawed.
I use a 1080p projector for my gaming area, mainly because consoles are only 1080p. I do have a PC connected to it for games like The Witcher 3 that run a lot faster on PC.
I will game at 4K on a triple-projector setup, so I need to figure out whether a Fury X2 or two 980 Tis will do it. Looking at preliminary 980 Ti SLI numbers, it won't.
I would rarely want to play my PC games on a TV, and I don't plan on having that setup anytime soon. The 4K market isn't that big right now either way, especially for gaming. The standard still seems to be 1080p most of the time.
Ok, so you're telling us we should all sell our TVs and buy models with DisplayPort?
If you want a new 4K TV, buy one with DisplayPort. Easy-freaking-peasy.
At any rate, TVs are not made for gaming. The latencies are huge, and it's generally not worth it.
How are they "giving up an easy win" when it literally doesn't matter to 99% of their customers? Most people who do 4K gaming do so on a monitor, most likely with DisplayPort.
HDMI 1.4 works fine for the vast majority of scenarios. It can even run 4K, just capped at 30Hz, which is fine for movies and such.
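For what it's worth, the 30Hz cap isn't arbitrary; it falls straight out of the link bandwidth. Here's a rough back-of-the-envelope sketch, assuming the standard CTA-861 4K timing (4400×2250 total pixels including blanking), 8-bit RGB, and the commonly quoted effective video rates for HDMI 1.4 (~8.16 Gbit/s) and HDMI 2.0 (~14.4 Gbit/s):

```python
# Rough check of 4K30 vs 4K60 against HDMI 1.4 and 2.0 link bandwidth.
# Assumptions: CTA-861 4K timing (4400 x 2250 total incl. blanking),
# 8-bit RGB with no chroma subsampling, and effective payload rates of
# 3 TMDS lanes x 8 bits at the max TMDS clock of each spec.

H_TOTAL, V_TOTAL = 4400, 2250   # active 3840x2160 plus blanking
BITS_PER_PIXEL = 24             # 8-bit RGB

LINKS = {
    "HDMI 1.4 (340 MHz TMDS)": 340e6 * 3 * 8,   # ~ 8.16 Gbit/s payload
    "HDMI 2.0 (600 MHz TMDS)": 600e6 * 3 * 8,   # ~14.40 Gbit/s payload
}

def required_gbps(refresh_hz: float) -> float:
    """Raw video bandwidth needed for 3840x2160 at the given refresh rate."""
    pixel_clock = H_TOTAL * V_TOTAL * refresh_hz   # pixels per second
    return pixel_clock * BITS_PER_PIXEL / 1e9      # Gbit/s

for hz in (30, 60):
    need = required_gbps(hz)
    print(f"4K @ {hz} Hz needs ~{need:.2f} Gbit/s")
    for name, capacity in LINKS.items():
        verdict = "fits" if need <= capacity / 1e9 else "does NOT fit"
        print(f"  {name}: {verdict} (~{capacity / 1e9:.2f} Gbit/s effective)")
```

By this math, 4K60 RGB needs roughly 14.3 Gbit/s, which only fits on HDMI 2.0 (or DisplayPort), while 4K30 at roughly 7.1 Gbit/s slips under HDMI 1.4's limit.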
Just curious, how many people buy a TV for their PC? Why not a monitor?