ShintaiDK
Lifer
- Apr 22, 2012
If there are TVs that have DP that would be news to me.
Panasonic has had some, for example. But it seems HDMI is simply nearing its limit.
If there are TVs that have DP that would be news to me.
I agree a single standard is better. But if TVs are making the switch, I hope home A/V amplifiers will do so at the same pace; it's so handy to have a single cable going out of your device.

The question is how viable HDMI is in the long run. DP is slowly getting into TVs. And I for one wouldn't mind a single standard for everything.
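Rough numbers behind the "HDMI is nearing its limit" point can be sketched quickly. This only counts raw pixel data (blanking intervals and protocol overhead are glossed over); the link rates quoted in the comments are the published HDMI 2.0 and DisplayPort 1.3 figures:

```python
# Back-of-the-envelope video bandwidth check (raw pixel data only,
# ignoring blanking intervals and any protocol overhead).
def raw_bitrate_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

uhd_60_8bit = raw_bitrate_gbps(3840, 2160, 60, 24)   # 8-bit RGB
uhd_60_10bit = raw_bitrate_gbps(3840, 2160, 60, 30)  # 10-bit RGB

# HDMI 2.0 tops out at 18 Gbit/s raw (roughly 14.4 Gbit/s of usable data
# after 8b/10b encoding); DisplayPort 1.3 offers 32.4 Gbit/s raw.
print(f"4K60  8-bit: {uhd_60_8bit:.1f} Gbit/s")
print(f"4K60 10-bit: {uhd_60_10bit:.1f} Gbit/s")
```

Even this optimistic calculation puts 10-bit 4K60 (about 14.9 Gbit/s) over HDMI 2.0's usable data rate, while DP 1.3 has headroom to spare.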
The Iris 6100 that I had (no eDRAM) could play CS:GO and TF2 at 1080p on medium to medium-high settings, so the new Iris with eDRAM can certainly play more than very light casual games, and it would be reasonable to look into that in a review.
It's a review of NUCs. You can't even put in a discrete graphics card. They are made to replace the home computers of people who don't play games beyond very light casual titles.
How do these benchmark scores translate into actual game performance? The good news is that this thing is pretty good at playing games from late in the last console generation at 1080p with a few of the visual bells and whistles turned on. Bioshock Infinite plays at a consistent 30 FPS with the settings on Low or Medium (depending on the scene). Anything older than that (Skyrim, Portal 2) can be played at 1080p with many of the settings turned up. It’s not a bad box for budget or older games, and if you’re looking for something tiny to use with Steam Big Picture Mode or SteamOS it makes a good case for itself.
...If you're a current NUC owner looking to upgrade, the Skylake NUC is a particularly big leap from the first- or second-generation Ivy Bridge or Haswell NUCs. Memory, storage, and GPU speeds are all much increased, though you won’t notice the bump in processor performance very often. If you’re looking for a mini PC with more CPU power, you’ll want to wait for the high-end quad-core Iris Pro NUC that’s coming later in the year.
Real-world gaming and driver issues
The problem, at least as of this writing, is with the BIOS, the graphics drivers, or both. Specifically, newer games (or slightly older games with the settings turned all the way up) either crash or completely freeze the computer. Try to turn the settings up to Ultra in Bioshock Infinite, for instance, and the NUC hangs until you hard-restart it. Fallout 4 simply refuses to launch under any circumstances.
Some research into the issue suggests that it’s a problem with Intel’s GPU drivers and newer versions of DirectX. Some gamers have gotten around the Fallout 4 issue by forcing the title to run at the DirectX 11.1 feature level, which I was also able to do after replicating the steps in that post. Even so, the game ran like a flipbook at the lowest settings, and it ought to be able to do better.
Downloading and installing a beta graphics driver package from Intel’s site promises to fix some issues, but we still had problems launching Fallout 4 without hacky fixes, and beta drivers can and will cause their own problems. That driver, for instance, fixed the Bioshock Infinite lockup problem, but Fallout 4 still wouldn't launch and after installing the driver the NUC had issues waking properly from sleep mode. Intel says it's aware of the bugs we're experiencing and is working on a fix, but it wouldn't tell us anything about what's causing the problems or when we can expect the update.
Seems like a good NUC to run benches, real gaming on the other hand...
3Dmark, such a good game, remember when I played it in....... oh wait. Just synthetics.
Just check the hundreds of Surface Pro 4 gaming videos on YouTube and you will notice HD 520 is quite capable inside a slim 12'' tablet/convertible, it actually beats 15W Carrizo with dual-channel according to NBC. Pretty sure Iris 540 with more thermal headroom inside a NUC should fare even better, so it's definitely viable for 'real gaming'. I don't think anyone denies Intel drivers still have a lot of room for improvement though.
Did you even read any post above you? That's exactly what we're criticizing.
If by "real gaming" you mean League of Legends, you could get that performance for much less years ago.
Iris Pro is in the uncanny valley of being too expensive to ignore mobile dGPU competition, yet less performant than said competition.
Sadly it's too late for Intel: AMD and Nvidia will release Polaris and Pascal and put Intel to shame again.

No, this is what I mean:
- Dota 2
- Counter Strike Global Offensive
- Minecraft
- League of Legends
- Starcraft 2
- Guild Wars 2
- GTA V
- Left4Dead 2
- Battlefield 4
- Dying Light
- Smite
- The Witcher 3
- Civilization V
- Shogun 2
- Heroes of the Storm
- TES Skyrim
- Final Fantasy XIV
- Crysis 2
- Diablo III
- Metro Last Light
- Heroes of Newerth
And considering Iris Pro 580 could very well deliver >2.5x this performance in actual games, you might want to do some research before calling Intel iGPUs useless (slideshow experiences) for anything but League of Legends next time.
Don't try to move the goalposts to dGPUs now; no one is disputing that dGPUs still offer better perf/$ in notebooks. For some reason, (most) OEMs go insane when it comes to pricing Iris/Iris Pro products, even though Intel ARK prices for these chips are usually similar to their bread-and-butter offerings (GT2).
The Core i3 6100 is easily the best budget-level chip on the market, but the DDR4 memory you'll need to buy for it is a little more expensive.
But as you can see from the benchmarks below, it offers a commanding lead over other budget solutions, including Intel's own last-gen Haswell chips. However, it is worth pointing out that our Haswell test chip runs at just 3.4GHz vs the 3.7GHz of the new Skylake part; ideally we'd have an older Core i3 4170 to test, operating at the same raw clock speed.
...The bottom line is this: Intel wins in this category because the performance is generally best-in-class, plus there are more upgrade options. A lowly Pentium can be swapped out for an i7, offering an explosive leap in performance - and that can be done with any motherboard and any memory. But if you're looking to future-proof your system as much as possible, putting the money into a decent Z170 (for Skylake) or Z97 (for Haswell) motherboard really is key.
Whether you're overclocking an i5 6500 or an i5 6600K, the results are explosive. In fact, if you have time, check out this analysis of the Core i7 3770K - still a respectable CPU even now. In that video, we compare it against an overclocked Skylake i5 and find that the newer chip is capable of much better performance, even paired with a mainstream graphics card like the GeForce GTX 970. But even running at stock speeds, these quad-core processors produce superb results on the vast majority of games, and if 60fps gameplay is your aim, both are highly viable contenders.
...The success of this varies on a per-game basis, but the trends become clear - the more memory bandwidth available to your system, the faster the CPU can process the game logic. Between 2133MHz and 3200MHz DDR4, we can see anything up to a 14fps difference - and that's before we've even factored in overclocking.
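The theoretical bandwidth gap behind those two memory speeds is easy to compute. A minimal sketch, assuming the usual dual-channel setup with a 64-bit (8-byte) bus per channel:

```python
# Theoretical peak bandwidth for a dual-channel DDR4 setup:
# transfers/s x 8 bytes per transfer per channel x 2 channels.
def ddr4_peak_gbs(mt_per_s, channels=2, bus_bytes=8):
    return mt_per_s * 1e6 * bus_bytes * channels / 1e9

print(f"DDR4-2133: {ddr4_peak_gbs(2133):.1f} GB/s")  # ~34.1 GB/s
print(f"DDR4-3200: {ddr4_peak_gbs(3200):.1f} GB/s")  # ~51.2 GB/s
```

That's a roughly 50% jump in peak bandwidth, which helps explain why bandwidth-sensitive game logic shows measurable fps gains.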
...While we would have liked to have included an AMD alternative, the fact is that Intel is way ahead of the game here - we will need to see AMD's upcoming Zen processors (due later this year) to find out if the red team is indeed back in the game. For now, in the mainstream enthusiast sector, Intel is the only viable option.
What's clear is that games depend more on the additional per-core power of the latest Skylake technology than they do on the additional cores offered by the Haswell-E enthusiast chips. Equally clear is that for the most part, moving up from six to eight cores offers only the very slightest of improvements. Without an overclock in place, the 6700K is fastest on all but one game - Crysis 3, where it requires the eight-core 5960X to beat it, a chip that costs three times as much.
Proving that single-core performance is still king is Far Cry 4, where the Core i7 6700K is far, far ahead of the competition - owing to the fact that although the game spins up to eight threads, the workload is dominated by one main thread. However, other results are essentially margin-of-error stuff. Far Cry aside, there is no absolutely conclusive win here, but the bottom line is straightforward enough: it's the cheapest Core i7 processor tested here that actually offers the strongest performance overall.
...So, in purely gaming terms, the Core i7 6700K is the fastest gaming CPU money can buy. However, in other tasks that scale across multiple cores - video encoding, for example - the many-core chips really come into their own and can completely blow away the newer Skylake chip. Our recommendation is pretty straightforward then - get the 6700K if your system is going to be used as an out-and-out games machine, but do consider the 5820K if productivity is part and parcel of your PC requirements. You may lose a little in terms of gaming performance, but you gain a whole lot more for tasks like multimedia creation and pure number-crunching.
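The six-to-eight-core observation above is exactly what Amdahl's law predicts when only part of a frame's work parallelizes. A minimal sketch, where the parallel fraction `p = 0.5` is a made-up value purely for illustration (real games vary widely):

```python
# Amdahl's law: speedup from n cores when only a fraction p of the
# work can run in parallel.
def speedup(p, n):
    return 1 / ((1 - p) + p / n)

p = 0.5  # assumed parallel fraction - illustrative only
for n in (4, 6, 8):
    print(f"{n} cores: {speedup(p, n):.2f}x")
```

With half the work serialized, going from six to eight cores adds only a few percent, while single-core speed lifts the whole frame time - consistent with the Far Cry 4 result.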
DRIVER VERSION: 15.40.20.4404 & 15.40.20.64.4404 - DATE: March 11, 2016
SUMMARY:
This driver fixes some hangs, graphics corruption and functional issues that were seen across a number of games and applications. A list of the issues addressed is included below. This driver also adds new beta support for the Vulkan 1.0 API for 6th Generation Intel Core and related processors.
Iris Pro NUC says hello!
https://benchlife.info/intel-finally-launch-skull-canyon-nuc-with-i7-6770hq-03172016/
i7 6770HQ and External GFX support.
Jesus, that's ugly. And you'd think you could fit an optical drive in something that size, especially if it's a $1000 HTPC.
Who on earth still uses opticals?
I've got a big shelf full of DVDs, and I'm sure as hell not buying them again digitally. Lots of them aren't on Netflix, and I don't want to screw around with a NAS (and the accompanying wasted electricity). Just put an optical drive in the thing!
Ugly? Yes. Optical drive? Hell no!

Jesus, that's ugly. And you'd think you could fit an optical drive in something that size, especially if it's a $1000 HTPC.
Might as well get a laptop.

Jesus, that's ugly. And you'd think you could fit an optical drive in something that size, especially if it's a $1000 HTPC.
Are you kidding? Look at the Witcher 3 video you linked above: 8fps at low settings at 1024x768. If people want to show slide shows on YouTube... fine for them.
It takes much more than 2.5x to make this remotely playable.
So yes, the iGPU is useless unless you want to play games figuratively from the last century.
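The arithmetic behind that objection, taking the quoted 8fps figure and the claimed >2.5x scaling at face value (and assuming, optimistically, that fps scales linearly with GPU throughput):

```python
base_fps = 8                  # Witcher 3 on low at 1024x768, per the video
scaled_fps = base_fps * 2.5   # claimed Iris Pro 580 scaling, taken at face value
frame_time_ms = 1000 / scaled_fps

print(scaled_fps)       # 20.0 fps - still short of a 30fps target
print(frame_time_ms)    # 50.0 ms per frame
```

Even under that generous assumption, 2.5x only takes 8fps to 20fps, which is why "much more than 2.5x" is needed before this counts as playable.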
You and your 3 other friends can just buy an external USB optical drive, like the 2 laptop users who also needed one.
An optical bay would add quite a bit of size on its own, and nobody really uses one anymore. So it's more of a waste than a benefit to add.