Again your link points out the reason.
Intel does not have the bottomless pockets you believe they have. They have to pick and choose their targets far more wisely than they have been doing up till now.
AMD won't have any high-performance chips at 20nm though. I'm not actually sure what it was they taped out last quarter; I've been scratching my head over that one.
I think the foundries handle production differently from Intel. Intel won't start high volume until yields reach a certain high point, but the foundries keep ramping across a number of fabs over time. I believe TSMC has already started 20nm SoC production in two fabs, with another couple coming soon.
It's only been one node, and FinFETs may have been a one-time hit at 22nm. There's a whole bunch of other reasons why Broadwell might not clock so well, though.
Altera to switch 14nm chip orders back to TSMC.
http://www.digitimes.com/news/a20140304PB200.html
Oh, and 450mm has been pushed out another 5 years - https://www.semiwiki.com/forum/content/3241-intel-450mm-delayed-until-2023.html
Based on that you can expect around 30 fps on the "performance" setting, so it's probably reasonably playable on low settings. There are likely to be better and cheaper options, though.
It's the obvious choice because it's heavily subsidized, yes - just like the original 7" with Tegra 3. The current Nexus 7 has sold poorly because non-Apple tablets long since became commodity items.
I believe there is a good chance of the Z3740 in the Nexus 8 but it'll have to be cheap as dirt...
Again - it's only "contrived" because certain APIs can't handle the effect.
On high-end hardware and Mantle, the effect is noticeable and a clear upgrade over existing blur effects. CrossFire 290Xs will have few problems with it, for example - and neither will next-gen single cards.
What you...
In the end it's about quality and performance. There is a clear quality gain for a large performance hit. If there is performance to spare, then isn't it worth it?
For me it's exactly the same as TressFX, and like TressFX 2.0, which has improved in both IQ and performance, StarSwarm's motion blur will likely improve too...
I thought the motion blur effect was extremely good actually - far better than the crap that is in most games.
Granted it's a very expensive effect and I'm not sure it's worth it yet, but in quality terms it is far better, for me.
You are aware that Intel17 *is* Ashraf, right? And every time you link one of his posts and somebody clicks it, he gets that bit closer to his 10 bucks per 1000 views?
The 32nm process AMD has been using for Bulldozer etc. is the exact same 32nm process they'd have been using had they not sold the fabs.
Had they not bought ATI, they'd have been left without Jaguar, Kabini, Trinity, etc. - just Bulldozer/Vishera on that same 32nm.
You're right they wouldn't have...
I should add that it's likely that MS won't just copy the CPU multi-threading for DX12. There's little reason that I can see for them not to copy the memory model as well, for example.
It will depend on a lot of different factors.
BF4 wasn't the best example to start with, for sure. It wasn't coded for Mantle from the ground up (it'll probably be a year before we see any games like that), and we all know it was beset by stability issues, the fixing of which rightly took precedence...
Mantle is more than just CPU multi-threading support. For example (see the sketch after this list):
Reduction of command buffer submissions
Explicit control of resource compression, expands and synchronizations
Asynchronous DMA queue for data uploads independent from the graphics engine
Asynchronous compute queue for overlapping of compute and graphics work
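Since DX12 is the comparison point anyway, here's a minimal C++ sketch of that queue separation using D3D12-style types as stand-ins - Mantle's SDK isn't public, so the names below are D3D12's, not Mantle's (assumes the Windows D3D12 runtime):

    #include <d3d12.h>
    #include <wrl/client.h>
    #pragma comment(lib, "d3d12.lib")
    using Microsoft::WRL::ComPtr;

    int main() {
        // Create a device on the default adapter.
        ComPtr<ID3D12Device> device;
        D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

        // One queue per engine type.
        auto makeQueue = [&](D3D12_COMMAND_LIST_TYPE type) {
            D3D12_COMMAND_QUEUE_DESC desc = {};
            desc.Type = type;
            ComPtr<ID3D12CommandQueue> queue;
            device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue));
            return queue;
        };

        auto gfx     = makeQueue(D3D12_COMMAND_LIST_TYPE_DIRECT);  // graphics engine
        auto compute = makeQueue(D3D12_COMMAND_LIST_TYPE_COMPUTE); // async compute
        auto dma     = makeQueue(D3D12_COMMAND_LIST_TYPE_COPY);    // async DMA for uploads
        // Work submitted to 'compute' and 'dma' can overlap work on 'gfx'
        // instead of serializing behind it.
    }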
If DX multi-threading were so simple, then why did they get it so wrong before? Where is it now? Microsoft doesn't even have anything remotely close to a DX12 beta - let alone any games being developed for it.
Mutant - it stands to reason that if devs are willing to code specifically for Mantle *now*, then more devs would be willing to once half the job (CPU multi-threading) is already done in DX12.
OCGuy - because Mantle is still in beta and I've never yet seen any bug-free software...
Well, considering the SDK hasn't been released yet and it's still beta software, that's asking a bit much. I don't see why AMD should detail how Nvidia can make use of it anyway.
All of the devs who have access to it so far have said that there is nothing inherently preventing Nvidia from using it...
You know, if DX12 basically copies Mantle's multi-threading - and why wouldn't it? - it's just going to make Mantle even easier for devs to implement.
There's nothing much stopping Nvidia cards from running on Mantle already remember. They just wouldn't get the GCN-specific benefits - and that's...
There is no way that the current DX with extensions will ever be able to match Mantle's true CPU multithreading. It will always fall down in certain scenarios.
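To make the multi-threading point concrete, here's a minimal C++ sketch of the Mantle/DX12-style model: each thread records into its own command buffer and the app submits the lot in one go. The CommandBuffer and Queue types are hypothetical stand-ins, not any real SDK's names:

    #include <cstdio>
    #include <string>
    #include <thread>
    #include <vector>

    // Hypothetical stand-ins for a thin-API command buffer and submission queue.
    struct CommandBuffer {
        std::vector<std::string> cmds;
        void draw(int batch) { cmds.push_back("draw batch " + std::to_string(batch)); }
    };

    struct Queue {
        void submit(const std::vector<CommandBuffer>& buffers) {
            for (const auto& cb : buffers)       // one cheap submission, app-defined order
                for (const auto& c : cb.cmds)
                    std::printf("%s\n", c.c_str());
        }
    };

    int main() {
        const int numThreads = 4;
        std::vector<CommandBuffer> buffers(numThreads);
        std::vector<std::thread> workers;

        // Each thread records its own command buffer in parallel - no shared
        // immediate context to serialize on, unlike DX11.
        for (int t = 0; t < numThreads; ++t)
            workers.emplace_back([&buffers, t] {
                for (int batch = 0; batch < 3; ++batch)
                    buffers[t].draw(t * 3 + batch);
            });
        for (auto& w : workers) w.join();

        Queue{}.submit(buffers);  // main thread submits everything at once
    }

DX11's deferred contexts nominally allow this kind of thing, but the driver still serializes behind the scenes, which is why it never delivered.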
There is no way that DX will be able to fully support the entire capabilities of both Nvidia's and AMD's hardware, especially going forward. To get the...
And what do you think has given Microsoft and Khronos the hurry up? This is happening BECAUSE of Mantle.
You know what they say about putting lipstick on a pig.
Even if AMD did react to the mining craze, they'd only just be getting the chips back now.
The console talk is that the PS4 supply issue will be over by April. It's possible - though imo very unlikely - that AMD is the cause of it, but if so it'll be over soon anyway. I just doubt it very much...
This is basically why you have to take TPU's numbers with a huge grain of salt. I knew there were plenty of other things that were out of whack but couldn't be bothered looking for them all.
W1zzard really needs to get his act together on his benchmarking runs.
Yes, desktop share went up 1.8%. It's mobile that is cratering, for two reasons:
1) They are very weak in mobile.
2) Something had to give to make way for the consoles.
There are 5 million consoles being sold every quarter that aren't being counted.
This, basically. I'm not 100% convinced yet that AMD will sell the R7 265 at $150. You can find the 260X for less than $120 though, regardless of what nearly every review stated.
To be frank, there are smarter options than that.
http://www.newegg.com/Product/Produc...82E16817139049
$30 and...
The 750 Ti will have a clear advantage regarding passive cards and heat pumped into the case, of course - now THAT is a worthwhile advantage in a mini-ITX build.
The 750 Ti has been measured to draw just over 60W during gaming. Getting the rest of the system to only draw another 20W seems highly unlikely.
The card still needs at least an i3 to push it; otherwise you'd be better off with something like a 7750 instead. That's still at least 40W+ during gaming...
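Rough sum using those numbers: ~60W for the card plus 40W+ for an i3-class platform already puts the whole system past 100W under gaming load - nowhere near an 80W total.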
Yes I do get it. There are two parts to perf/W but for some reason certain quarters only seem to want to concentrate on the W part. :rolleyes:
Let's use your numbers (without adding the extra 4W like you did):
Power
750 Ti: 124W
265: 170W
FPS
750 Ti: 49.9
265: 63.3
Overall
750 Ti: 0.402 fps/W
265: 0.372 fps/W
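That's just FPS divided by watts: 49.9 / 124 = 0.402 fps/W for the 750 Ti versus 63.3 / 170 = 0.372 fps/W for the 265 - roughly an 8% system-level perf/W edge to the 750 Ti, nothing like the gulf the card-only TDP figures suggest.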
When discussing platform perf/W, yes. System perf/W is the TRUE perf/W.
50-100W? The R7 265 draws between 30 and 45W more on average. That leaves the 750 Ti advantageous to those people with... I dunno, 200W PSUs?
Oh so now we should change the reviewing setups? We could always go back...