For me it's simply an unforgivable waste of energy. In these times of climate change, when we're apparently doing what we can to limit our energy usage in order to keep carbon emissions down, the cryptocurrency network is adding to a global problem while delivering no benefit to society.
They...
Someone with more knowledge than me can probably clear this up, but isn't the main danger failing fans? If so, it's a simple and cheap enough job to fit some new fans on the card, and you'd still be well ahead financially.
It doesn't make any sense to release a new product when your competition is well behind and you can use that time to build more profit from your already-paid-off current architecture.
Cheers :beercheers:
Hadn't thought about going Intel; I'll maybe explore that route instead, as Sins isn't particularly graphically intensive but it hammers the CPU.
Yeah, I thought about that, but it's only really used for playing video and TV and Sins of a Solar Empire, which is very much single-threaded. I run it at 4.2 GHz when I'm playing, but I figured switching to the AM4 platform might give a nice boost and also a decent upgrade path in the future.
What sort of single-thread performance gains would BR offer over the original Kaveri, clock for clock? I have a 7400K in a rather rubbish HTPC that I'm thinking of upgrading, but I can't find much info comparing the two unfortunately. Best estimate I can come to is about 15-20% faster CPU...
Northwood was a damned good core in its day; they just took it too far chasing clock speed with Prescott. The unexpected emergence of the A64 changed the game.
I might be recalling this wrong, but I seem to remember the DoD would basically quash anything like that, as they require two suppliers for this kind of thing.
Wasn't this just about Intel licensing GPU patents from AMD? AMD have a cross-licensing deal with Nvidia IIRC, so Intel need to license from one of them so that their GPUs don't infringe.
Forgive me if I've missed something here, but there's a problem with your system and it looks to be either the PSU or the graphics card. There's no sense in replacing both if only one of them is broken; try the new GFX card first, since you can already get a refund on that.
If you're not already at 4K, I'd echo tential and just stick with the 390 for now. Better stuff will be coming out soon, and that should drop the price of your upgrade when you do make the switch. You can always put the 390 to work mining for a while too and put that towards the next card.
Will we still see the same kinds of improvements through drivers in the DX12 era though? It seems to me that there's less AMD and Nvidia can actually do on the driver side to improve performance compared to DX11.
A simple wheel-and-turbine system will not only keep your hamster's children fit and healthy, it also cuts the electricity bill enough to cover those few extra dollars for the Lightning OC edition.
Waiting kind of makes sense too when there's a good chance of new cards being announced in a relatively short time frame and you won't need to borrow money to pay for it.
If you're living paycheck to paycheck and relying on credit cards then maybe hold off for a few months until Vega/1080ti news comes around. Stick your money in the savings account each month until then and you'll likely get a better upgrade and still have an empty credit card for emergencies.
I'd go with Bacon1. A new monitor plus GPU (not an old-generation used one) would be a big upgrade on what you have now. The monitor and GPU can most likely carry over into any new build in the next couple of years, saving you a few bucks there, and while the CPU might be a bit of a bottleneck...
All well and good, but it's a bit late now. The market position is fixed in people's minds based on the initial test runs, which most sites aren't going to rerun. By the time the weight of new reviews reaches a point where it will make a difference, Nvidia will have a refresh out and they'll...
Nvidia have made their money on the 1070/1080 though; it's not a big thing for them to drop the price and bring in the 1080 Ti at that price point. AMD, on the other hand, haven't had a high-performance, high-margin part out, so by being so late to the party they're going to lose out on all the high...
Unless the card is using utterly ridiculous amounts of power, I really don't care. A difference of 30, 40, 50, or 60 W is nothing; it's one fairly dim light bulb, yet it's somehow made out to be a massive selling point and a hugely important metric. Price/performance is the key: nail that and it'll sell.
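To put a rough number on that point, here's a quick sketch of the annual cost of a 50 W power-draw difference. The hours per day and electricity price below are illustrative assumptions, not figures from the thread:

```python
# Rough annual cost of a 50 W power-draw difference between two cards.
# hours_per_day and price_per_kwh are assumed values for illustration.
watts_delta = 50          # extra draw of the hungrier card, in watts
hours_per_day = 4         # assumed daily gaming time
price_per_kwh = 0.12      # assumed electricity price, $/kWh

kwh_per_year = watts_delta / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.2f}/year")
# prints: 73 kWh/year -> $8.76/year
```

Under those assumptions the difference is under ten dollars a year, which is the scale of the argument being made above.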
I'd go 1070 if the budget is there. The 1060 is a decent card, as is the 480, but in the event of a system rebuild in the next year or so he'll be able to carry the 1070 over, saving a few hundred on the build and not being left with a mismatched rig.
Don't think we'll see it for a while, like. They've got the high-end to themselves right now so it doesn't make much sense to cannibalize their own sales when they could save it as a spoiler and still have the next gen ready to rock in time to own that sector of the market again.
The mid to...
One of the truly legendary cards. I remember being blown away by the performance when it came out. Following it up with the 8800 GTS and 8800 GT was just the icing on the cake: a true high-end product stack that blew everything else out of the water.
Just hoyed a few numbers around: the 1080 and 1070 both scale to around 60% of their 1080p fps at 4K, whereas the 980 Ti and the AMD cards are in the 65-67% range.
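For anyone wanting to run the same comparison, the scaling figure is just a card's 4K fps divided by its 1080p fps. A minimal sketch (the fps numbers are hypothetical placeholders, not benchmark results):

```python
# 4K-vs-1080p scaling: what percentage of its 1080p fps a card keeps at 4K.
def scaling_pct(fps_1080p: float, fps_4k: float) -> float:
    return fps_4k / fps_1080p * 100

# A card dropping from a hypothetical 100 fps to 60 fps scales to 60%.
print(f"{scaling_pct(100, 60):.0f}%")  # prints: 60%
```

A higher percentage means the card loses less performance moving up to 4K, which is the sense in which the 980 Ti and AMD cards "scale better" above.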
Decent performance across the board, really, with the exception of the 970. AMD need to get something out at the high end though; the 1080 and Titan are just killing it in every benchmark.
The OP also mentioned using Photoshop which may benefit from the extra vram.
Bacon1 also posted an image earlier in the thread demonstrating that Civ6 will use more than 3GB of vram if it's there.
Anyhow, if he's planning to keep the card for a few years, I'd still recommend the 6GB version...