The Claymore miner only charges a 1% fee if you're just using it in ETH mode. You also want to look at the watt draw from the wall. For me, Claymore uses slightly less power and mines about 1.5 MH/s faster in ETH mode, which makes it worth it even with the 1% fee. Of course, it's not a...
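A quick sanity check on whether the 1% fee is worth it. Only the 1% fee and the ~1.5 MH/s advantage come from my numbers above; the 22 MH/s baseline is a made-up placeholder for illustration:

```python
# Compare effective hashrate after the dev fee against a fee-free miner.
stock_rate = 22.0            # MH/s with a no-fee miner (assumed baseline)
claymore_rate = 22.0 + 1.5   # MH/s before the 1% dev fee (the ~1.5 MH/s gain is from the post)
fee = 0.01                   # Claymore's ETH-only dev fee

effective = claymore_rate * (1 - fee)
print(f"stock:    {stock_rate:.2f} MH/s")
print(f"claymore: {effective:.2f} MH/s effective")
print(f"gain:     {effective / stock_rate - 1:+.1%}")
```

With those assumed numbers the fee-adjusted rate still comes out comfortably ahead, which is why the 1% doesn't bother me.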
Paying about $0.11/kWh here. I never got the expected decred from the dual miner, and there were always a fair amount of rejected shares too. Because decred mining hits the GPU memory, the cards get REALLY hot, and the power draw for me was at times 25% more than mining ETH alone. Not worth killing the cards for a...
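To put that 25% extra draw in dollars: a rough sketch assuming a hypothetical 800 W rig running 24/7. The wattage is a made-up example; only the ~25% overhead and the roughly $0.11/kWh rate come from my posts above:

```python
# Estimate the monthly cost of the extra power from dual mining decred.
base_watts = 800.0   # assumed rig draw mining ETH only (placeholder)
overhead = 0.25      # ~25% more draw when dual mining (from the post)
rate = 0.11          # electricity cost in $/kWh (from the post)

extra_kwh_month = base_watts * overhead * 24 * 30 / 1000
print(f"extra energy: {extra_kwh_month:.0f} kWh/month")
print(f"extra cost:   ${extra_kwh_month * rate:.2f}/month")
```

Even before factoring in the heat stressing the memory and VRMs, the extra electricity alone eats a big chunk of whatever the decred side pays out.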
I tried the Claymore miner...definitely better speeds on ETH but it draws too much power and gets WAY too hot to dual mine decred. Not the GPU but the on-card memory and VRMs.
Once I factor in the extra power usage, 2% fee, etc. it looks like it'd be an extra $5 monthly MAYBE if I actually get...
Well, you're talking about four 390s... that's more than a gaming setup needs. I was talking about people building multi-card rigs just for mining... I don't see them making back their investment, let alone turning a profit, at this point.
Four 390s ($300 each) plus another $300 for mobo/cpu and...
But you have to look at difficulty. Even if your earnings only dropped 20% each month, you'd have a hard time maintaining profits after just a couple of months. If you just bought cards, you'd never see a profit.
https://etherscan.io/charts/difficulty
https://etherscan.io/charts/hashrate
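The compounding effect is easy to underestimate. A sketch, assuming a hypothetical $100 first-month income; the 20% monthly decline is the figure from my post, the dollar amounts are placeholders:

```python
# Cumulative income when monthly earnings shrink 20% per month.
income = 100.0   # assumed first-month income (placeholder)
decay = 0.20     # 20% drop each month (from the post)

total = 0.0
for month in range(12):
    total += income
    income *= (1 - decay)

print(f"12-month total: ${total:.2f}")
# Geometric series: even mining forever, income never exceeds first_month / decay.
print(f"all-time ceiling: ${100.0 / decay:.2f}")
```

With those numbers the all-time ceiling is $500, which is barely more than the cost of a single 390, let alone four of them plus the rest of the rig.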
Really nice builds here. I'm still amazed at the amount of money people are investing considering the value of ETH and the increasing difficulty, not to mention power costs.
Ya, as I mentioned I used the old drivers that people recommend...I actually tried many of them without any luck. Supposedly it has something to do with CUDA or the way Win10 addresses nvidia cards. I've never been able to fix it. I am using 15.12 with my AMD cards.
Any secret to getting nvidia cards and Win10 to play nice? I have a GTX 760, and when I threw it into my old Vista system I was hashing at 14 MH/s. In Win10 I get about 3.
I tried the CUDA miner along with the 347 nvidia drivers but no joy.
They generally will only use the manufacture date if you don't have proof of purchase; otherwise, the warranty should be based on the purchase date.
I don't know if this is relevant to your product: https://www.asus.com/support/images/upload/05922033-28b5-40a3-8ef2-314781c08f4d.pdf
Just CPU and RAM will get you to the BIOS, plus a PSU to power it all. A video card isn't needed as long as the motherboard has a video port you can connect to a monitor.
If this is happening with multiple sticks of RAM I'm thinking you have a bad board at this point or there is an incompatibility with the RAM itself. The only other thing you could try is another brand of RAM.
Is it in the case already? I know it's a huge pain, but I once had an issue where something was touching the case that shouldn't have been. I took the mobo out of the case and it worked, and when I remounted it carefully everything was fine. Also pull the CMOS battery and don't just use the...
That's definitely one I'm looking at.
At this point I think it's between these 3, but I'm still not sure which one:
http://www.newegg.com/Product/Productcompare.aspx?CompareItemList=-1%7C13-157-633%5E13-157-633%2C13-128-841%5E13-128-841%2C13-128-844%5E13-128-844
You can see the notes on their own BIOS page: http://www.asrock.com/support/download.asp?cat=BIOS
The changelog says "Remove SKY OC function." on all of their latest BIOS releases that include it. Various tech sites are reporting it now too.
Ya, looks like I'll have to. I see today that asrock is removing SkyOC from their BIOS...no more OC'ing non-K chips. I really do want USB 3.1 though, otherwise that board looks OK. I'll have to keep looking.
I'm looking for a z170 board to OC an i5 6500 (BCLK). My budget is pretty tight and I'd like to keep it around $130 if possible.
Things I'm looking for:
Ability to OC with BCLK
m.2 support
USB 3.1
support for DDR4 3000 or better RAM
Maybe that's a tall order for the price. But any...
I currently run a Core i7 920 overclocked to 3.3GHz. If I went with an X5650 what would the difference in power draw roughly be? For average use am I looking at a lot of additional power usage?
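For a rough ceiling, you can compare rated TDPs: the i7 920 is a 130 W part and the X5650 is 95 W. Actual draw at average use sits well below TDP on both chips, and the 3.3 GHz overclock pushes the 920 higher still, so treat this as a ballpark only; the electricity rate and daily load hours below are assumptions:

```python
# Ballpark yearly cost difference from the TDP gap alone.
# Rated TDPs: i7 920 = 130 W, Xeon X5650 = 95 W (ratings, not measured draw).
tdp_gap = 130 - 95    # watts
rate = 0.12           # assumed $/kWh
hours_per_day = 8     # assumed hours under load per day

kwh_year = tdp_gap * hours_per_day * 365 / 1000
print(f"energy gap: {kwh_year:.0f} kWh/year")
print(f"cost gap:   ${kwh_year * rate:.2f}/year (upper bound)")
```

So even taking the TDP gap at face value, the difference works out to pocket change per year; in practice the overclocked 920 likely draws more than its rating, which only improves the case for the X5650.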
Well, I've about given up on figuring this one out... I've made no progress on why AHCI would cause such a drastic drop in performance.
It's the Intel one... I've also tried others without any change. It worked fine just minutes before switching to AHCI, so I doubt that's it, but I did give it a try.
AHCI allows hot swapping of drives which is why I wanted to keep it...very convenient for removable storage bay/backups. Otherwise I'd probably just go back to IDE. Although it is puzzling regardless.
I know that's what I've found too.
I can't figure it out. IDE gives me a boost of about 50% as you can see from my numbers and nothing seems to change that.
I did see a couple of mentions of similar issues but no one seemed to solve it.
I originally used the Windows one and then updated to the latest Intel driver. The Intel driver did increase speeds by maybe 10% but still nowhere near my original IDE speeds.
I just realized that my mobo was set to IDE instead of AHCI. I switched it over and made the required Windows changes and my drive speeds are much slower now...why would this happen?
IDE mode:
Sequential Read (Q=32, T=1): 174.517 MB/s
Sequential Write (Q=32, T=1): 173.800 MB/s
Random...