Silly Q...
Where can I spend these BitCoins if I'm from the UK?
http://www.spendbitcoins.com/ is what I have used, and it looks like they can give you UK Amazon gift cards if that is where you want to shop. Haven't used them since they updated the site and added the "tool", though.
Or you can just transfer into local currency at one of the exchanges.
What are your temps like? If they're low enough, it might make sense to mine for heat and a little money.
Is it even worth it to mine right now on a GTX 295? Heating my room with my GTX 295 is cheaper than using my old '70s electric baseboard heaters.
Only getting 100 Mhash/s.
Generally no, but if your electricity is cheap enough... maybe?
At my electricity cost ($0.1215 per kWh), at current difficulty, and at the current bitcoin value, 1.04 W per Mhash is the break-even point.
So if your card is putting out 100 Mhash, and increases your power usage by less than 104 W, I suppose it could be worth it. Possibly a bit more if you want to account for the free heat. My understanding is that the GTX 295 uses nearly 300W at peak usage, so it doesn't look like it'd be profitable... unless you pay a lot less for electricity.
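Here's a rough sketch of that break-even arithmetic in Python, if anyone wants to plug in their own numbers. The difficulty, bitcoin price, and daily-yield figures below are placeholders I made up for illustration, not the figures behind my 1.04 W number; swap in current values before trusting the output.

# Break-even watts per Mhash/s: the power draw at which a day of
# mining revenue exactly covers a day of electricity.
ELECTRICITY = 0.1215      # $ per kWh (my rate)
DIFFICULTY = 1_500_000    # assumed network difficulty (placeholder)
BTC_PRICE = 8.00          # assumed $ per BTC (placeholder)
REWARD = 50               # BTC per block

# Expected BTC per day for a 1 Mhash/s miner:
# (hashes per day) / (difficulty * 2**32) * block reward
btc_per_day = 1e6 * 86400 / (DIFFICULTY * 2**32) * REWARD
revenue_per_day = btc_per_day * BTC_PRICE      # $/day per Mhash/s

cost_per_watt_day = 24 / 1000 * ELECTRICITY    # $/day to run one watt

print(f"Break-even: {revenue_per_day / cost_per_watt_day:.2f} W per Mhash/s")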
I pay _a lot_ for electricity, especially in the winter when the heat is on. I just ran for 2 hours and have 0.007 bitcoins generated :whistle: Seems like a waste.
Seriously, if you heat your house with an electric heater there is zero reason not to mine bitcoin, even if Nvidia cards aren't as efficient as AMD's. Whether your rig uses 200W or 1000W, it's still converting 100% of that power into heat. You're paying right now to heat a giant resistor; you might as well get some work done while you're doing it.
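If it helps, here's the arithmetic as a quick Python sketch. The daily yield and bitcoin price are made-up placeholders, not real figures; the point is just that mining revenue comes straight off a heating bill you were paying anyway.

# Every watt the rig draws ends up as heat in the room, so a 300W
# rig displaces 300W of electric heater output at the same cost.
ELECTRICITY = 0.1215   # $ per kWh
RIG_WATTS = 300        # e.g. a GTX 295 near full load
BTC_PER_DAY = 0.08     # assumed daily yield (placeholder)
BTC_PRICE = 8.00       # assumed $ per BTC (placeholder)

heat_cost = RIG_WATTS / 1000 * 24 * ELECTRICITY  # $/day, same as a 300W heater
revenue = BTC_PER_DAY * BTC_PRICE                # $/day recovered by mining

print(f"Heat you'd buy anyway: ${heat_cost:.2f}/day")
print(f"Mining revenue:        ${revenue:.2f}/day")
print(f"Net heating cost:      ${heat_cost - revenue:.2f}/day")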
I don't know enough about bitcoin mining to compare it to an electric space heater. But my intuition tells me there is a little bit of convoluted logic in that statement.
MrTeal is absolutely right. A 1000W space heater and a 1000W computer both generate the same amount of heat. Law of conservation of energy.
First of all, it's a dumb statement because space heaters and graphics cards use the voltage differently. A space heater sends the current through wires that are spread out to dissipate the heat better. A GPU's heat is generated from a small chip, and you would have to modify it.
Second, I wasn't referring to the thermal properties of a GPU; by "logic" I was referring to the statement "there is zero reason not to mine bitcoin."
There are plenty of reasons not to mine bitcoin. It uses a tremendous amount of energy from what I hear. I don't even know how much is needed to get any real financial return. But above and beyond that, you have to weigh the benefits against the cost of the energy being used.
I hate to make an appeal to authority, but as an electrical engineer I can tell you that you are just wrong on that. The heat density of the GPU doesn't matter; it still imparts the same amount of energy into the air. A space heater uses a much larger element because it's passively cooled, and it therefore requires more surface area to keep its temperature under control.
I didn't say there was zero reason not to mine bitcoin; I said there was no reason to leave a GPU idle when you're already running an electric heater. When you're mining, you're converting electricity into heat with basically 100% efficiency. When you run an electric heater, you do the same. The difference is that an electric heater provides no additional benefit.
"If your rig uses 200W or 1000W, it's still converting 100% of that power into heat."
Nope... that's completely wrong... unless it's on fire... then you'll also get that caustic smoke too.
Heat from a computer is wasted energy... if you're converting 100% of that power from the computer to heat, I'd be calling 911 NOW... 'cause it's on fire.
Heat generated by a computer is wasted energy, not the sign of an efficiently running computer.
Saying that a computer converts 100% of its power into heat is saying that a hotter computer runs better. Pentium 4, anyone?
That's why we cool computers... not fry eggs with them.
MrTeal, I understand that you are an EE, and this is by no means a personal attack. But as an EE you also know that there are several different concentrations in EE. Some EEs do RF, some code, some do signal processing... some even become patent examiners (but once they do that, they forget how to be EEs... they're just examiners).
Well... I'm an electrical engineer too. I don't like to mention that because most people just get a blank look on their face or think I do wiring in houses (I had a recruiter call me one time for an opening as an electrician... I told her that I was an electrical engineer and she said "What's the difference?" I just hung up at that point). Anyway, I have to dumb it down for most people and just say I design the internals of your phones and computers... they still don't get it, but they get a better idea... my nephew thinks I'm in IT.
Well... my MSEE is in microelectronics and solid-state semiconductor devices. I did the thesis route. I design PCBs, IC chips, and circuits... and still dabble in research.
That's why I know that if your computer is converting 100% of its power to heat... IT'S ON FIRE.
What have I started?
I think MrTeal is basically right.
In some situations, even waste heat is doing something useful, though, such as heating a home during the winter months. But we're rapidly reaching the end of winter, and soon that waste heat will run counter to people's air conditioning systems...
People still do not talk enough about the knock-on effects of running a video card nonstop, whether it's burning out the fan faster or heating up the rest of the case more than normal.
That's my point... it's not 100%. I did explain it better in my last post, but you beat me by one post. And if you actually read the post of mine that you're referring to, there's no thesis there (just that I took the thesis route for my masters).

So while it's not 100%, it's probably well in excess of 95% in most cases.
An argument between people who understand physics and people who don't.