Got a couple of questions for you electronic/circuit gurus.
From what I understand, a resistor drops some of the voltage from a source, and the higher its ohm value, the more voltage it shaves off.
Now with LEDs, I've read that even if you have, say, a 3V LED and a 3V battery, it's still better to get a battery with a slightly higher voltage rating and put a resistor somewhere in the circuit, even if it's a mere 0.10 ohm. The main reason is to prevent thermal runaway, since LEDs start to draw more amps the longer/warmer they run.
Now what I don't understand is how resistors actually prevent thermal runaway. I thought resistors didn't really do much to limit amperage draw, since whatever you're powering decides how many amps it wants to eat. I'm sure there's an easy answer for this, but I can't seem to find it (that, or I'm just an idiot! lol)
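For what it's worth, here's a rough sketch of the feedback effect people describe, using made-up numbers (the 3.2 V supply, 0.5 ohm, and forward voltages are hypothetical, just to illustrate). An LED's forward voltage sags as it heats, which lets it pull more current; with a series resistor, the current is pinned at I = (Vs − Vf) / R, so a small sag in Vf only nudges the current instead of letting it run away:

```python
# Sketch of how a series resistor bounds LED current (idealized numbers).
# Model the LED as a fixed forward-voltage drop Vf; as it heats, Vf sags.
# With a series resistor R, Ohm's law sets the current: I = (Vs - Vf) / R,
# so a change in Vf moves the current by only (delta Vf) / R -- a bounded
# shift -- instead of the near-unlimited jump a bare LED would see.

def led_current(v_supply, v_forward, r_series):
    """Current through an LED behind a series resistor (Ohm's law)."""
    return (v_supply - v_forward) / r_series

v_supply = 3.2   # hypothetical battery a bit above the LED's 3 V rating
r = 0.5          # hypothetical series resistance, ohms

cold = led_current(v_supply, 3.0, r)  # LED cold: Vf = 3.0 V
hot = led_current(v_supply, 2.9, r)   # LED warm: Vf sags to 2.9 V

print(f"cold: {cold:.2f} A, hot: {hot:.2f} A")
# cold: 0.40 A, hot: 0.60 A -- the current rises, but the resistor caps the swing
```

The key point is that without the resistor, R is effectively just a few milliohms of wiring, so the same 0.1 V sag in Vf would produce a huge current jump.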
I'm driving two 6.84V/700mA LEDs in parallel from a 7.5V 6xAA pack, using two 1 ohm/5W resistors, which I think is about right. I got a 2W rated resistor earlier, but when I did the math later it looked like the 2W unit would fry and burn if I used it. Should I play it safe and get a 10W rated resistor? The circuit seems to be producing around 4.8W, which is somewhat near the edge of the 5W units I got.
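A quick back-of-the-envelope check of those numbers (assuming the LED really clamps at its rated 6.84 V forward voltage and the pack holds 7.5 V under load; fresh alkalines sag, so treat these as rough figures):

```python
# Sanity check for the 7.5 V pack / 6.84 V LED / 1-ohm-per-branch setup above.
# Assumes an idealized LED that clamps at its rated forward voltage and a
# pack that holds 7.5 V under load.

v_pack = 7.5    # 6x AA pack
v_led = 6.84    # rated LED forward voltage
r = 1.0         # series resistor per LED branch, ohms

i = (v_pack - v_led) / r      # current through one branch
p_resistor = i ** 2 * r       # power burned in the series resistor
p_led = v_led * i             # power dissipated in the LED itself

print(f"branch current: {i * 1000:.0f} mA")
print(f"resistor power: {p_resistor:.2f} W")
print(f"LED power:      {p_led:.2f} W")
```

If I'm reading the numbers right, each resistor only burns I²R ≈ 0.44 W, well inside even a 2 W rating, while the ~4.8 W figure matches the LED's own dissipation (6.84 V × 0.7 A ≈ 4.8 W), which the LED handles, not the resistor.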