
### Current Limiting and Heat

#### trondyne

Oct 17, 2012
63
I hope this is not a stupid question... I have read through the LED info thread and didn't see this issue...

When working with LEDs over the last several months I've noticed that sometimes, but not always, a current limiting resistor in series with the LEDs can get VERY hot (values chosen with a resistor/LED calculator). I was advised to use a component with a higher wattage rating, but when I tried that, heat was still a big problem some of the time..

Is this possibly due to the type of resistor, or something else? The kind of heat present in some of these cases was totally unacceptable, as the components are often placed against heat-sensitive material.

Is running resistors in series the best way to power LEDs? So much heat seems odd, and suggests to me a lot of wasted energy.

Thanks,

Jim


#### KrisBlueNZ

##### Sadly passed away in 2015
Nov 28, 2011
8,393
Yes, series resistors do waste energy. You can calculate how much energy using the Power law: P = V I, where V is the voltage ACROSS the series resistor, and I is the current through it (which is equal to the LED current).
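As a quick sanity check, that power law can be sketched numerically (the supply voltage, forward voltage, and current below are illustrative values, not figures from this thread):

```python
# Power dissipated in a series resistor: P = V * I, where V is the
# voltage ACROSS the resistor and I is the current through it.

def resistor_power(supply_v, led_forward_v, led_current_a):
    """Voltage across the resistor times the current through it."""
    v_across = supply_v - led_forward_v
    return v_across * led_current_a

# Illustrative: 12 V supply, single 2.0 V LED, 20 mA.
# 10 V across the resistor at 20 mA -> 0.2 W wasted as heat.
p = resistor_power(12.0, 2.0, 0.020)
print(f"{p:.2f} W")  # 0.20 W
```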

Using a resistor with a higher power rating doesn't change the amount of energy wasted; it just gives you a larger surface area for heat dissipation, so the resistor doesn't get quite as hot.

You want to reduce the amount of power dissipated in the resistor. Assuming you need a particular LED current, the only way to do this is by reducing the voltage across the resistor, which you can do by reducing the voltage of the power source, or increasing the voltage dropped by the LEDs, by connecting several in series.

As a rule of thumb, I would aim to have at least 20% of the total voltage across the current limiting resistor. For example, if you have a series string of LEDs with a total forward voltage of 9V and a 10V power supply, leaving only 1V dropped across the limiting resistor (only 10% of the total voltage), then variations in LED forward voltage and power supply voltage will make a significant difference to the voltage remaining across the resistor, which is the factor that determines the current. So the current will not be regulated quite as well as it should be.
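Kris's 9 V string / 10 V supply example can be worked through numerically to show why a small resistor drop regulates poorly (the 50-ohm value below is illustrative, sized for a 20 mA nominal current):

```python
# With only 1 V across the resistor, small supply or Vf variations
# swing the LED current dramatically.

def led_current(supply_v, string_v, resistor_ohms):
    """Current set by the voltage left across the series resistor."""
    return (supply_v - string_v) / resistor_ohms

r = 1.0 / 0.020  # 50 ohms, sized for 20 mA at the nominal voltages

nominal = led_current(10.0, 9.0, r)      # 20 mA as designed
high_supply = led_current(10.5, 9.0, r)  # a 5% supply rise...
print(f"{nominal*1000:.0f} mA -> {high_supply*1000:.0f} mA")  # 20 mA -> 30 mA
```

A 5% change in supply voltage becomes a 50% change in LED current, which is exactly the poor regulation the 20% rule of thumb guards against.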

#### CocaCola

Apr 7, 2012
3,635
One way to reduce resistor heat is to use 'dummy' LEDs painted black... Yes, it's a 'sloppy' hack, but it will reduce heat overall even though you are simply shifting some of it to the dummy LEDs...

#### KJ6EAD

Aug 13, 2011
1,114
Kris's percentage rule of thumb is a good one, though I sometimes go down to 10% when using a well-regulated switch-mode power supply. Following that rule, as the series string gets longer (and the supply voltage rises with it), the voltage across the resistor and consequently the wattage it dissipates also increase. This is why, for longer strings (over 12V), I recommend using a current regulator such as an LM317 configured as a constant-current source. The voltage drop across the regulator can be less (as a percentage) than it would be with a resistor, and you can use a tabbed regulator with a heat sink if necessary.
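For reference, the standard LM317 constant-current hookup puts a programming resistor between the OUT and ADJ pins; the regulator maintains its (typically) 1.25 V reference across that resistor, so I = 1.25 / R. A quick sketch for an illustrative 20 mA target:

```python
# LM317 constant-current source: I = Vref / R, with Vref ~= 1.25 V
# held between the OUT and ADJ pins across the programming resistor.

LM317_VREF = 1.25  # volts, typical reference voltage

def lm317_set_resistor(target_current_a):
    """Programming resistor for a desired output current."""
    return LM317_VREF / target_current_a

def lm317_resistor_power(target_current_a):
    """Power dissipated in the programming resistor itself."""
    return LM317_VREF * target_current_a

r = lm317_set_resistor(0.020)    # 62.5 ohms for 20 mA
p = lm317_resistor_power(0.020)  # only 25 mW in the resistor
print(f"{r:.1f} ohms, {p*1000:.0f} mW")
```

The regulator itself still dissipates its own drop times the current, which is why the tab and heat sink matter for longer strings.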


#### trondyne

Oct 17, 2012
63
Thanks for the thoughtful responses.. I am unsure what to do and will have to think about it.. I will normally be working with 5V now and I hope there won't be any heat issues..

This started a while back when I had ordered a bunch of LEDs and for the hell of it I ordered two of the same types pre-wired for 12V.. So after I received the LEDs I went about wiring up the discrete LEDs for 12V.. And wow did the resistors get hot... Enough of those going and you could make a small EZ Bake oven... and wiring directly to the LEDs I would think would get the LEDs hot enough to shorten their life..

So I took a look at the pre-wired LEDs, supposedly the same type: they had very small resistors connected directly to the LED and wire, covered with heat shrink.. I applied 12V power, and after running them for a few minutes the resistors were not hot at all.. Now I never took the heat shrink off to verify what was under there, but it seemed to be just a very small (1/8 watt?) resistor.. This baffled me, as I had tried one resistor after the next, then two resistors in series, each half the value, and they were still hot... So I ended up just using the pre-wired ones and never had a heat issue... I have no idea what was really going on there... I thought there must be something I was missing....


#### KJ6EAD

Aug 13, 2011
1,114
Did you compare the operating current of the two setups?

#### trondyne

Oct 17, 2012
63
Did you compare the operating current of the two setups?

Sadly no...just learning now how to do that. But why no heat? I hate mysteries like this...

Recently, when playing around with different circuits, I had a very low value resistor (maybe 2.7 Ohms) connected to 5V with a half-watt LED, and it got very hot.. I wouldn't have expected that either, with 5V, such a low value resistor, and a higher-consumption LED.. Maybe because the R value was too low?

If I run into a problem as I go forward I'll post the particulars here..


#### KrisBlueNZ

##### Sadly passed away in 2015
Nov 28, 2011
8,393
Ignoring differences in heat dissipation efficiency, the temperature of the series resistor (above ambient) is proportional to the power it dissipates, which is the product of the voltage across it and the current through it. In your case with the tiny resistor that doesn't get hot, most likely that's because it's running at a low current, with (presumably) an efficient LED. That's why KJ6EAD suggested measuring the current. That's probably the solution to your mystery. The designer made the resistor a relatively high value, to limit the current.

The laws of physics here are pretty straightforward. Power dissipated in a resistance is equal to the voltage across that resistance multiplied by the current through it. Temperature above ambient is equal to power multiplied by thermal resistance to ambient. Thermal resistance is measured in degrees per watt, and semiconductor packages and heatsinks usually specify it.
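Those two relationships can be put side by side numerically. The thermal resistance figure below is an assumed, illustrative value for a small axial resistor in free air, not a number from any datasheet:

```python
# Temperature rise above ambient = power dissipated * thermal resistance,
# where thermal resistance is in degrees C per watt.

def temp_rise_c(power_w, theta_c_per_w):
    """Steady-state temperature rise above ambient."""
    return power_w * theta_c_per_w

# Illustrative: 0.25 W dissipated in a part with an assumed
# thermal resistance of 150 C/W to ambient.
rise = temp_rise_c(0.25, 150.0)
print(f"{rise:.1f} C above ambient")  # 37.5 C above ambient
```

The same quarter watt in a package with half the thermal resistance would run at half the rise, which is why a physically larger resistor feels cooler even though it wastes the same power.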


#### trondyne

Oct 17, 2012
63
Ignoring differences in heat dissipation efficiency, the temperature of the series resistor (above ambient) is proportional to the power it dissipates, which is the product of the voltage across it and the current through it. In your case with the tiny resistor that doesn't get hot, most likely that's because it's running at a low current, with (presumably) an efficient LED. That's why KJ6EAD suggested measuring the current. That's probably the solution to your mystery. The designer made the resistor a relatively high value, to limit the current.

The laws of physics here are pretty straightforward. Power dissipated in a resistance is equal to the voltage across that resistance multiplied by the current through it. Temperature above ambient is equal to power multiplied by thermal resistance to ambient. Thermal resistance is measured in degrees per watt, and semiconductor packages and heatsinks usually specify it.

Thanks yes this makes sense... Initially I was thinking about this in reverse... that less resistance would be cooler but no...

#### KrisBlueNZ

##### Sadly passed away in 2015
Nov 28, 2011
8,393
There ARE some cases when a lower resistance will run cooler. It depends on the circuit.

For example, if you connect a power supply to a light bulb and insert a low-value resistor in series, a lower-value resistor will run cooler. This is because the resistor value doesn't affect the current much (the light bulb is the main component that limits the current); the resistor value mostly affects the voltage across the resistor, and a lower resistance will drop less voltage and will therefore dissipate less power.
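A quick sketch of why, assuming the bulb pins the current at a roughly fixed value (the 0.5 A and resistor values are illustrative):

```python
# When something else fixes the current, the resistor's dissipation
# scales with its own value: P = I^2 * R. Smaller R -> cooler resistor.

def series_resistor_power(current_a, resistor_ohms):
    """Power in a series resistor at a given (externally fixed) current."""
    return current_a ** 2 * resistor_ohms

# Illustrative: a bulb holding the loop current near 0.5 A.
p_10ohm = series_resistor_power(0.5, 10.0)  # 2.5 W
p_1ohm = series_resistor_power(0.5, 1.0)    # 0.25 W
print(f"{p_10ohm} W vs {p_1ohm} W")
```

With an LED and no other current limit, the opposite happens: the resistor itself sets the current, so lowering its value raises the current and the heat.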

The important thing to remember is that heat is proportional to power, which is equal to the product of the voltage across, and the current through.

#### BobK

Jan 5, 2010
7,682
What is the forward voltage and operating current of the LED?

Bob

#### trondyne

Oct 17, 2012
63
There ARE some cases when a lower resistance will run cooler. It depends on the circuit.

For example, if you connect a power supply to a light bulb and insert a low-value resistor in series, a lower resistor will run cooler. This is because the resistor value doesn't affect the current much (the light bulb is the main component that limits the current); the resistor value mostly affects the voltage across the resistor, and a lower resistance will drop less voltage and will therefore dissipate less power.

The important thing to remember is that heat is proportional to power, which is equal to the product of the voltage across, and the current through.

Thanks for that Kris--the math shows the way....

#### trondyne

Oct 17, 2012
63
What is the forward voltage and operating current of the LED?

Bob

It's been a while since the initial issue--I think it was a 2.1V and 25ma LED.

#### BobK

Jan 5, 2010
7,682
That is not a 1/2 Watt LED. That would be about 53 mW. And dropping the voltage from 5V should have produced no noticeable heat. If you connected this to 5V with a 2.7 Ohm resistor, I would have expected smoke release.
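Checking those figures with Bob's numbers (treating the LED as a fixed 2.1 V drop, which ignores its dynamic resistance, so the current is only a rough estimate):

```python
# A 2.1 V, 25 mA LED dissipates about 53 mW, nowhere near half a watt.
led_power = 2.1 * 0.025  # ~0.0525 W

# On 5 V through only 2.7 ohms, the naive current estimate is about an
# amp -- roughly forty times the 25 mA rating, hence the expected smoke.
naive_current = (5.0 - 2.1) / 2.7  # ~1.07 A
print(f"{led_power*1000:.1f} mW, {naive_current:.2f} A")
```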

Bob

#### trondyne

Oct 17, 2012
63
That is not a 1/2 Watt LED. That would be about 53 mW. And dropping the voltage from 5V should have produced no noticeable heat. If you connected this to 5V with a 2.7 Ohm resistor, I would have expected smoke release.

Bob
No, the instance in question did not involve a 1/2 watt LED or a 5V supply, as I related earlier... The half-watt LED was the one with 5V and the 2.7 Ohm resistor; the 25mA LED was with 12V, and I don't recall that resistor, maybe 68 Ohms or higher.

The half-watters take about 120mA, 150mA peak..


#### BobK

Jan 5, 2010
7,682
Well, I was asking about the one that was producing excess heat, not the one that was not!

So, if you are using 5 V and the forward voltage is 2.7 V, then 2.3 V / 2.7 Ohms gives you 0.85 A. To get 120 mA, the resistor should be 2.3 / 0.120 = 19 Ohms. It will then dissipate about 0.28 W.
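Bob's arithmetic, reproduced step by step (the 2.7 V forward voltage is his assumption for the half-watt LED):

```python
# 5 V supply, assumed Vf = 2.7 V for the half-watt LED.
v_drop = 5.0 - 2.7            # 2.3 V left across the resistor

i_with_2r7 = v_drop / 2.7     # ~0.85 A with the 2.7 ohm resistor -- far too much
r_for_120ma = v_drop / 0.120  # ~19 ohms to hit the 120 mA rating
p_resistor = v_drop * 0.120   # ~0.28 W dissipated in that resistor

print(f"{i_with_2r7:.2f} A, {r_for_120ma:.0f} ohms, {p_resistor:.2f} W")
```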

Bob
