Maker Pro

LED on a DMX decoder help

Malc

Joined: Jul 29, 2014
Messages: 3
Hi all!

I'm having a bit of an issue with a system I am trying to implement. I am using a DMX controller which I have used many times before with ribbon LED strips, but this time I need to use individual "star board" mounted red, green, blue and white LEDs. My PSU has a 21.6V output and a maximum of 1.2A at 24V (it is adjustable but will only turn down to 21.6V), and I have strings of six of each colour in series connected to the output of the DMX controller.

The problem is that the green, blue and white LEDs all have a forward current of 280mA and a forward voltage of 3.0 - 3.4V, but the red has a voltage range of 2.0 - 2.6V. I connected a 22 ohm 2W resistor to the red string and it gets VERY hot. I have also connected two 22 ohm resistors in series and, although not as hot, it is still unbearable to the touch. Can you advise me what would be best?

Working it out, I think I need a 39 ohm 3W resistor in series with the string of six reds. I haven't got any to hand, but is there a better solution before I take my hard-earned money and turn it into another small space heater?

Thanks
Malc
 

(*steve*)

¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd
Moderator
Joined: Jan 21, 2010
Messages: 25,510
OK, so the voltage is 21.6V, you want a current of 0.28A, and you have 6 LEDs with a Vf of (say) 2.2V.

You need a resistor of (21.6 - (6 * 2.2)) / 0.28 ohms. That's 30 ohms. A 39 ohm resistor should be OK.

The power dissipated in the resistor is I²R watts. This is 0.28 * 0.28 * 30 = 2.35W. I would get a 5W 39 ohm resistor.

In addition I would measure the actual current to make sure it's OK, and I would ensure those LEDs are on heatsinks.
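
If it helps to see the arithmetic in one place, here it is as a small Python sketch (the 2.2V Vf and 0.28A target are my assumptions above, not measured values):

# Series resistor for a string of LEDs (sketch, assumed values as above)
supply_v = 21.6      # power supply output (V)
vf = 2.2             # assumed forward voltage per red LED (V)
n_leds = 6           # LEDs in series
i_target = 0.28      # target current (A)

r = (supply_v - n_leds * vf) / i_target    # required resistance (ohms)
p = i_target ** 2 * r                      # power dissipated in the resistor (W)
print(f"R = {r:.0f} ohms, P = {p:.2f} W")  # R = 30 ohms, P = 2.35 W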

The amount of power dissipated in the resistor is the same for a given current, whatever the resistor's power rating; the fact that a smaller resistor reaches a higher surface temperature doesn't mean there's more power. A 22 ohm resistor in this case will allow a current of about 0.38A. At this current it will dissipate about 3.2W. A 22 ohm 100W resistor would still dissipate the same power, but might hardly feel warm at all. A 22 ohm 1/4W resistor would burst into flame. A 3W resistor might get to a surface temperature in excess of 250°C (resistors can be specced for very high surface temperatures).

Placing two 22 ohm resistors in series would drop the current to about 190mA (0.19A) and the total dissipation would be 1.6W (or 0.8W per resistor). If these resistors are 3W resistors, and you think they're too hot when dissipating about 1/4 of their rated power then one of the following is true:
  1. You're too sensitive
  2. A higher current is flowing than I have calculated
  3. The resistors are a style requiring a heatsink that you haven't provided (unusual for such low power devices)
If it turns out to be (2), then maybe the Vf is less than 2.2V (I assumed 2.2V for my calculations), or your voltage is more than 21.6V, or you've connected things up wrong (resistors or LEDs in parallel?).
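
Going back to the 22 ohm cases above, here are the same sums as a quick Python sketch (again assuming Vf is about 2.2V per red LED):

# Current and dissipation for a given series resistance (sketch, assumed Vf)
supply_v = 21.6
vf = 2.2
n_leds = 6

def current_and_power(r_ohms):
    i = (supply_v - n_leds * vf) / r_ohms   # current through the string (A)
    return i, i ** 2 * r_ohms               # (amps, watts in the resistor)

print(current_and_power(22))       # about 0.38 A and 3.2 W
print(current_and_power(22 + 22))  # about 0.19 A and 1.6 W (0.8 W per resistor)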

The best way to measure Vf is to connect the string up and measure the voltage across the resistor. This allows you to calculate the current, and knowing the power supply voltage you can then determine the voltage across the LEDs, and by division, the voltage across each LED. If you find that you get the following:

power supply voltage = 22V
resistance = 38 ohms
voltage across the resistor = 9.1V

We can calculate:

current = 9.1 / 38 = 0.24 A

voltage across the LEDs = 22 - 9.1 = 12.9V

Voltage across a single LED = 12.9 / 6 = 2.15V​
From that we know that Vf is about 2.15V at 240mA, so if we were aiming for 200mA, we should use a resistor of the value:

(22 - (6 * 2.15)) / 0.2 = 45.5 ohms (we would probably use 47 ohms).
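
The same measurement-based method as a Python sketch, using the hypothetical figures above:

# Back-calculate Vf from a measured resistor voltage (sketch, example figures)
supply_v = 22.0     # power supply voltage (V)
r = 38.0            # resistor actually fitted (ohms)
v_resistor = 9.1    # measured voltage across the resistor (V)
n_leds = 6

i = v_resistor / r                     # about 0.24 A
vf = (supply_v - v_resistor) / n_leds  # about 2.15 V per LED

i_target = 0.2                               # new target current (A)
r_new = (supply_v - n_leds * vf) / i_target  # about 45.5 ohms (use 47 ohms)
print(f"I = {i:.2f} A, Vf = {vf:.2f} V, new R = {r_new:.1f} ohms")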
I hope this helps a little.
 

Malc

Joined: Jul 29, 2014
Messages: 3
Wow, thanks *steve*, that's very helpful! I particularly like the "1. You're too sensitive" point :) I haven't been able to get to it today, but I shall make some measurements for Vf tomorrow.

Thanks again, much appreciated
 

(*steve*)

¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd
Moderator
Joined: Jan 21, 2010
Messages: 25,510
No problems Malc. With knowledge comes power, or in this case the ability to calculate power. If you can confidently calculate the power being dissipated in a resistor then you can generally make a better call than "it feels too hot".

Some components should never get hot, others can get quite warm, and others can get almost alarmingly hot. Depending on its specification, a resistor can get very (alarmingly) hot, since there's no sensitive semiconductor junction inside that needs protecting. The finger test works better for transistors, because their normal operating temperature range excludes anything that would make your finger sizzle a little.
 

Malc

Joined: Jul 29, 2014
Messages: 3
Hi *steve*,

I measured the voltage at the power supply as 21.3V, and the voltage across the resistor as 7.5V.

Incidentally, I measured the voltage across a single LED as 2.0V.

So using the formula you shared I get:

7.5V / 39 ohm = 0.1923A

Voltage across the LEDs = 21.3V - 7.5V = 13.8V

Across a single LED = 13.8V / 6 = 2.3V

The specs supplied by the vendor (although I don't trust them!) say the LEDs should be 280mW; I am calculating 240mW here:

(21.3 - (6 x 2.3)) / 0.24 = 31.25 ohms

I have replaced the resistor I was using with a 3W 39 ohm wire-wound resistor, which still gets very hot. I haven't got a thermometer to measure it, so I am just using fingers! That, and the fact that when it was covered in heat shrink, the heat shrink became tacky.

With the setup I have at the moment, the red LEDs do not even get warm within their casing/heatsink, and I am happy with the amount of light output. So if I were to increase the wattage of the resistor but keep the same resistance, would the end result be a cooler component, but a less efficient circuit?

Here's the link to the resistor spec:

http://docs-europe.electrocomponents.com/webdocs/01dd/0900766b801dddfa.pdf


Cheers
 

Gryd3

Joined: Jun 25, 2014
Messages: 4,098
Unfortunately, the power consumed by the resistor will remain at 1.5W even if you put a 5W resistor in the circuit.
Things that can help are form factor and cooling. A small part dissipating 3W will usually be hotter than a larger part dissipating 3W (regardless of maximum rating). By increasing the surface area of the resistor, it will dissipate heat more easily and stay cooler. If the heat is a concern, you may need to position it so that it does not interfere with other components or the housing. Putting it in heatshrink tubing will merely assist in making it hotter ;)
You can use a heatsink to passively assist in keeping the part cooler.
If you want to reduce the amount of wasted power, you will need to either:
A) reduce the voltage drop across the resistor. (Using additional LEDs in series will result in a smaller voltage drop across the resistor... using 8 LEDs instead of 6 will reduce the power dissipated by the resistor to about 0.5W with a 15Ω resistor, as in the sketch after this list.)
B) use a switching power supply.
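
To put rough numbers on option A, here is a small Python sketch (assuming roughly 2.3V per red LED, taken from the measurements above, and about 190mA through the string):

# Resistor dissipation for 6 vs 8 LEDs in series (sketch, assumed Vf and current)
supply_v = 21.3
vf = 2.3       # assumed per-LED forward voltage (from the measurements above)
i = 0.19       # assumed string current (A)

def resistor_power(n_leds):
    v_r = supply_v - n_leds * vf   # voltage the resistor must drop (V)
    return v_r / i, v_r * i        # (ohms needed, watts dissipated)

print(resistor_power(6))   # about 39 ohms and 1.4 W
print(resistor_power(8))   # about 15 ohms and 0.55 W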
 

(*steve*)

¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd
Moderator
Joined: Jan 21, 2010
Messages: 25,510
Something is a little odd, because you say you have a supply voltage of 21.3V, a voltage across the resistor of 7.5V, and 2.0V across each LED.

There's another 1.8V to be accounted for.

One or more of the following is true:
  • The voltage drops under load (measure it under load)
  • Your measurements are inaccurate
  • Your meter has a very low impedance and is affecting the measurements (very unlikely)
  • Significant loss is occurring in the wires and connections
  • You have additional resistance unaccounted for (other than the note above)
  • You have 7 LEDs, not 6.
  • Your 2.0V measurement across a LED was done at a much lower current.
I presume you mean the LEDs are rated for 280mA (not mW) and you've got a calculated current of 240mA. That's fine. I would not run them at 100% power.

Do you have them on a heatsink? (yes -- good!)

Covering a resistor in heatshrink is not a good idea as you're essentially putting a nice warm blankie on it to keep it warm. It is designed to dissipate up to (say) 3W when in free air. If covered in heatshrink, you could easily halve the rate at which heat escapes.

Gryd3 makes the suggestion that you can add another LED to reduce the power dissipation in the resistor. This is true, but it will also reduce the effectiveness of the resistor in stabilising the current if the supply voltage varies. This also leads to an increased risk of thermal runaway (thermal runaway is where the current flowing through the LEDs heats them enough to reduce Vf, which allows more current, which heats them further to reduce Vf and allow more current, ... spiralling out of control until the LEDs become Dark Emitting Diodes (DED)).

If you check the resource on driving LEDs, you'll find that I recommend using a constant current source to drive high power LEDs (and these are high power LEDs). You're still going to have significant power dissipated, but you'll be kinder to the LEDs, and they'll thank you by living longer.
 

(*steve*)

¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd
Moderator
Joined: Jan 21, 2010
Messages: 25,510
Gryd3 said: "Unfortunately, the power consumed by the resistor will remain at 1.5W even if you put a 5W resistor in the circuit."

Yes, but a larger (higher wattage) resistor will run cooler because it has (generally speaking) a larger surface area with which to transfer heat to the ambient environment.
 