Maker Pro

5mm LED Voltage Help Needed

warhawk373
Joined: Dec 12, 2017 · Messages: 5
So I'm building my own TrackIR clip, which simply consists of three 5mm infrared LEDs hooked up to a power source. The device's effectiveness depends on how bright the LEDs are and how wide an angle they emit over. The problem is that it takes a couple of hours to build, so I don't want the LEDs to burn out quickly. I want to hook them up to USB so I don't have to worry about dying batteries making the LEDs dimmer. The website says the forward voltage is 1.2-1.4 V, but USB puts out 5 V. I have 33 ohm resistors because I was originally going to run them on AA batteries.

So after that explanation, how can I make the LEDs last a long time while still making them bright? I have been using a parallel circuit.

I hooked up one LED to a AA battery with and without a resistor. The one without the resistor could be picked up by the camera even when pointed 45 degrees AWAY from it, but the one WITH the resistor (the same LED) stopped being picked up at about a 90 degree angle to the camera.

P.S. I am a super noob when it comes to electronics. I just saw a tutorial on the web about making this and decided to go for it because my dad already has a soldering iron.

Also, here is the specific LED I am using. I read that I need the datasheet, but mine didn't come with one. https://www.amazon.com/Gikfun-Infra...TF8&qid=1513057096&sr=8-10&keywords=940nm+led
 

Harald Kapp
Moderator
Joined: Nov 17, 2011 · Messages: 13,653
An LED is a current-controlled device. The voltage across the LED depends on factors like temperature, type of LED, current, etc. You need to limit the current through the LED. Please see our resource.

the one without the resistor could be picked up by the camera even when looking 45 degrees AWAY from the camera, but the one WITH the resistor (same LED though) was not able to be picked up once it hit about a 90 degree angle on the camera.
The resistor limits the current and thus the brightness of the LED. Without a resistor the LED won't live long; it will die shining very brightly.

decided to go for it because my dad already has a soldering iron.
That's a good one ;)
 

warhawk373
Joined: Dec 12, 2017 · Messages: 5
An LED is a current-controlled device. The voltage across the LED depends on factors like temperature, type of LED, current, etc. You need to limit the current through the LED. Please see our resource.

That helps a lot. I'm still a little confused though.

So my LEDs are rated at 1.2-1.4 V forward voltage (Vf) with a current of 100 mA. So the equation to determine the required resistance for the three LEDs should be R = (5 - 3.9)/0.1 = 11, that is, (5 V from the USB minus the total Vf of the 3 LEDs) divided by the current rating for that type of LED.

Is all that correct? Does that mean I only need an 11 ohm resistor on each LED? If so, is it fine to use the 33 ohm resistors instead? When running on two 1.5 V AA batteries, the three 33 ohm resistors didn't make the lights too dim, and that power supply was 2 V less than the USB I plan on using. According to that resistance equation I shouldn't even have been using resistors; I should have needed a larger power supply.
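The arithmetic in the post can be sanity-checked with a short script (a sketch; the 1.3 V nominal forward voltage and 100 mA target are the figures quoted in the thread, not datasheet values):

```python
def series_resistor(v_supply, v_forward, n_leds, i_amps):
    """Dropper resistor for one series string of identical LEDs."""
    return (v_supply - n_leds * v_forward) / i_amps

# Three nominal 1.3 V LEDs in series on a 5 V USB supply, 100 mA target:
print(round(series_resistor(5.0, 1.3, 3, 0.100), 1))  # → 11.0 (ohms)
```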
 

Harald Kapp
Moderator
Joined: Nov 17, 2011 · Messages: 13,653
Does that mean I only need an 11ohm resistor on each LED?
Partly, not quite fully.
When calculating R = (5 - 3.9)/0.1 = 11 you assume the three LEDs are in series, whereby you make economical use of the available 5 V supply. In consequence you put one 11 Ω resistor in series with the string of 3 LEDs, not three 11 Ω resistors.
If you were to connect the LEDs in parallel, then you'd have to provide a separate resistor for each LED, at a much higher value, to drop more voltage.

See this schematic:
[Schematic image: upload_2017-12-12_8-34-43.png]
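The series-vs-parallel distinction above can be put into numbers (a sketch using the nominal 1.3 V / 100 mA figures from earlier in the thread):

```python
V_SUPPLY = 5.0   # USB bus voltage
V_F = 1.3        # nominal forward voltage per LED (middle of the 1.2-1.4 V spec)
I_LED = 0.100    # target current per LED, in amps

# Series: one resistor for the whole string of 3 LEDs.
r_series = (V_SUPPLY - 3 * V_F) / I_LED

# Parallel: one resistor per LED; each must drop the full excess voltage.
r_parallel = (V_SUPPLY - V_F) / I_LED

print(round(r_series), round(r_parallel))  # → 11 37
```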
 

warhawk373
Joined: Dec 12, 2017 · Messages: 5
Partly, not quite fully.
When calculating R = (5 - 3.9)/0.1 = 11 you assume the three LEDs are in series, whereby you make economical use of the available 5 V supply. In consequence you put one 11 Ω resistor in series with the string of 3 LEDs, not three 11 Ω resistors.
If you were to connect the LEDs in parallel, then you'd have to provide a separate resistor for each LED, at a much higher value, to drop more voltage.

So if I'm using a series circuit I use one 11 ohm resistor, but if I'm using a parallel circuit I use three 37 ohm resistors [(5 - 1.3)/0.1]?

That's where I was confused, because I couldn't understand whether the amps are evenly dispersed between the 3 LEDs in a parallel circuit. And from what I've read, a series circuit seemed to be the wrong way to go when linking LEDs, because the amps become unevenly distributed, causing one light to consume more or less power than the others.
 

Harald Kapp
Moderator
Joined: Nov 17, 2011 · Messages: 13,653
So if I'm using a series circuit I use one 11 ohm resistor, but if I'm using a parallel circuit I use three 37 ohm resistors [(5 - 1.3)/0.1]?
Exactly.

a series circuit seems to be the wrong way to go when linking LEDs because of the amps becoming unevenly distributed
On the contrary: in a series circuit the same current flows through each component. An uneven power distribution will result from slight differences in the parameters of the LEDs. You will probably not be able to notice this in your application.

The same reasoning applies to the parallel circuit with separate resistors. Each "leg" of the parallel circuit has the same voltage across it, so within each leg the equation I = (5 V - Vled)/R holds and the three paths are balanced.

the amps becoming unevenly distributed causing one light to consume more or less power than the others.
This happens when you wire the LEDs in parallel with only one series resistor:
[Schematic image: upload_2017-12-12_10-12-10.png]
Here the LED with the lowest forward voltage will draw the most current, leaving less for the other LEDs (with higher forward voltage), creating an imbalance in current through the LEDs and thus an imbalance in light output.
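Why the mismatch matters: an LED's current rises roughly exponentially with voltage, so even a small spread in forward voltage unbalances parallel LEDs that share one resistor. A back-of-the-envelope sketch (the 50 mV mismatch and the exponential slope are assumed illustration values, not from any datasheet):

```python
import math

N_VT = 0.05   # assumed exponential slope (n * Vt) of the diode curve, in volts

# Two "identical" LEDs whose forward voltages differ by just 50 mV.
# Wired in parallel they sit at the same node voltage, so the ratio of
# their currents is roughly exp(delta_vf / (n * Vt)):
delta_vf = 0.05
ratio = math.exp(delta_vf / N_VT)
print(round(ratio, 1))  # → 2.7  (the lower-Vf LED draws ~2.7x the current)
```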
 

warhawk373
Joined: Dec 12, 2017 · Messages: 5
In a series circuit the same current flows through each component. An uneven power distribution will result from slight differences in the parameters of the LEDs. You will probably not be able to notice this in your application.

Ohh, well I have a small electronics store here. I may go buy an 11ohm resistor and try out a series circuit instead. Thanks :)
 

Harald Kapp
Moderator
Joined: Nov 17, 2011 · Messages: 13,653
If you can't find an 11 Ω resistor, use 12 Ω. The value is not that critical here, but it should not be less than 10 Ω, to protect the LEDs from overcurrent.
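To see why 12 Ω is fine but going below 10 Ω is not (a sketch, again assuming the nominal 1.3 V forward voltage used earlier in the thread):

```python
def string_current_ma(r_ohms, v_supply=5.0, v_f=1.3, n_leds=3):
    """Current (mA) through n_leds series LEDs with one resistor r_ohms."""
    return (v_supply - n_leds * v_f) / r_ohms * 1000

print(round(string_current_ma(11)))  # → 100  (the design target, mA)
print(round(string_current_ma(12)))  # → 92   (slightly dimmer, safer)
print(round(string_current_ma(10)))  # → 110  (already over the target)
```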
 

Audioguru
Joined: Sep 24, 2016 · Messages: 3,656
That helps a lot. I'm still a little confused though.

So my LED's are rated at 1.2-1.4 Vf with a current of 100mA. So the equation to determine required resistance for the three LEDs should be R=(5-3.9)/0.1 = 11 (5 volts from the USB - total Vf of the 3 LEDs)/mA current rating for that type of LED.

Is all that correct? Does that mean I only need an 11ohm resistor on each LED? If so, is it fine to use the 33ohm resistors? When running on two 1.5v AA batteries the three 33ohm resistors didn't make the lights too dim and that power supply was 2v less than the USB I plan on using. According to that resistance equation I shouldn't have even been using resistors, I should have needed a larger supply of power.
These IR LEDs are very cheap, so their quality is poor. The specs say nothing about current, but an answer to a question on the product page says 10 mA (not 100 mA), and even at 10 mA, 10% of them reportedly burn out soon.
For 3 LEDs in series the resistor is (5 V - 3.6 V)/30 mA = 46.7 ohms. If the LEDs are 1.4 V, then with a 47 ohm resistor the current will be only (5 V - 4.2 V)/47 ohms = 17 mA, which is only 5.7 mA for each LED.

I think you need better LEDs that survive a much higher current. eBay and Amazon sell cheap Chinese crap.
 

warhawk373
Joined: Dec 12, 2017 · Messages: 5
These IR LEDs are very cheap, so their quality is poor. The specs say nothing about current, but an answer to a question on the product page says 10 mA (not 100 mA), and even at 10 mA, 10% of them reportedly burn out soon.
For 3 LEDs in series the resistor is (5 V - 3.6 V)/30 mA = 46.7 ohms. If the LEDs are 1.4 V, then with a 47 ohm resistor the current will be only (5 V - 4.2 V)/47 ohms = 17 mA, which is only 5.7 mA for each LED.

I think you need better LEDs that survive a much higher current. eBay and Amazon sell cheap Chinese crap.

Oh, I didn't see that. If that answer is correct, that would mean I need a 110 ohm resistor in a series circuit, wouldn't it? (5 - 3.9)/0.01

According to that LED tutorial, the mA stays the same as for a single LED when calculating the resistance for a series circuit.
 

Audioguru
Joined: Sep 24, 2016 · Messages: 3,656
Of course the mA stays the same as for a single LED when they are in a series circuit; they all carry the same current.
Your LEDs "might be" 1.3 V, but they are spec'd to be anywhere from 1.2 V to 1.4 V. Then your 110 ohm resistor, 3 LEDs in series and a 5 V supply produce a current anywhere from 7.3 mA to 12.7 mA, and the clip will have a low range.
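The 7.3-12.7 mA spread follows directly from the 1.2-1.4 V forward-voltage tolerance, which a quick sketch confirms:

```python
R = 110.0        # series resistor from the earlier post, in ohms
V_SUPPLY = 5.0   # USB

def string_current_ma(v_f):
    """Current (mA) through 3 series LEDs when each has forward voltage v_f."""
    return (V_SUPPLY - 3 * v_f) / R * 1000

i_max = string_current_ma(1.2)  # lowest Vf -> highest current
i_min = string_current_ma(1.4)  # highest Vf -> lowest current
print(round(i_min, 1), round(i_max, 1))  # → 7.3 12.7 (mA)
```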
 