Maker Pro

Understanding LEDs

Jon
I've picked up a couple of yellow LEDs at Radio Shack to tinker around with,
and I've got a few questions...

They are labeled 2.1 volt, 20mA. I recently built a 1.2V - 25V variable
voltage regulator, so I think I have the power supply that I need. At
first, I simply set the voltage on my voltage regulator to 2.1 volts, and
hooked up an LED. Naturally, it got fried in less than a minute. Too much
current. Then, I did a little math, and realized that if I hooked up two
220 ohm resistors in parallel, I could get about 19 mA at 2.1 volts, which
is pretty close to what the LED was labelled. I hooked it up, and it seems
to work.

I double checked everything before I hooked up the LED, and sure enough, I
was getting about 19 mA, as predicted. When I hook up the LED, though, I
get about 2 mA through the circuit. Also, when I measure the voltage across
the LED, I get about 1.9V. So, I have a couple of very basic questions.

1) When the package specifies a voltage (2.1 V in this case), does that
mean that the voltage drop across the LED should be 2.1 volts, or does it
mean that I should provide a 2.1 volt source, as I did in the example above?

2) When an LED is rated for 20 - 30 mA, does that mean that I can pull that
many amps through the LED, or should I set the current prior to hooking up
the LED, as I did in the example above.

3) Any good tutorials on using LEDs, and calculating the R values required
for proper usage? I guess that I'm hung up on hooking LEDs up in series
with resistors, and calculating the voltage drops.

Thanks!!!

-Jon
 
Michael A. Terrell
Jon said:
I've picked up a couple of yellow LEDs at Radio Shack to tinker around with,
and I've got a few questions...

They are labeled 2.1 volt, 20mA. I recently built a 1.2V - 25V variable
voltage regulator, so I think I have the power supply that I need. At
first, I simply set the voltage on my voltage regulator to 2.1 volts, and
hooked up an LED. Naturally, it got fried in less than a minute. Too much
current. Then, I did a little math, and realized that if I hooked up two
220 ohm resistors in parallel, I could get about 19 mA at 2.1 volts, which
is pretty close to what the LED was labelled. I hooked it up, and it seems
to work.

I double checked everything before I hooked up the LED, and sure enough, I
was getting about 19 mA, as predicted. When I hook up the LED, though, I
get about 2 mA through the circuit. Also, when I measure the voltage across
the LED, I get about 1.9V. So, I have a couple of very basic questions.

1) When the package specifies a voltage (2.1 V in this case), does that
mean that the voltage drop across the LED should be 2.1 volts, or does it
mean that I should provide a 2.1 volt source, as I did in the example above?

2) When an LED is rated for 20 - 30 mA, does that mean that I can pull that
many amps through the LED, or should I set the current prior to hooking up
the LED, as I did in the example above.

3) Any good tutorials on using LEDs, and calculating the R values required
for proper usage? I guess that I'm hung up on hooking LEDs up in series
with resistors, and calculating the voltage drops.

Thanks!!!

-Jon

You need a resistor in series with the LED to limit the current, or a
constant current source, not constant voltage. Take the rated voltage
of the LED, subtract it from the supply voltage, and use Ohm's law to
calculate the resistance needed for the desired current. You do not
want to push the current to the max rating, or the LED won't last
very long.

Try using Google. There are millions of circuits available online, if
you take a little time to look for them.
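
If you want to play with the numbers, a few lines of Python do the same
arithmetic (the 5V supply and 15mA target below are just example figures,
not anything from the original post):

# Series resistor for an LED: drop the difference between the supply
# voltage and the LED's rated forward voltage at the chosen current.
def led_resistor(v_supply, v_led, i_led):
    if v_supply <= v_led:
        raise ValueError("supply voltage must be higher than the LED voltage")
    return (v_supply - v_led) / i_led

# Example: 5V supply, 2.1V yellow LED, 15mA (comfortably under the 20mA rating)
r = led_resistor(5.0, 2.1, 0.015)
print("Use roughly %.0f ohms" % r)   # about 193 ohms; 220 ohms is the next standard value up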
 
Don Klipstein
I've picked up a couple of yellow LEDs at Radio Shack to tinker around
with, and I've got a few questions...

They are labeled 2.1 volt, 20mA. I recently built a 1.2V - 25V variable
voltage regulator, so I think I have the power supply that I need. At
first, I simply set the voltage on my voltage regulator to 2.1 volts, and
hooked up an LED. Naturally, it got fried in less than a minute. Too
much current.

LEDs often do not do well when connected directly to a voltage source.
The current drawn by them could be almost anything, and LEDs tend to
become more conductive as the temperature rises.
The normal way to use LEDs is to use a dropping resistor and a voltage
higher than that of the LED.
Then, I did a little math, and realized that if I hooked up two
220 ohm resistors in parallel, I could get about 19 mA at 2.1 volts,
which is pretty close to what the LED was labelled. I hooked it up, and
it seems to work.

I double checked everything before I hooked up the LED, and sure enough,
I was getting about 19 mA, as predicted. When I hook up the LED, though,
I get about 2 mA through the circuit. Also, when I measure the voltage
across the LED, I get about 1.9V.

With a 2.1 volt supply and the LED dropping 1.9V, there is only 0.2 volts
across the resistor.
Assuming the LED drops 2.1V at 19 mA and you need another 2.1V to push
19 mA through the resistor, you need 4.2V to push 19 mA through the
series combination.
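
In Python the same arithmetic looks like this (a rough sketch; the 110
ohms assumes your two 220 ohm resistors in parallel):

# Supply voltage needed to push the target current through the LED
# plus its series resistor: V_supply = V_led + I * R
v_led = 2.1             # assumed LED drop at the target current, volts
i_target = 0.019        # 19 mA
r_series = 220.0 / 2    # two 220 ohm resistors in parallel, about 110 ohms
v_supply = v_led + i_target * r_series
print("Supply needed: about %.1f V" % v_supply)   # roughly 4.2 V, as above
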
So, I have a couple of very basic questions.

1) When the package specifies a voltage (2.1 V in this case), does that
mean that the voltage drop across the LED should be 2.1 volts, or does it
mean that I should provide a 2.1 volt source, as I did in the example above?

That is the typical voltage across the LED at the specified current. But
apply this voltage, and the current may be significantly different and may
vary drastically with temperature.
2) When an LED is rated for 20 - 30 mA, does that mean that I can pull
that many amps through the LED, or should I set the current prior to
hooking up the LED, as I did in the example above.

That is how much current can safely flow through the LED.
3) Any good tutorials on using LEDs, and calculating the R values
required for proper usage? I guess that I'm hung up on hooking LEDs up
in series with resistors, and calculating the voltage drops.

I have one - http://www.misty.com/~don/ledd.html

- Don Klipstein ([email protected], http://www.misty.com/~don/ledx.html)
 
Gareth
Jon said:
I've picked up a couple of yellow LEDs at Radio Shack to tinker around with,
and I've got a few questions...

They are labeled 2.1 volt, 20mA. I recently built a 1.2V - 25V variable
voltage regulator, so I think I have the power supply that I need. At
first, I simply set the voltage on my voltage regulator to 2.1 volts, and
hooked up an LED. Naturally, it got fried in less than a minute. Too much
current. Then, I did a little math, and realized that if I hooked up two
220 ohm resistors in parallel, I could get about 19 mA at 2.1 volts, which
is pretty close to what the LED was labelled. I hooked it up, and it seems
to work.

I double checked everything before I hooked up the LED, and sure enough, I
was getting about 19 mA, as predicted. When I hook up the LED, though, I
get about 2 mA through the circuit. Also, when I measure the voltage across
the LED, I get about 1.9V. So, I have a couple of very basic questions.

1) When the package specifies a voltage (2.1 V in this case), does that
mean that the voltage drop across the LED should be 2.1 volts, or does it
mean that I should provide a 2.1 volt source, as I did in the example above?

It means that the voltage drop across the LED will be 2.1V when a
particular current is flowing through it, for example it may say

Vf = 2.1V @ 10mA.

which means that if you put 10mA through the LED you should see 2.1V
across it, but it will vary slightly with temperature, so don't worry
if you get a slightly different answer. If the current is lower the
voltage will be slightly lower too, which is probably why you measured
only 1.9V at 2 mA.
2) When an LED is rated for 20 - 30 mA, does that mean that I can pull that
many amps through the LED, or should I set the current prior to hooking up
the LED, as I did in the example above.

That is probably the maximum current that you should put through the LED.
3) Any good tutorials on using LEDs, and calculating the R values required
for proper usage? I guess that I'm hung up on hooking LEDs up in series
with resistors, and calculating the voltage drops.

Probably, but there isn't really that much to it.

When you connect your LED and resistor in series the supply voltage is
split between the two components (as it is for any components connected
in series), so:

V_supply = V_resistor + V_LED

You know that the LED has a forward voltage drop of 2.1V because it says
so on the package, which means that the voltage across the resistor will be

V_resistor = V_supply - 2.1

You want a current of less than 20mA for the LED; let's say 10mA, which
should be quite bright and well below the maximum current.

so to calculate the resistor value you have:

R = V_resistor/I

where R is resistor value in k Ohms, and I is current in mA

R = (V_supply - 2.1)/10

You can see that you need a supply voltage higher than 2.1V for this to
work; otherwise you would need a resistor of zero ohms or less, there
would be nothing to limit the current and, as you found, the LED would
be damaged.

Let's say 6V, just as an example:

R = (6-2.1)/10 = 0.39k Ohms = 390 Ohms

Or if you want to use one of the 220 Ohm resistors that you have:

V_supply = 10*0.22 + 2.1

= 4.3V, so set the supply voltage to 4.3V
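
If you prefer to let a computer do the sums, here is a rough Python
sketch of the same two calculations (the 2.1V forward drop and 10mA
defaults are just the example values above):

# R = (V_supply - V_LED) / I : the resistor drops whatever the LED doesn't.
def resistor_for_led(v_supply, v_led=2.1, i_led=0.010):
    v_resistor = v_supply - v_led
    if v_resistor <= 0:
        raise ValueError("need a supply higher than the LED's forward voltage")
    return v_resistor / i_led

# The other way round: what supply do I need for a resistor I already have?
def supply_for_resistor(r, v_led=2.1, i_led=0.010):
    return v_led + i_led * r

print("%.0f ohms for a 6V supply at 10mA" % resistor_for_led(6.0))
print("%.1f V supply for a 220 ohm resistor at 10mA" % supply_for_resistor(220.0))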

--
 
Peter Bennett
I've picked up a couple of yellow LEDs at Radio Shack to tinker around with,
and I've got a few questions...

1) When the package specifies a voltage (2.1 V in this case), does that
mean that the voltage drop across the LED should be 2.1 volts, or does it
mean that I should provide a 2.1 volt source, as I did in the example above?

The voltage across an operating LED is determined primarily by the
chemistry of the LED (and the chemistry also determines the colour),
so you should _never_ connect an LED to a constant voltage source
(like a regulated power supply).
2) When an LED is rated for 20 - 30 mA, does that mean that I can pull that
many amps through the LED, or should I set the current prior to hooking up
the LED, as I did in the example above.

Generally, that would be the _maximum_ rated current for the LED -
greater currents are likely to reduce the LED's life. It is wise to
operate LEDs at somewhat less than their maximum rating - I find most
LEDs are bright enough at 8 - 10 mA.
3) Any good tutorials on using LEDs, and calculating the R values required
for proper usage? I guess that I'm hung up on hooking LEDs up in series
with resistors, and calculating the voltage drops.

The normal (and safe) way to operate an LED is to use a supply voltage
somewhat higher than the LED's rated voltage, and use a resistor in
series to limit the current to the desired value.

Normal red, yellow and green LEDs are 1.7, 1.9 and 2.1 volts, or so -
I'm lazy, so I just say they are all 2 volts, and, as I said, 10 mA
seems to be a nice current - well within ratings for common LEDs. To
determine the resistor value, I subtract the LED voltage from the
supply voltage, and use Ohm's Law: R = E/I. For 5 volts, the resistor
is then (5 - 2)/.01 = 300 ohms. The resistance is not very critical -
a higher value will make the LED a little dimmer, while a lower value
will make the LED brighter (unless you exceed the rated maximum
current.)
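
If you want a little table of values, a few lines of Python will do it
(the supply voltages below are just common examples):

# "All LEDs are about 2 volts" and 10 mA is a nice current, so
# R = (V_supply - 2) / 0.01 for a few common supply voltages.
V_LED = 2.0
I_LED = 0.010
for v_supply in (3.3, 5.0, 9.0, 12.0):
    r = (v_supply - V_LED) / I_LED
    print("%4.1f V supply -> %4.0f ohms" % (v_supply, r))
# The 5 V row gives the 300 ohms worked out above; round to whatever
# standard value you have on hand.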


--
Peter Bennett, VE7CEI
peterbb (at) interchange.ubc.ca
new newsgroup users info : http://vancouver-webpages.com/nnq
GPS and NMEA info: http://vancouver-webpages.com/peter
Vancouver Power Squadron: http://vancouver.powersquadron.ca
 
John Popelish
Jon said:
I've picked up a couple of yellow LEDs at Radio Shack to tinker around with,
and I've got a few questions...

They are labeled 2.1 volt, 20mA. I recently built a 1.2V - 25V variable
voltage regulator, so I think I have the power supply that I need. At
first, I simply set the voltage on my voltage regulator to 2.1 volts, and
hooked up an LED. Naturally, it got fried in less than a minute. Too much
current. Then, I did a little math, and realized that if I hooked up two
220 ohm resistors in parallel, I could get about 19 mA at 2.1 volts, which
is pretty close to what the LED was labelled. I hooked it up, and it seems
to work.

I double checked everything before I hooked up the LED, and sure enough, I
was getting about 19 mA, as predicted. When I hook up the LED, though, I
get about 2 mA through the circuit. Also, when I measure the voltage across
the LED, I get about 1.9V. So, I have a couple of very basic questions.

1) When the package specifies a voltage (2.1 V in this case), does that
mean that the voltage drop across the LED should be 2.1 volts, or does it
mean that I should provide a 2.1 volt source, as I did in the example above?

The forward voltage of any diode at any particular forward current is
hard to predict exactly. Slight changes, like batch-to-batch
variations and temperature, cause the same voltage drop to occur for
quite a range of possible currents. The 2.1 volt spec is a typical
voltage drop at the full rated 20 milliamps of forward current. The
way to find the actual forward voltage is to pass the specified
current and measure the voltage. But if you apply a stiff (well
regulated) voltage, you will find that about a 25 millivolt change in
total voltage will double or halve the current. Slight errors in
voltage correspond to big swings in current. Which is another way of
saying that big swings in current produce only slight changes in
voltage drop. And as the LED warms up, the current climbs rapidly
with a constant voltage applied. These problems are what make it so
hard to parallel any pair of diodes (have them share the same voltage)
and expect the current to split equally between them.
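
To get a feel for how steep that is, here is a rough numerical sketch in
Python. It assumes only the roughly 25 millivolts per doubling mentioned
above, with the 2.1 volt, 20 milliamp package rating as the nominal
point; a real LED will differ, but the shape is right:

# Toy model: forward current doubles for every ~25 mV of extra voltage,
# anchored at the nominal 20 mA / 2.1 V point from the package.
I_NOM = 0.020           # amps at the nominal point
V_NOM = 2.1             # volts at the nominal point
V_PER_DOUBLING = 0.025  # volts of change that doubles (or halves) the current

def led_current(v_forward):
    return I_NOM * 2.0 ** ((v_forward - V_NOM) / V_PER_DOUBLING)

for v in (2.00, 2.05, 2.10, 2.15, 2.20):
    print("%.2f V -> %6.1f mA" % (v, led_current(v) * 1000))
# A tenth of a volt either way changes the current by a factor of 16.
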
2) When an LED is rated for 20 - 30 mA, does that mean that I can pull that
many amps through the LED, or should I set the current prior to hooking up
the LED, as I did in the example above.

No matter what you do, the current must be kept below the maximum
rating or the diode will be damaged by heat. If you apply just a tiny
bit of extra voltage, the LED will happily pass way too much current and
melt down.
3) Any good tutorials on using LEDs, and calculating the R values required
for proper usage? I guess that I'm hung up on hooking LEDs up in series
with resistors, and calculating the voltage drops.

The rule is quite simple: you assume the forward voltage is about what
the data sheet says, and use a resistor to waste the extra voltage
while passing the desired current at that (resistive) voltage drop. As
long as that resistor has a lot more ohms than the incremental
resistance of the diode, it will dominate the current regulation.

So let's say that the LED will go from 10 to 20 ma as the forward
voltage goes from 2.1 to 2.140 volts. That implies an incremental
resistance of (2.140 - 2.1)/(20 ma - 10 ma) = 4 ohms. So if the
resistor is at least about 10 times this resistance, it will
effectively override the current-hogging effects of the diode.

If you want to drive this diode from a 5 volt supply, you assume the
resistor has to waste the extra 5 - 2.1 = 2.9 volts. You choose a 10 ma
(.01 amp) operating point (applying a generic engineering safety
factor of 2 to the 20 ma rating), so the resistor has to be about
2.9/.01 = 290 ohms. This is a lot higher than the minimum of 40 ohms or
so calculated above, so the current should be quite stable. Then you
select the nearest standard value, 270 or 300 ohms. Voila.
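
Here are the same numbers in a quick Python sketch, just redoing the
arithmetic above:

# Incremental resistance of the LED around its operating point
r_inc = (2.140 - 2.1) / (0.020 - 0.010)      # about 4 ohms

# Series resistor for a 5 volt supply at the 10 mA operating point
v_supply, v_led, i_led = 5.0, 2.1, 0.010
r_series = (v_supply - v_led) / i_led        # about 290 ohms

print("LED incremental resistance: %.0f ohms" % r_inc)
print("Series resistor: %.0f ohms" % r_series)
print("Ratio: %.1f (well above 10, so the resistor sets the current)"
      % (r_series / r_inc))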
 
Jon
Thanks for all the info. It seems so easy after reading all of your
explanations that I feel foolish for posting the question in the first
place.

I thought your explanation was particularly enlightening, John.

Jon
 
JD Steffen
Okay, so reading through the replies to this post I have learned the
proper way to determine what value current-limiting resistor to use in
an LED circuit. But I do not understand one thing. I always thought
that a circuit would consume only as much current as it needed, up to
what was available from the supply (until it blows a fuse or trips a
breaker). So you can see where I am a little confused. How does an LED
overconsume current to the point where it blows up? Is it some property
of semiconductors that I am ignorant of? I have no formal education in
electronics; it's more of a passing interest to me. So please excuse me
if this is one of THOSE questions ;)

Thanks for any input any of you can provide!

JD
 
John Popelish
Jon said:
Thanks for all the info. It seems so easy after reading all of your
explanations that I feel foolish for posting the question in the first
place.

I thought your explanation was particularly enlightening, John.

Thank you. You have learned something fundamental about diodes of all
kinds.
 
John Popelish
JD said:
Okay, so reading through the replies to this post I have learned the
proper way to determine what value current-limiting resistor to use in
an LED circuit. But I do not understand one thing. I always thought
that a circuit would consume only as much current as it needed, up to
what was available from the supply (until it blows a fuse or trips a
breaker). So you can see where I am a little confused. How does an LED
overconsume current to the point where it blows up? Is it some property
of semiconductors that I am ignorant of? I have no formal education in
electronics; it's more of a passing interest to me. So please excuse me
if this is one of THOSE questions ;)

Thanks for any input any of you can provide!

Let's take a different case. Let's say you want to test a fuse to find
out how long it can stand various currents without melting. You have
an adjustable voltage supply rated for a current well above the fuse
rating. How would you go about testing a 100 ma fuse to find out how
it reacts to 200 ma using this supply? The fuse has a cold resistance
that you can measure, but changes resistance (increases somewhat) when
it heats up.

This problem has a lot to do with driving LEDs, in that the device
being tested has a low resistance, and a rating based on current.

No matter what voltage you set the supply to, the current will not
stay at the desired 200 ma, because of the increase in resistance as
the fuse heats up. The current in any diode will go up as the device
heats up.

You will have to add some current regulating mechanism to the circuit
and provide enough extra voltage to let this mechanism operate in
order to get a nearly constant desired current through the fuse during
the test.

Do you see the similarity between this test circuit and the one used
to operate LEDs?

Some devices are voltage operated and draw an appropriate current if
you apply the appropriate voltage. Integrated logic chips, opamps and
resistive heaters fall into this category. Other devices deal
primarily in the realm of current and only operate correctly if
something else in the circuit controls the available current instead
of the applied voltage. LEDs and fuses are examples.
 
dB
JD Steffen said:
Okay, so reading through the replies to this post I have learned the
proper way to determine what value current-limiting resistor to use in
an LED circuit. But I do not understand one thing. I always thought
that a circuit would consume only as much current as it needed, up to
what was available from the supply (until it blows a fuse or trips a
breaker). So you can see where I am a little confused. How does an LED
overconsume current to the point where it blows up? Is it some property
of semiconductors that I am ignorant of? I have no formal education in
electronics; it's more of a passing interest to me. So please excuse me
if this is one of THOSE questions ;)

Thanks for any input any of you can provide!

JD



Part of a posting in another forum (the link will take you to the
particular FAQ) .....


Have another look at the "generic diode" curve (in the l.e.ds and
zeners thread) in the FAQ section.

Both "knees" are similar. If you apply an increasing voltage (either
forward or reverse) very little current will flow until the critical
knee voltage is reached.

We use an l.e.d by biasing it in the forward direction, and use a
zener by biasing it in the reverse direction.

You MUST get used to thinking that l.e.ds and zeners
are CURRENT operated devices, and that the current has to be limited
with a series resistor.

We DO pass a current through it and a certain voltage will be
developed across it.

We DON'T apply a voltage across it and cause a current to flow through
it.



http://pub40.ezboard.com/fbasicelectronicsfrm5.showMessage?topicID=16.topic
 
Olaf
But I do not understand one thing. I always thought that a circuit
would consume only as much current as it needed up to what was available
from the supply (until it blows a fuse or trips a breaker). So you can
see where I am a little confused. How does an LED overconsume current to
the point where it blows up? Is it some property of semiconductors that
I am ignorant of?

When considering home equipment like vacuum cleaners, light bulbs, etc.
used in the normal way, you are right: each device draws exactly the
current it needs to operate and no more. But these are devices designed
to be used safely on a wall outlet. When considering a single component
or designing circuits you cannot assume things will work out all right.
This also applies to conductors!

When a voltage is applied to such a component or circuit, a current will
flow. But it is not the current the device 'needs'; it is the current it
can draw. More precisely, the device will limit the current only as much
as it can. So if you connect a simple 5000 ohm (R) resistor to a 10V DC
(U) power source, it will limit the current to I = U/R = 10/5000 = 0.002
ampere. Simple, no problem. But now connect a 5 ohm resistor to the same
power source. This time the current will be I = 10/5 = 2 ampere. 2
ampere could blow a fuse in the power source, but let's assume it
doesn't. Now the resistor will generate heat: P = U*I = 10 * 2 = 20
watts. 20 watts of heat is quite a lot; a normal resistor will be gone
before you know what has happened.
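
In Python the same two examples look like this:

# Ohm's law and power dissipation for the two resistors above
U = 10.0                    # supply voltage in volts
for R in (5000.0, 5.0):     # resistance in ohms
    I = U / R               # current the resistor allows, in amperes
    P = U * I               # heat dissipated in the resistor, in watts
    print("R = %6.0f ohm: I = %.3f A, P = %.2f W" % (R, I, P))
# 5000 ohms gives 0.002 A and 0.02 W (harmless); 5 ohms gives 2 A and
# 20 W, far more than a small resistor can survive.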

Now for the LED: current can only pass through it in one direction, and
when it is connected that way directly to the power source the LED
offers very little resistance to the current; it will not limit the
current the way the resistors did in the example above. So the current
will become very large and something will blow. It could be the fuse,
but it will probably be the LED. If you don't want the LED to blow you
have to find a way to limit the current. Resistors are capable of doing
this, which is why you connect one in series with the LED. You've found
the calculations in the other part of the thread.

I hope this explains a bit, bye, Olaf
 
JD Steffen
Ah, that makes sense now. Thanks, everyone, for the responses. It
has helped a lot.

Thanks,

JD
 