Maker Pro

voltage divider or series current resistor

NJM
I am just a beginner so forgive me if this question is stupid.

On a simple circuit, like 2 LEDs with a 12v supply, which would be better:
a voltage divider to lower the voltage, or a series resistor to restrict
the current? Or am I completely wrong in my understanding of a voltage
divider?

Please correct me if I am wrong, but it seems the divider would actually
waste some energy because it is being sent to ground.

What is the determining factor in deciding which to use?

Is the reason a series resistor is used with LEDs that the LED has a
resistance of its own, therefore creating a sort of divider?
 
tempus fugit
An LED MUST have a series resistor to limit the current or it will burn out
in a few milliseconds. It's not a question of lowering the voltage, it's one
of limiting the current to a safe value. LEDs need a current of 10-20mA
through them to glow fairly brightly, so select a resistor for each LED to
supply this.
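
A rough sketch of that resistor selection in Python, assuming the 12v supply from the question, a typical ~2V red-LED forward drop, and a 15mA target; the forward drop and current are example values only, so check the LED's datasheet:

# Series resistor for one LED: R = (V_supply - V_forward) / I_target
# Assumed example values; check the LED's datasheet for its real forward drop.
v_supply = 12.0    # supply voltage in volts
v_forward = 2.0    # assumed forward drop of a typical red LED, in volts
i_target = 0.015   # 15 mA, in the middle of the 10-20mA range

r_series = (v_supply - v_forward) / i_target
print(f"Series resistance: {r_series:.0f} ohms")  # about 667 ohms; 680 ohms is the next standard value up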
 
HKJ
NJM said:
Please correct me if I am wrong, but it seems the divider would actually
waste some energy because it is being sent to ground.

Correct.

NJM said:
What is the determining factor in deciding which to use?

When the load draws a roughly constant current, a series resistor is enough.

NJM said:
Is the reason a series resistor is used with LEDs that the LED has a
resistance of its own, therefore creating a sort of divider?

Sort of.


For calculating LED circuits, try this program:
http://www.miscel.dk/MiscEl/miscelLeds.html

It can also do a lot of other electrical calculations:
http://www.miscel.dk/MiscEl/miscel.html
 
John Popelish
NJM said:
I am just a beginner so forgive me if this question is stupid.

On a simple circuit, like 2 LEDs with a 12v supply, which would be
better, a voltage divider to lower the voltage or a series resistor to
restrict the current? Or am I completly wrong in my understanding of a
voltage divider?

Please correct me if I am wrong, but it seems the divider would actually
waste some energy because it is being sent to ground.

What is the determining factor in deciding which to use?

Is the reason a series resistor is used with LEDs that the LED has a
resistance of its own, therefore creating a sort of divider?


Basic principles:

LEDs produce light in rough proportion to the current
passing through them.

LEDs, like most junction diodes, do not drop a voltage that is
proportional to the current through them, the way resistors
do. Very small changes in voltage produce large changes in
current, so voltage drive is not practical.

Different colored LEDs drop different voltages, with shorter
wavelengths generally dropping more. For example, most red
LEDs will light with less than 2 volts across them, while
blue ones often drop more than 3.

Your method should at least roughly control the current
passing through the fairly fixed voltage drop of the LEDs.

The series resistor does that a lot better than the voltage
divider. Besides, the voltage divider wastes current that
does not pass through the LEDs.

So let's say that you want to light two green LEDs that drop
about 3 volts each when their operating current is in the
normal range. You select an operating current of about 10 mA
(because, say, it is safely below their maximum current
rating and you judge, from some experimentation, that this
produces a reasonable amount of light). So the LEDs, in
series, will drop about 6 of the 12 volts of the supply. The
resistor must be selected to drop the remaining 6 volts
while passing about 10 mA. 6V/.01A=600 ohms. The next
higher standard 5% resistor would be 620 ohms and the next
lower one would be 560 ohms. Take your pick. The resistor
will produce some heat, so you might want to check that you
have one large enough to dissipate that power. You can
calculate the power dumped into the resistor with P = amps
times volts, volts squared divided by ohms, or amps
squared times ohms. Since we are trying to achieve a 6 volt
drop while passing .01 amp, the power is 6V*.01A=0.06 watts,
so even a 1/8th watt resistor would be large enough.
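
The arithmetic from that example, written out as a small Python script (the 12 volt supply, the two ~3 volt LEDs, and the 10 mA target are taken straight from the post above):

# Two green LEDs (~3 V each) in series on a 12 V supply, run at 10 mA.
v_supply = 12.0   # volts
v_led = 3.0       # forward drop per LED, volts
n_leds = 2        # LEDs in series
i_led = 0.010     # 10 mA operating current

v_resistor = v_supply - n_leds * v_led   # voltage the resistor must drop
r_ideal = v_resistor / i_led             # Ohm's law: R = V / I
p_resistor = v_resistor * i_led          # power in the resistor: P = V * I

print(f"Resistor drops {v_resistor:.1f} V -> ideal R = {r_ideal:.0f} ohms")
print(f"Power in resistor: {p_resistor * 1000:.0f} mW, so a 1/8 W part is plenty")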
 
Byron A Jeff
NJM said:
I am just a beginner so forgive me if this question is stupid.

This is a newsgroup for beginners. Welcome.

NJM said:
On a simple circuit, like 2 LEDs with a 12v supply, which would be better,
a voltage divider to lower the voltage or a series resistor to restrict
the current?

While an LED has a characteristic forward voltage, the critical parameter is
current.

NJM said:
Or am I completely wrong in my understanding of a voltage divider?

I think you may have gaps in your understanding of LEDs.

NJM said:
Please correct me if I am wrong, but it seems the divider would actually
waste some energy because it is being sent to ground.

True. That's how they work.

NJM said:
What is the determining factor in deciding which to use?

The specifications of the LED. Specifically, the maximum continuous allowable
current.

NJM said:
Is the reason a series resistor is used with LEDs that the LED has a
resistance of its own, therefore creating a sort of divider?

Interesting question. The answer is no. The LED provides minimal resistance
once sufficient voltage to light it is present.

The answer is that an LED (like all diodes) will conduct all the available
current once it turns on. The problem is that, because of its construction,
the diode burns up if too much current passes through it.

So the purpose of the resistor is to limit the amount of current in the
circuit. Kirchhoff's current law means that the current through every
component in a series circuit is the same. A resistor impedes current
based on its resistance, while the LED is a virtual dead short once it
turns on. So when you combine the two, the current that flows through the
LED is the same as the current through the resistor. Hence the name
"current limiting resistor".

Hope this helps. You don't need a voltage divider. The LED will drop
whatever voltage it needs to light (its forward voltage drop), leaving the
rest of the supply voltage for the rest of the circuit (which is the
resistor).

A quick example. I once modified a cheap motion-detecting floodlight for
some security testing by connecting the 120VAC output to the LED of an
optoisolator. Now since that circuit powered a 150W incandescent floodlight,
there was well over an amp of current available at 120VAC. It definitely
needed a current limiting resistor.

The 2V that the LED actually drops is negligible. So to simplify the
calculation, I used 130V (added some slop) and wanted 15mA through
the resistor. So:

V=IR
130V=0.015A * R
R = 8666 ohms.

I think for good measure I used a 10K resistor giving a final current of 13mA.
Finally I needed to compute the power

P=VI
P=130V*0.013A
P=1.69W

So I pulled a 2W 10K resistor for the project.
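
That back-of-the-envelope calculation in Python, using the same rounded numbers as above (the ~2V opto-LED drop is ignored, just as in the post):

# ~130 V across the resistor, aiming for roughly 15 mA through the opto LED.
v = 130.0          # volts: 120 VAC plus some slop
i_target = 0.015   # 15 mA desired

r_ideal = v / i_target       # Ohm's law -> about 8667 ohms
r_chosen = 10000.0           # round up to a 10K part for good measure
i_actual = v / r_chosen      # about 13 mA with the 10K resistor
p = v * i_actual             # about 1.69 W, hence the 2 W resistor

print(f"Ideal R ~ {r_ideal:.0f} ohms; with 10K: {i_actual * 1000:.0f} mA, {p:.2f} W")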

Finally, LEDs (and other diodes) don't care for high reverse voltages either,
so I put a second, visible LED in reverse parallel with the opto LED. One
was on for half the AC cycle, and the other for the other half. Plus it
gave me a visual indicator of when the motion detector activated.

Note that all of this occurred with a 120 VAC input. No need to divide the
voltage as long as the current is kept under control.

Hope this helps.

BAJ
 
austingoofball
NJM said:
I am just a beginner so forgive me if this question is stupid.

On a simple circuit, like 2 LEDs with a 12v supply, which would be better:
a voltage divider to lower the voltage, or a series resistor to restrict
the current? Or am I completely wrong in my understanding of a voltage
divider?

Please correct me if I am wrong, but it seems the divider would actually
waste some energy because it is being sent to ground.

A little, perhaps. When you hang the LED on the tap of the divider, you're
basically shorting the bottom resistor to ground. Not quite, because there
is a small voltage drop across the diode, and thus a small one across the
bottom resistor. Most of the voltage will be dropped across the top
resistor, so in effect you have a current limiting resistor with another
resistor that bleeds a little bit of current off to ground.

The voltage divider approach depends on the resistance of whatever is being
driven by the divider being much greater than the Thevenin resistance of
the divider (in this case, the upper and lower resistances in parallel).
If it is, then the output voltage of the divider is given by the voltage
divider equation. If it isn't, then the output voltage sags because of the
extra voltage dropped across the upper resistor.
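
A small numerical sketch of that loading effect in Python, with an assumed 12v supply and arbitrary example resistor values (they are only there to show the sag):

# Unloaded divider output versus the output with a load hung on the tap.
v_in = 12.0
r1, r2 = 1000.0, 1000.0   # top and bottom divider resistors (example values)
r_load = 500.0            # load attached to the divider's midpoint (example value)

v_unloaded = v_in * r2 / (r1 + r2)               # plain voltage divider equation
r_thevenin = (r1 * r2) / (r1 + r2)               # source resistance seen by the load
r2_with_load = (r2 * r_load) / (r2 + r_load)     # bottom resistor in parallel with the load
v_loaded = v_in * r2_with_load / (r1 + r2_with_load)

print(f"Unloaded: {v_unloaded:.2f} V, Thevenin resistance: {r_thevenin:.0f} ohms")
print(f"Loaded with {r_load:.0f} ohms: {v_loaded:.2f} V (sags because the load is not >> {r_thevenin:.0f} ohms)")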
 
MassiveProng
tempus fugit said:
An LED MUST have a series resistor to limit the current or it will burn out
in a few milliseconds. It's not a question of lowering the voltage, it's one
of limiting the current to a safe value. LEDs need a current of 10-20mA
through them to glow fairly brightly, so select a resistor for each LED to
supply this.


Fairly good answer, aside from the top posting.

You seem to forget that there are 1 amp LEDs now.

Yes, OP (original poster), they MUST be current limited, and that
limit is defined by the specs declared by their manufacturer.
 