I am just a beginner so forgive me if this question is stupid.
This is a newsgroup for beginners. Welcome.
On a simple circuit, like 2 LEDs with a 12v supply, which would be better,
a voltage divider to lower the voltage or a series resistor to restrict
the current?
While an LED needs a certain forward voltage before it conducts, the critical parameter is current.
Or am I completely wrong in my understanding of a voltage divider?
I think you may have gaps in your understanding of LEDs.
Please correct me if I am wrong, but it seems the divider would actually
waste some energy because it is being sent to ground.
True. That's how they work.
What is the determining factor in deciding which to use?
The specifications of the LED. Specifically, the maximum allowable continuous
current.
Is the reason for using a series resistor with LEDs that the LED has a
resistance of its own, creating a sort of divider?
Interesting question. The answer is no. The LED provides minimal resistance
once sufficient voltage to light it is present.
The answer is that an LED (like all diodes) will conduct all available current
once it turns on. The problem is that, because of its construction, if too much
current flows through it, the diode burns up.
So the purpose of the resistor is to limit the amount of current in the
circuit. Off the top of my head, Kirchhoff's current law means that in a series
circuit the same current flows through every component. A resistor impedes
current in proportion to its resistance, while the LED is a virtual dead short
once it turns on. So when you put the two in series, the current through the
LED is the same as the current through the resistor. Hence the name current
limiting resistor.
Hope this helps. You don't need a voltage divider. The LED will consume
whatever voltage is required to light it (its forward voltage drop), leaving
the rest of the supply voltage across the rest of the circuit (which is the
resistor).
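For the original 12V two-LED question, the same reasoning gives a quick calculation. Here is a minimal Python sketch; the 2.0 V forward drop per LED, the series wiring, and the 15 mA target are assumed values for illustration, not figures from the post:

```python
# Series resistor for the original question: two LEDs on a 12 V supply.
# ASSUMED values (not given in the post): 2.0 V forward drop per LED,
# LEDs wired in series, 15 mA target current.
V_SUPPLY = 12.0   # supply voltage, volts
V_F = 2.0         # assumed forward drop per LED, volts
N_LEDS = 2
I_TARGET = 0.015  # assumed target current, amps

v_resistor = V_SUPPLY - N_LEDS * V_F  # voltage the resistor must drop: 8 V
r = v_resistor / I_TARGET             # Ohm's law: R = V / I, about 533 ohms
print(round(r))                       # then round up to a standard value, e.g. 560
```

With a 560 ohm standard value the current lands a little under 15 mA, which is the safe direction to round.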
A quick example. I once modified a cheap motion detecting floodlight for
some security testing by connecting the 120VAC output to the LED of an
optoisolator. Since that circuit powered a 150W incandescent floodlight,
there was well over an amp of current available at 120VAC. Definitely needed a
current limiting resistor.
The 2V that the LED actually consumes is negligible. So to simplify my
calculation, I used a 130VAC figure (added some slop) and wanted 15mA through
the resistor. So:
V=IR
130V=0.015A * R
R = 8666 ohms.
I think for good measure I used a 10K resistor giving a final current of 13mA.
Finally I needed to compute the power
P=VI
P=130V*0.013A
P=1.69W
So I pulled a 2W 10K resistor for the project.
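The arithmetic above can be checked with a few lines of Python (all figures here come from the post itself):

```python
# Reproducing the optoisolator resistor calculation from the post.
V = 130.0         # 120 VAC plus some slop, volts
I_TARGET = 0.015  # desired LED current, amps

r_ideal = V / I_TARGET  # V = I * R  ->  about 8667 ohms
r = 10_000.0            # next common standard value, as used in the post
i_actual = V / r        # final current: 13 mA
p = V * i_actual        # P = V * I  ->  about 1.69 W, hence a 2 W part
print(r_ideal, i_actual, p)
```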
Finally LEDs (and other diodes) don't care for high reverse voltages either.
So I put a second visible LED in reverse parallel to the opto LED. So one
was on for half the AC cycle, and the other on for the other half. Plus it
gave me a visual indicator of when the motion detector activated.
Note that all of this occurred with a 120 VAC input. No need to divide the
voltage as long as the current is kept under control.
Hope this helps.
BAJ