A simple question, just to understand the theory.

Suppose I have 12 V DC @ 1 A, and a 1.5 V LED.

What value of resistor do I need to light the diode?

Should I use 12 ohms, as

R = V/I = 12/1 = 12 ohms

or should I use 10.5 ohms, as

R = V/I = (12 - 1.5)/1 = 10.5 ohms

(as the LED will drop 1.5 V)?

Thank you.

Saketram

First thing: your 12 V at 1 A power supply. That 1 A is the current it *can* supply, not a current it forces through the load. I don't know where this notion that you can "push" current through something comes from.

Your PSU will give 12 V (forget the current for now). An LED will "drop" around 2 volts, and a common red LED will need between 0.015 and 0.02 amps to light nicely. I write the current that way so you can see how tiny it is in relation to the 1 amp maximum of your PSU.

So 12 - 2 = 10 V to drop across the resistor. As you are after 20 mA (or so), the maths is simply 10/0.02, which gives 500 ohms. Resistors generally come in set sizes, either the E12 or E24 series (google this), which means you can't usually go and buy a 500 ohm resistor, so you have to choose a value near it. If you lower the resistor value, more current will flow through your LED, which might shorten its life, so it is better to limit the current a little further. I would go for a 560 ohm resistor: it's a standard value and costs about $0.02 each.
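The arithmetic above can be sketched in a few lines of Python (the supply voltage, LED drop, and target current are just the figures assumed in this thread):

```python
# Series resistor for an LED: R = (V_supply - V_led) / I_led
v_supply = 12.0   # supply voltage (V)
v_led = 2.0       # typical forward drop of a red LED (V)
i_led = 0.020     # target LED current (A), i.e. 20 mA

r_exact = (v_supply - v_led) / i_led
print(r_exact)    # 500.0 ohms

# E12 standard values (one decade); pick the next value >= r_exact,
# so the current is limited a little further rather than exceeded.
e12 = [100, 120, 150, 180, 220, 270, 330, 390, 470, 560, 680, 820]
r_standard = min(v for v in e12 if v >= r_exact)
print(r_standard)  # 560 ohms
```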

You might like to work out what the current will be with a 560 ohm resistor (instead of 500). And if you are sure your LED will drop 1.5 volts instead of the more usual 2, you could work out the exact value required for your 20 mA and then choose the nearest one from the standard ranges.
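Those two exercises can be sketched the same way (again using the figures from this thread; treat the results as a check on your own working):

```python
v_supply = 12.0

# 1) Actual current with the standard 560 ohm resistor and a 2 V drop:
i = (v_supply - 2.0) / 560.0
print(round(i * 1000, 1))  # 17.9 mA: a little under 20 mA, which is fine

# 2) Exact resistor if the LED really drops 1.5 V at 20 mA:
r_exact = (v_supply - 1.5) / 0.020
print(r_exact)             # 525.0 ohms; nearby E24 values are 510 and 560
```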