ShadowTek said:

I just bought 50 3.2V 20mA white LEDs for about $30US total off of eBay.

http://cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&category=66952&item=3847009435&rd=1&ssPageName=WDVW

I tried to read up on electrical stuff but there is a lot I don't understand. For instance, if I use 2 1.5V AA batteries in series that are 3 Amps each,

Saying that the batteries are 3 amps each is a bit nonsensical. Batteries have an internal resistance that consumes some of the voltage as current passes through it; it limits the short circuit current to the value that uses up all of the voltage across that internal resistance. Batteries also have an ampere hour rating that generally tells how much current a cell can supply for how long before it is exhausted. Cut the current in half and the life approximately doubles.
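A minimal sketch of that arithmetic. The internal resistance and capacity figures here are assumed example values, not ratings of any particular cell:

```python
# Two separate ratings, as described above (example values assumed).
cell_voltage = 1.5          # volts, nominal fresh AA cell
internal_resistance = 0.5   # ohms, assumed value for illustration
capacity_ah = 3.0           # ampere-hours, assumed rating

# Short-circuit current: the entire cell voltage is dropped across the
# internal resistance, so Ohm's law sets the limit.
i_short = cell_voltage / internal_resistance
print(f"short-circuit current: {i_short:.1f} A")

# Ampere-hour rating: halve the load current and the life roughly doubles.
for load_a in (1.0, 0.5, 0.25):
    print(f"{load_a} A load lasts about {capacity_ah / load_a:.0f} hours")
```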

...then is the output 3 Amps or 6 Amps?

If you are talking about short circuit current, it stays the same, 3 amps, because each cell drops its entire internal voltage across its internal resistance at its short circuit current.

I know voltage doubles when connecting cells in series but what about current?

Connecting cells in series doubles the total internal resistance while providing no more capacity (ampere hours).
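That point, together with the earlier one about short circuit current, can be sketched numerically. The per-cell values are made up for illustration:

```python
# Two identical cells in series: voltages add, internal resistances add,
# capacity (ampere-hours) stays the same. Per-cell values assumed.
cell_v, cell_r, cell_ah = 1.5, 0.5, 3.0

series_v = 2 * cell_v    # voltage doubles
series_r = 2 * cell_r    # internal resistance doubles
series_ah = cell_ah      # capacity unchanged

# Short-circuit current is unchanged: doubled voltage over doubled resistance.
print(f"{series_v / series_r:.1f} A, same as a single cell")
```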

Assuming that the total current of the two batteries were 3 Amps, does that mean that these 2 batteries alone could power 150 of these 3V 20mA LEDs if the LEDs were connected in series?

Not if the 3 amperes is a short circuit current rating. That is the current that reduces the battery voltage to zero, so it can't drive anything at that current. If that rating is instead the ampere hour rating of the cells, then you could power a 20 ma load for 3/.02 = 150 hours, a 1 amp load for 3/1 = 3 hours, etc. Whether the output voltage is good enough to force any particular amount of current through an LED is another matter entirely.
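The runtime arithmetic above, spelled out (taking the 3 Ah rating assumed in the question):

```python
# Battery life from an ampere-hour rating: hours = capacity / load current.
capacity_ah = 3.0  # assumed 3 Ah rating from the question above

for load_a in (0.020, 1.0):
    hours = capacity_ah / load_a
    print(f"{load_a * 1000:.0f} mA load lasts about {hours:.0f} hours")
```

Note this says nothing about whether the battery voltage is high enough to drive the load in the first place.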

The LED ratings of 3.2 volts and 20 ma imply that it takes approximately 3.2 volts across each LED before it passes 20 ma of current. However, LEDs are not resistors, and their current is very strongly related to the applied voltage. A few millivolts extra can double the current, and a few millivolts less will cut the current by half. Also, a small change in temperature can shift the current dramatically if the voltage is fixed. This makes it very difficult to power LEDs directly and in parallel from a stiff (low internal resistance) voltage source. Normally some additional stuff is needed to waste a bit of extra voltage and regulate the current.

For instance, if you used 3 cells in series for an open circuit voltage of about 4.5 volts, you could add a series resistor to each of those parallel LEDs, and if you pick the right resistance it will waste the extra voltage (4.5 - 3.2 = 1.3 volts) whenever the current is about 20 ma. So 1.3 volts divided by .02 amp predicts that this value is about 65 ohms. Standard 5% values are 62 and 75. If you want to get maximum brightness you might use the lower resistance, but if you want to make sure that you operate the LED well within its ratings you might use a 75 to 100 ohm resistor with a small drop in brightness. How many of these sets you can connect across the battery depends on the sag in voltage caused by the internal resistance and the battery life you desire.
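The resistor sizing above reduces to one line of Ohm's law, using the numbers from the 3-cell example:

```python
# Series resistor sizing: R = (supply voltage - LED forward voltage) / LED current.
supply_v = 4.5   # three cells in series, open-circuit
led_v = 3.2      # LED forward voltage at rated current
led_i = 0.020    # 20 mA rated current

r = (supply_v - led_v) / led_i
print(f"ideal resistance: {r:.0f} ohms")  # about 65 ohms

# Nearest standard 5% values: 62 ohms (a bit brighter, more current)
# or 75 ohms (safely under the 20 mA rating, slightly dimmer).
```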

That doesn't sound correct.

Or would that need to be parallel? Maybe that would work, but only for like a few minutes before the cells were completely drained? How many of these LEDs could I hook up to 2 AA cells (wired how?) without using any other form of resistor other than the LEDs themselves?

If the LEDs take 3.2 volts and the battery produces 3 volts, even with no current load, then zero.