Maker Pro

Reduce amperage without reducing voltage. Diodes? [Noob]

Electric Kevin · Joined Jun 4, 2015 · 4 messages
Hi there,
I'm starting to get interested in electronics a lot!
Of course I've been trying out stuff and messing with some things already.
Yesterday I made an LED glow with a 5.7 V / 800 mA charger cable, using a resistor to decrease the voltage.
Are LEDs supposed to get noticeably hot and ... break? xP

Anyway, since my LED is no more, I found something else to play with: a small 4 cm x 4 cm fan, which apparently is supposed to operate at 12 V and 130 mA. I found myself another charger which has the desired 12 V, but is rated at 1 ampere.

The first thing I thought was "Fine, I'll just put a resistor in there." But that would also reduce the volts, right?
Now how do I reduce the amperage without reducing the voltage?
I've been searching for answers for a while and getting lots of different ones, which confuses and frustrates me,
especially as a beginner.
Some people said the supply ampere can be higher, and the device will just draw as much as it needs.
Other people said that this is not true. And one guy said you would use something called a zener diode.

Is that correct? And if it is, are there different types of those diodes, like with color coded resistors?
If this works as I think, you would reduce the electric power with a resistor to reach the desired ampere level,
and then use a diode (or 2?) to increase the voltage back to 12 volts again. And the amperes would stay the same?

Anyone know the answer?
 

signalman72 · Joined Jan 26, 2014 · 57 messages
The fan will draw only the current it needs at a given voltage. If 12 volts, then using a 1 amp supply to run a 130 mA fan will be fine, no resistor needed. The LED on the other hand needs a current limiting resistor if fed a voltage higher than its forward voltage rating.

There are online calculators that will tell you what ohm resistor to use on an LED for a given forward voltage and forward current rating.

http://ledcalculator.net
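For what it's worth, the arithmetic behind those calculators is just Ohm's law applied to the resistor's share of the voltage. A minimal Python sketch, where the 2.0 V forward voltage and 20 mA are assumed placeholder values you would normally take from the LED's datasheet:

```python
# Series resistor for an LED: the resistor drops whatever voltage is left
# over after the LED's forward drop, at the LED's rated current (Ohm's law).
def led_resistor(v_supply, v_forward, i_forward):
    return (v_supply - v_forward) / i_forward  # ohms

# Assumed example values, not from any specific part:
print(led_resistor(5.7, 2.0, 0.020))  # -> 185.0 ohms
```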
 

Merlin3189 · Joined Aug 4, 2011 · 250 messages
...
Some people said the supply ampere can be higher, and the device will just draw as much as it needs.
Other people said that this is not true. And one guy said you would use something called a zener diode.
I'm afraid the first people are correct. Your 12V 1A supply will give approximately 12V and will supply UP TO 1A, depending on the load. If you TRY to take more than 1A, then you may get more than 1A; the voltage may reduce, the power supply may overheat, and it may fail. It is not a good idea. If you take less than 1A, even 0A, then it will be quite happy.

Your problem with the LED was that your 5.7V 0.8A power supply was too great a voltage for the LED, so it tried to make the LED take more current than it likes. The LED took it, and reacted in the way most components do if they are forced to take more current than they want - enjoy it for a while, then get hot and die.
It is the Voltage which is the guilty party. Only voltage can force current where it is not wanted.
The current rating of a power supply is just the most current it is happy to supply. It may try to supply more, but just like any other component that overdoes the current, it is likely to get hot and die.
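To put numbers on that: the only arithmetic the fan needs is a headroom check. A trivial sketch, using the 12 V / 1 A charger and the fan's 130 mA rating from this thread:

```python
# The load sets the current draw; the supply's rating is only a ceiling.
supply_max_a = 1.0   # 12 V, 1 A charger from the original post
fan_draw_a = 0.130   # the fan's rated draw at 12 V

if fan_draw_a <= supply_max_a:
    print(f"Fine: {(supply_max_a - fan_draw_a) * 1000:.0f} mA of headroom goes unused.")
else:
    print("Overloaded: the supply may sag, overheat, or fail.")
```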

Zeners do not increase voltage.
 

Gryd3 · Joined Jun 25, 2014 · 4,098 messages
...
Now how do I reduce the amperage without reducing the voltage?
...
Anyone know the answer?
Would it help to say there is no simple answer?

Voltage = Current * Resistance...
So, if you know the voltage and you know the resistance, the current will be the unknown.
Similarly, if you know the current and resistance, the voltage will be the unknown.

Now, the vast majority of chargers you have in the house will be 'constant voltage'. It will be rare for you to see a constant-current source in consumer equipment.
This means that you know the voltage (12V), and when you hook something up, you will know the resistance... that just leaves the current. The charger you have is capable of putting out UP TO 1A, but will try its hardest to keep putting out 12V.

The other thing to consider is that a resistor, an LED, and a motor all behave differently.
A resistor's resistance will not change, so you always know how much current will flow through.
A motor's resistance will change based on how hard it's working. If the motor is trying really hard, the resistance will drop (which, if you look at the equation again, means higher current, because our voltage stays the same).
An LED will be the trickiest part to understand: its resistance is highly variable. So much so that you won't get a reading from it. If you start at 0 volts and very slowly increase toward 3 volts, you will find that the LED has a very, very high resistance at first, then (depending on the LED's forward voltage) as you approach 1.2V the resistance will drop considerably, and as you go higher... the resistance will drop even faster! As you know from before, lower resistance = higher current... the LED will cook itself.

What's worse, an LED does not always behave exactly the same way; the rate at which it changes is dependent on temperature, manufacturing, etc. So the best way to drive an LED is a device called a 'constant current driver'. This works opposite to the 12V adaptor you are using... instead of keeping the voltage exactly at 12V, a constant current driver will change its voltage to make sure it's always putting out exactly 20mA (or whatever it's set to).

A simpler way of doing this is to put a resistor in line with the LED. This resistor will act as a buffer, so that even if the resistance of the LED changes, the 'total' resistance of the circuit will not change enough to cook the LED. In all actuality though, an LED should never be treated as a resistor; that's just the best way I could describe how they work. LEDs should be looked at like a diode: if you hook up 12V to an LED, it will drop a small voltage across itself and the rest should drop across a resistor. If there is no resistor, the LED can run away.

I've attached an image to show you what happens to an LED as you increase voltage to it.
[Attachment: diode12.gif (LED current vs. forward voltage curves)]
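If you want to reproduce a curve like that numerically, the textbook model is the Shockley diode equation. A rough Python sketch, where the saturation current and ideality factor are made-up illustrative constants rather than any real LED's parameters:

```python
import math

# Shockley diode equation: I = Is * (exp(V / (n * Vt)) - 1).
# Is and n are made-up illustrative constants, not a real LED's values;
# Vt is the thermal voltage, about 26 mV at room temperature.
I_S = 1e-12   # saturation current (A), assumed
N = 2.0       # ideality factor, assumed
V_T = 0.026   # thermal voltage (V)

for v in (0.5, 1.0, 1.5, 2.0):
    i = I_S * math.expm1(v / (N * V_T))
    print(f"{v:3.1f} V -> {i:9.3g} A")
# Past the knee the current climbs so steeply that, without a series
# resistor, a real LED would cook itself long before the top of this range.
```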
 

BGB · Joined Nov 30, 2014 · 154 messages
One thing I had noticed before, but that came up again unexpectedly, was the whole LEDs-and-resistance thing.

Previously, most of my LEDs had come out of computer front panels and similar, which you just connect up to 5V and they work, having somewhere around 1k of resistance or so (I had figured initially that they all were like this, so had not bothered with resistors).

Later I used some nearly identical-looking LEDs, which were bought separately. I connected one of these up to a power source, and it changed color and then melted, having almost no resistance once it turned on.

I never really found a solid answer here (such as why this is); it just seems that some have built-in resistors, and others don't.
 

Electric Kevin · Joined Jun 4, 2015 · 4 messages
Your problem with the LED was that your 5.7V 0.8A power supply was too great a voltage for the LED, so it tried to make the LED take more current than it likes.

As I said, for the LED I did use a resistor, to get the voltage and amperage down. But I checked again, and it seems that I accidentally used a 1.50 ohm resistor instead of 120 ohms. Whoops! :p
It was a DUO diode btw.
 

BobK · Joined Jan 5, 2010 · 7,682 messages
You probably have electrical outlets in your home. You can plug in a room heater which will probably take 10A or more. You can also plug in a night light which will take < 100mA. Does this tell you anything?

Bob
 

Electric Kevin · Joined Jun 4, 2015 · 4 messages
You probably have electrical outlets in your home. You can plug in a room heater which will probably take 10A or more. You can also plug in a night light which will take < 100mA. Does this tell you anything?

Bob
It tells me that the devices probably have the resistors they need :p
 

davenn · Moderator · Joined Sep 5, 2009 · 14,254 messages
I never really found a solid answer here (such as why this is); it just seems that some have built-in resistors, and others don't.

I don't know of any standard LEDs with built-in resistors ... more likely there was a resistor in series with the LED inside a bit of heatshrink tubing; that would be normal

and if the resistor wasn't up at the end of the cable by the LED, then there would be a resistor on the motherboard where the LED is fed from

Dave
 

davenn · Moderator · Joined Sep 5, 2009 · 14,254 messages
It tells me that the devices probably have the resistors they need :p

Close, but not quite.

Instead, see it as the overall resistive load that the device puts across the power supply, be that your mains wall outlet or a battery, etc.

Dave
 

BGB · Joined Nov 30, 2014 · 154 messages
I don't know of any standard LEDs with built-in resistors ... more likely there was a resistor in series with the LED inside a bit of heatshrink tubing; that would be normal

and if the resistor wasn't up at the end of the cable by the LED, then there would be a resistor on the motherboard where the LED is fed from

Dave

here is an example, so they seem to exist in any case:
http://www.mouser.com/new/kingbright/kingbright-resistor-LEDs/

from the ones I had scavenged before, no external resistor was used, it was just the wires connected up directly to the LEDs (on one end, the other ends would be dupont connectors which would plug into the relevant pin-headers), and if they were connected up to another 5v source, they would "just work".

apparently, it is pretty much standard for the case LEDs to include a built-in resistor, but a lot of MOBOs will also include a resistor "just in case".


as noted, other (standard) LEDs will burn up if connected to a power-source with no resistor. I had previously been unaware of this, and had fried some LEDs I had bought later (rather than scavenged), thinking they behaved like the ones from PC cases.
 

davenn · Moderator · Joined Sep 5, 2009 · 14,254 messages
here is an example, so they seem to exist in any case:
http://www.mouser.com/new/kingbright/kingbright-resistor-LEDs/

cool :) thanks for that

from the ones I had scavenged before, no external resistor was used, it was just the wires connected up directly to the LEDs (on one end, the other ends would be dupont connectors which would plug into the relevant pin-headers), and if they were connected up to another 5v source, they would "just work".

Prob depends on the manufacturer, but now I will have to check the 2 old cases (less than 10 yrs old) I have here and see how the LEDs are wired ... looking closely, an LED with an inbuilt resistor should be almost obvious.
I'm officially intrigued :)

Dave
 

cjdelphi · Joined Oct 26, 2011 · 1,166 messages
Resistance is measured in ohms.

Resistance is what stops your electronics from melting. With the exception of diodes, semiconductors, and batteries, what determines the current is the resistance between + and -.

How much current flows is determined by your source voltage and that resistance:

1V / 1 ohm = 1 amp
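The same relation as a line of code, if it helps to plug numbers in:

```python
# Ohm's law: current = voltage / resistance
v, r = 1.0, 1.0
print(v / r, "A")  # 1 V across 1 ohm -> 1.0 A
```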
 

Electric Kevin · Joined Jun 4, 2015 · 4 messages
Resistance is measured in ohms.

Resistance is what stops your electronics from melting. With the exception of diodes, semiconductors, and batteries, what determines the current is the resistance between + and -.

How much current flows is determined by your source voltage and that resistance:

1V / 1 ohm = 1 amp
Can you tell me how I can calculate the required ohms?
For example, if I want to light my LED (which somehow slowly started working again, with the right resistor, yay!)
with a 12V and 1A power supply, but the LED is supposed to run at 2.5V and 30mA, how do I calculate the required ohms?
 

Gryd3 · Joined Jun 25, 2014 · 4,098 messages
Can you tell me how I can calculate the required ohms?
For example, if I want to light my LED (which somehow slowly started working again, with the right resistor, yay!)
with a 12V and 1A power supply, but the LED is supposed to run at 2.5V and 30mA, how do I calculate the required ohms?
Well... there are tons of resources available; search for "LED resistor calculator".

What you need are the following:
Source voltage : 12V
LED forward voltage : (Unknown... depends on color and is usually on the LED datasheet. Look at the graph in my previous post for rough values for each color)
Desired current : 30mA

Because of the complicated nature of the LED, you can't control both the voltage across it AND the current at the same time. The voltage across a properly driven LED will vary a little, and this is normal.
So... if you have the details from above, these are the steps you need to follow :
V = I * R (Voltage = Current * Resistance)
SourceVoltage - LEDForwardVoltage(*number of LEDs in the string) = DesiredCurrent * Resistance
12V - 1.2V (common for red LEDs) = 30mA * Resistance
Resistance = 10.8V / 0.030A = 360Ω

One more step though...
Power rating of your resistor!
Power = Voltage * Current.
Power = 10.8V * 0.030A = 324 mW (rounding up to the next size, a 1/2 W resistor is required)
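The same two steps as a tiny Python script, using Gryd3's example numbers (the 1.2 V is his red-LED figure; substitute your own LED's forward voltage):

```python
v_supply = 12.0   # source voltage
v_led = 1.2       # forward voltage (Gryd3's red-LED example)
i_led = 0.030     # desired current, 30 mA

v_resistor = v_supply - v_led   # 10.8 V left for the resistor
r = v_resistor / i_led          # 360 ohms
p = v_resistor * i_led          # 0.324 W of heat in the resistor

print(f"R = {r:.0f} ohms, P = {p * 1000:.0f} mW (round up to a 1/2 W part)")
```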
 

cjdelphi · Joined Oct 26, 2011 · 1,166 messages
Regarding diodes, transistors, etc...

It's important to remember that these devices have a forward voltage drop: about 0.7V for a regular silicon diode (or a regular transistor, not a Darlington).

So you have to deduct the voltage drops before calculating the resistor value.
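For example, if there were an ordinary silicon diode in series with the LED, you would subtract its drop too. A quick sketch with assumed values (a 2.0 V LED plus a 0.7 V diode, 20 mA target):

```python
# Subtract every forward drop in the string before sizing the resistor.
# Values are assumed for illustration: one 2.0 V LED plus one 0.7 V diode.
v_supply = 12.0
drops = [2.0, 0.7]
i_target = 0.020   # 20 mA

r = (v_supply - sum(drops)) / i_target
print(f"R = {r:.0f} ohms")  # (12 - 2.7) / 0.02 = 465 ohms
```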
 

Kevin Kearney · Joined Mar 11, 2017 · 1 message
...
Now how do I reduce the amperage without reducing the voltage?
...
Anyone know the answer?
If you do need to reduce current, you have to use a current limiting resistor,
which is a resistor tied to ground before the component you are supplying. So 12V across 14 ohms would be a 12V drop and 0.857 amps grounded, which would leave you 0.143 amps for the component; as the max current is 1 amp, this protects the component from ever drawing too much current. Stun guns, which supply 50,000 volts, don't supply much current because of a current limiting resistor.
 

Merlin3189 · Joined Aug 4, 2011 · 250 messages
Kevin K, I don't think you have the current limiting resistor idea quite right here. You seem to be suggesting a resistor in parallel with the source, to drain off the excess current:
12V 1Amp supply, put 14 Ohm in parallel to draw 12V/14Ohm = 857mA, leaving 1A - 857mA = 143mA for the diode (with still 12V across it!)
But it doesn't really work like that, for two reasons:

First, the 1A supply doesn't supply exactly 1A. That is simply the maximum current which can safely be drawn. It can supply less - down to 0 - but it can probably supply more, maybe a lot more. (There are some current-limited power supplies which will stop at the specified maximum current, but most cheap wall warts don't. If you did get a current limited supply, then it would not be at 12V when it was limiting. You can't have both at the same time.)

Second, if the power supply tries to supply 1A at 12V (even if it is current limited) and you put your 14 Ohm resistor in parallel, the greedy little LED will still try to grab most of the current, leaving maybe 500mA for the resistor.
If you look back to post #4 from Gryd3, he has given you some nice graphs showing current vs voltage for some LEDs. What they show is that LEDs do not behave like resistors, where current is proportional to voltage. For LEDs, a small % change in voltage can produce a large % change in current. To get a graph a bit more like Electric Kevin's LED, I've uploaded this one: [led_i_v_curve.jpg]
What you can see here is: below 2V hardly any current flows at all. By the time you get 3V across the LED, it is drawing around his 143mA - great! But if you apply a little more voltage, say 3.5V, it draws over 500mA. If you increased the voltage to 12V, you'd be right off the top of the graph, drawing well over 1A (but the LED would be dead by then!)
Putting your 14 Ohm in parallel would not help. Say it was there drawing its 857mA and you connected your LED, then there would be only 143mA left to go through the LED. Ok, BUT, that would only make the voltage across the LED 3V and the voltage across the parallel resistor must therefore also be only 3V. Then your 14 Ohm resistor would only take 3V/14 Ohm = 214 mA, now leaving 786mA to go through the LED. That would raise the LED voltage a bit and that would make the resistor take a bit more current. But it would settle down around 3.5V with about 750mA going through the LED and 250mA through the resistor.

The correct way to use a "current limiting resistor" (bad name btw*) with a LED is to put it in SERIES. If your LED wants 143mA at 3V, then you need a series resistor that uses the rest of the 12V at 143mA , so (12V - 3V) = 9V, the resistor is 9V/143mA = 63 Ohm.
Then if the LED tries to draw a bit more current, that increases the current through the series resistor, which means (Ohms law) it takes a bit more voltage and lowers the voltage on the LED and reduces the current the LED can pass. Stability.

* It doesn't limit the current to the desired value, but stabilises it around that value. So better might be "current stabilising resistor", or the old fashioned "ballast resistor", or plain and simple, "series resistance".
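Merlin's arithmetic, plus a numeric check of that settling behaviour, can be sketched as follows. The exponential LED model here uses made-up constants tuned so the curve roughly matches his "143 mA at 3 V" example; it is not from any real datasheet:

```python
import math

# Merlin's series resistor: (12 V - 3 V) / 143 mA, about 63 ohms.
r_series = (12.0 - 3.0) / 0.143
print(f"Series resistor: {r_series:.1f} ohms")

# Toy exponential LED model; i_s and n are made-up constants tuned so
# that I(3.0 V) is roughly 143 mA, nothing like a real datasheet.
def led_current(v, i_s=1e-9, n=6.14, v_t=0.026):
    return i_s * math.expm1(v / (n * v_t))

# Bisect for the LED voltage where supply = LED drop + resistor drop.
lo, hi = 0.0, 12.0
for _ in range(60):
    mid = (lo + hi) / 2
    if mid + led_current(mid) * r_series < 12.0:
        lo = mid
    else:
        hi = mid
print(f"Settles near {lo:.2f} V across the LED, {led_current(lo) * 1000:.0f} mA")
```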
 

Audioguru · Joined Sep 24, 2016 · 3,656 messages
I have many LEDs that look like ordinary LEDs but they have 3 colors and a microprocessor inside that lights the colors in many sequences. They do not need a resistor because it is inside.

I am posting the datasheet of an ordinary red LED for you to see its maximum allowed current, and how to calculate a resistor to limit the current:
 

[Attachment: LED resistor calculation.png]

Audioguru · Joined Sep 24, 2016 · 3,656 messages
Diodes will reduce the voltage and will use some of the ESC power that the motor would have taken.
Can't you program your RC transmitter to limit the maximum power from the ESC by limiting the motor's speed?
 