Rory Starkweather
Joined: Nov 13, 2014
Messages: 77
If I have a 2 Amp regulated supply, will the attached circuit have to drop the whole 2 amps, like voltage, or will it just take what it needs?
Attachment not there.....
Current can't be "dropped" the way voltage can. Current is a flow. All the current that flows into a circuit also flows back out.
With a linear regulator, the current coming in is equal to the load current plus a small amount of current that the regulator uses internally for its own operation, which comes out the ground pin of the regulator and is called the ground current. For a 78xx regulator the ground current is about 5~10 mA.
The load (the circuit that is being powered from the regulator) will draw whatever current it needs, and the current going into the regulator will be the sum of that current and the regulator's ground current.
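The bookkeeping above can be sketched in a few lines of Python. This is only an illustration: the 5 mA ground current is the rough 78xx figure mentioned above, and the 150 mA load current is a made-up example value.

```python
# Linear-regulator current bookkeeping (illustrative values only).
ground_current_a = 0.005  # regulator's own operating current, ~5-10 mA for a 78xx
load_current_a = 0.150    # whatever the load happens to draw (assumed here)

# Current into the regulator = load current + ground current
input_current_a = load_current_a + ground_current_a
print(f"Current into the regulator: {input_current_a * 1000:.1f} mA")  # 155.0 mA
```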
"Kris answered this correctly."

No duh? The attachment is the load. I don't have enough to send to everyone.
What if I don't use a regulator? Suppose it is just an external power source and a resistor?
I have a nice variable voltage supply, but it is rated at 2 Amps. Will the circuit have to drop the whole 2 amps?
You mean you have a power source, a series resistor, and the load? The load will draw current, which will cause a voltage drop across the resistor according to Ohm's Law. The current will be the same at all points in the circuit, but because the resistor will drop some voltage, there will be less voltage across the load, so the load may draw less current than it would otherwise. (It depends on what the load is; you should have told us that.)
Ohm's Law applied to the series resistor says:
V = I × R where
V is the voltage dropped across the series resistor, in volts;
I is the current through the resistor (the same current flows at all points in the circuit), in amps;
R is the resistance of the series resistor, in ohms.
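The Ohm's Law steps above can be worked as a short example. The 9 V supply, 100 Ω series resistor, and 50 mA load current are made-up illustrative values, not anything from the original post.

```python
# Worked Ohm's Law example for the series resistor (assumed values).
i_amps = 0.050      # current through the circuit (same at every point in series)
r_ohms = 100.0      # series resistance
supply_volts = 9.0  # hypothetical supply voltage

v_drop = i_amps * r_ohms        # V = I * R, dropped across the series resistor
v_load = supply_volts - v_drop  # voltage left over for the load
print(f"Resistor drops {v_drop:.1f} V, leaving {v_load:.1f} V across the load")
```

So the resistor drops 5.0 V here, and the load sees only 4.0 V of the original 9 V.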
You can't "drop" current, as I said before. The same current will flow through all points in the circuit. If your power supply is rated at 2A that just means that you shouldn't draw more than 2A from it otherwise it may overheat, shut down, or be damaged.
The only thing that matters is that you have an adjustable voltage power supply capable of providing (up to) 2 A to whatever load resistor you connect to it. The current the power supply provides to the load is I = E/R, where E is the voltage in volts, R is the resistance in ohms, and I is the current in amperes, with a maximum value of 2 A determined by the maximum current the power supply can provide. So, yes, the resistor matters. If you don't connect the load resistor, the power supply provides zero current.
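Hop's point can be sketched as a small function: the load resistor sets the current via I = E/R, and the supply's 2 A rating is only a ceiling. The function name and the test voltages/resistances are my own illustrative choices, not from the thread.

```python
# The load, not the supply rating, determines the current drawn (up to the limit).
def supply_current(e_volts, r_ohms, i_max=2.0):
    """Current a voltage supply delivers into a resistive load, capped at i_max."""
    if r_ohms == 0:
        return i_max  # a dead short would run the supply into its current limit
    return min(e_volts / r_ohms, i_max)

print(supply_current(12.0, 100.0))  # 0.12 A -- far below the 2 A rating
print(supply_current(5.0, 1.0))     # 2.0 A -- here the supply's limit takes over
```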
Careful there...we offer our help from a vast knowledge base and if you choose to use words like "attached circuit" it is meant normally to mean a circuit diagram.
Good advice there from Hop as usual. But I should explain why he used "I = E / R" instead of "I = V / R" for Ohm's Law.
E was the original letter used for voltage. It stands for EMF, "electromotive force". I learnt Ohm's Law as "I = E / R" too. But at some time around the 1970s or 1980s I think, the convention was changed so voltage is represented by V. That's the only difference.
It might help if you could show us how you "do calibration references." Just because your power supply can provide 2A doesn't mean it has to! OTOH, if you need a "calibrated" current source in the range of 100 pA to 1000 mA with a compliance voltage between zero and 1 VDC, a variable voltage power supply may not be the best choice. Please describe in some detail what you want to do; maybe someone here will have a suggestion on how to do it. Try to avoid the pitfall "if all you have is a hammer, everything looks like a nail." Sometimes you need another tool.
Rory,
What Kris and Hop are trying to do is get you to state things more accurately, so that we are all on the same page. It saves so much confusion and having to ask additional questions to figure out what is being spoken about.
Don't take it as a personal attack, as you did above. Rather, take it as good advice so that you have a chance to learn new things. This will help you keep up with today's state of play and make everything much easier for everyone who is asking and answering questions.
Dave