Maker Pro

I don't understand the importance of high voltage / low current in electricity distribution

My problem is this: you are distributing 100 watts. You start out with 10 volts, 10 amps. You increase the voltage by a factor of 2 through a transformer. Now you must have 20 volts, 5 amps, right? Using Ohm's Law, 10 volts = 10 amps * 1 ohm, so in the first case there is 1 ohm. But in the next case you have 20 volts = 5 amps * 4 ohms. So why did the ohms change? When you use a transformer to increase voltage, doesn't that mean the ohms must increase if the amps decrease? I am thoroughly confused. I know I am thinking about it the wrong way; can someone explain it to me? If you want me to clarify, let me know. Basically my question is this: how do you create more voltage without creating more current?
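(As a quick check, here is a small Python sketch of the arithmetic in the question. The power does balance on both sides, and the "extra" ohms come from the transformer itself: an ideal transformer reflects the resistance it sees by the square of the turns ratio, so nothing in the load has to change. The numbers are the ones from the question.)

V1, I1 = 10.0, 10.0          # primary side: 10 V, 10 A
n = 2.0                      # turns ratio (step-up factor)

V2, I2 = V1 * n, I1 / n      # secondary side: 20 V, 5 A
print(V1 * I1, V2 * I2)      # 100.0 100.0 -> same power on both sides

R1 = V1 / I1                 # resistance seen on the primary side: 1 ohm
R2 = V2 / I2                 # resistance seen on the secondary side: 4 ohms
print(R2 / R1)               # 4.0 -> exactly n**2, the impedance transformation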
 
Phil Allison


** You are confusing yourself by *failing to separate* the load ohms from the ohms of the current-carrying cable going to the load.

High voltage transmission is ALL about reducing the percentage of power lost in the cables that deliver electrical energy across a country.

http://en.wikipedia.org/wiki/Electric_power_transmission#Losses
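To put rough numbers on that point, here is a small Python sketch. The 0.05 ohm cable resistance is assumed purely for illustration; the load power is the 100 W from the question.

P_load = 100.0        # watts delivered to the load
R_cable = 0.05        # ohms of resistance in the cable run (assumed)

for V in (10.0, 20.0, 100.0):
    I = P_load / V                 # current needed to deliver P_load at voltage V
    P_lost = I**2 * R_cable        # power burned in the cable itself
    print(f"{V:5.0f} V: {I:5.1f} A, cable loss {P_lost:6.2f} W "
          f"({100 * P_lost / (P_load + P_lost):.1f} % of generated power)")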



....... Phil
 
Jamie
I think you have the actual question a little mixed up. To put it simply, one of the biggest factors is that less conductor material is needed to carry the energy to the end point.

HV circuits like 5 kV, 12 kV, etc. use very small conductors to transfer the energy to the customer, and the current is reduced in the same ratio that the voltage is raised. At the end point it gets converted back down, so the current to the customer increases as the voltage drops. At that point the conductors have to be made larger, because a given resistance in the conductors causes proportionally more loss at low voltage than at high voltage.

It's really all about the amount of conductor material in the end.

If you were to attempt to transfer high levels of power at low voltage over great distances, say to supply a great city's electrical needs, the amount of copper required would be unthinkable.
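A back-of-the-envelope Python sketch of that copper argument follows. The line length, power level, and allowed loss figure are all assumed for illustration; the point is that the required conductor cross-section falls with the square of the voltage.

rho_cu = 1.7e-8          # resistivity of copper, ohm*m
length = 10_000.0        # conductor length, m (assumed)
P = 1e6                  # power delivered, W (assumed)
loss_fraction = 0.05     # allow 5 % of the power to be lost in the line (assumed)

for V in (240.0, 2_400.0, 24_000.0):
    I = P / V                                # line current at this voltage
    R_max = loss_fraction * P / I**2         # largest line resistance allowed
    area_mm2 = rho_cu * length / R_max * 1e6 # cross-section needed to stay under R_max
    print(f"{V:8.0f} V: roughly {area_mm2:9.1f} mm^2 of copper needed")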
 
Varactor

The key is to understand that losses go with i^2, so reducing i (by raising V in proportion, since P = iV) is a good thing.
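In Python, that square law in one line (the 1 ohm cable resistance is arbitrary; only the ratio matters):

R = 1.0                                  # any fixed cable resistance, ohms
print((10.0**2 * R) / (5.0**2 * R))      # 4.0 -> 10 A loses four times what 5 A loses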

Cheers
 
OK, you guys have helped me, but what I still don't get is: why does the equation V^2/R produce a different power than I^2*R, and how do you reduce I and increase V in the equation P=IV without changing the ohms of resistance? I think that the *load*, for instance the appliances in the house, affects the current, right? So would that be the ohms changing? So in P=VI, is P the losses of the load you are powering, and in P=I^2*R is R the resistance of the wire? How do you know whether P is the losses or not? I think I'm starting to understand.
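(Side note: the two formulas agree as long as V, I, and R all refer to the same element. A small Python sketch, with an assumed 100 V source, 1 ohm wire, and 9 ohm load:)

V_source = 100.0
R_wire, R_load = 1.0, 9.0

I = V_source / (R_wire + R_load)   # 10 A flows through everything in series
V_wire = I * R_wire                # 10 V is dropped across the wire
V_load = I * R_load                # 90 V appears across the load

print(I**2 * R_wire, V_wire**2 / R_wire)   # 100.0 100.0 -> wire loss, both forms agree
print(I**2 * R_load, V_load**2 / R_load)   # 900.0 900.0 -> load power, both forms agree
print(V_source**2 / R_wire)                # 10000.0 -> mixing source V with wire R gives nonsense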
 
Michael Black
Without fully following the thread: your original use of Ohm's law wasn't about the voltage drop; it was your expectation of a certain resistance because you had a voltage and a current.

You are actually dealing with the voltage drop along the cable, or the
resistance of that cable which will cause a voltage drop.

The higher the current through a circuit, the more effect that resistance has. Lower the current, and the resistance of the cable becomes less of a factor.

But of course, lower the current and you can't supply as much power to the load. That is why they raise the voltage to compensate for the lower current; the power passed along the cable stays the same when both are changed by the same factor.

Try a different angle. In the days of tubes, the voltages were all high
voltage, while the current levels were really quite low. Except for
circuits where really high power was used, like transmitters, you rarely
saw large diameter wire in the wiring, since it didn't need to pass much
current, and the resistance of that narrow diameter wire was not a factor.
For a lot of equipment, the power supply would offer up 350v, if that
much, but the current drain would never be more than a few hundred
milliamps.

Then along came solid state devices. They all ran at quite low voltage, but the current drain was pretty high. So 12 volts or even 5 volts, but it was often common to see an amp or so needed. Suddenly, you had to be careful of the wiring, because the resistance of the narrower gauge wire would become a factor. Bad connectors too: if they didn't make good contact, their resistance would be more significant. The resistance of the #20 wire or whatever was used did not change from when it was used in tube circuits, but if it had a resistance of 1 ohm in the tube equipment, that same 1 ohm in solid state equipment might start being a problem because of the current that had to pass through it.
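Rough numbers for that comparison, as a small Python sketch; the supply voltages and currents are assumed to match the description above (a few hundred milliamps for tube gear, a couple of amps at 5 V for solid state), with the same 1 ohm of thin hook-up wire in both cases.

R_wiring = 1.0    # ohms of thin hook-up wire, same wire in both cases (assumed)

for name, V_supply, I in (("tube gear", 350.0, 0.2), ("solid state", 5.0, 2.0)):
    drop = I * R_wiring
    print(f"{name:12s}: {drop:4.1f} V lost in the wiring "
          f"({100 * drop / V_supply:5.1f} % of the {V_supply:.0f} V supply)")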

Michael
 
Thank you so much! I haven't actually read all the posts yet, but
understanding that it is the voltage DROP clears things up for me. You
guys are so helpful! I'm pretty new to electronics/electricity, but
I'm starting to understand it much better. I'll probably have more
questions in the future.
 
Also, my problem was that I KNEW the equations work, and I KNEW you couldn't change the voltage and current without the resistance changing somewhere; I had worked that out with the equations, but now I get it!
 
Rich Grise

Ohms don't enter into it for this calculation, except for the resistance
of the transmission wires themselves.

Say you've got 1000 feet of AWG 10, resistance 0.9989 ohms per 1000 ft, which is close enough to 1 ohm for this discussion (source: http://www.thelearningpit.com/elec/tools/tables/Wire_table.htm).

So, with a 10 volt, 10 A supply, the one ohm of wire will drop the whole
10 volts, leaving nothing for the load (actually, you'd get a voltage
divider consisting of the wire resistance and the actual resistance of
the load, but let's set that aside for now.)

With a 20 volt, 5A supply, the line drop is only 5 volts, leaving 15 volts
for the load. With a 100V, 1A supply, the line drop is 1V, leaving 99V
for the load, and so on.

The load itself has a resistance, which is, indeed, R = E/I; if you apply Ohm's law to the supply, it's referred to as "impedance", but that's a whole other topic as well.

The point is, the higher voltage/lower current supply incurs less I*R (E = IR) loss in the lines themselves.
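A short Python check of those figures, including the voltage-divider version set aside above. The 3 ohm and 99 ohm load values are assumed so that each case draws roughly the quoted current through the 1 ohm line.

R_line = 1.0    # about 1000 ft of AWG 10, close enough to 1 ohm

for V_supply, R_load in ((20.0, 3.0), (100.0, 99.0)):
    I = V_supply / (R_line + R_load)           # series circuit: line plus load
    print(f"{V_supply:5.0f} V supply: {I:5.2f} A, line drop {I * R_line:5.2f} V, "
          f"load gets {I * R_load:6.2f} V")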

Hope This Helps!
Rich
 