Johnny Looser
- Jan 1, 1970
I am trying to understand why it is more efficient to transfer
electrons at HV and low current, than the reverse. Intuitively it seems
very simple that power losses would be much less if there were fewer
electrons bumping into copper atoms on their way to houses everywhere.
However, when I try to do the math I get lost. The standard explanation I
get is that P = I^2*R, therefore power losses are proportional to the square
of the current. So by stepping down the current, savings are realized.
However, P also = E^2/R. It doesn't make sense (to me) that I^2*R is always
< E^2/R.
Can you show me how to prove this mathematically?
Here is an example I was trying to work out. Say at a house there's a slurpy
machine that operates at 120 VAC, 5 A, 600 W. Let's say the copper runs a couple
of miles and has a total resistance of 50 Ohms.
Case A) Just enough V to get by. In order to compensate for line losses, you
would need 120 VAC + 5 A * 50 Ohms = 370 VAC at the genny. This would mean line
losses of 250 VAC * 5 A = 1250 W.
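Here is how I checked the Case A arithmetic in Python (just Ohm's law; the
variable names are mine):

```python
# Case A: the generator feeds the 120 VAC / 5 A load directly through the line.
I = 5.0         # load current, amps
R_line = 50.0   # total line resistance, ohms
V_load = 120.0  # voltage needed at the slurpy machine

V_drop = I * R_line        # 250 V dropped along the copper
V_genny = V_load + V_drop  # 370 V required at the genny
P_loss = I**2 * R_line     # 1250 W, identical to V_drop * I

print(V_genny, P_loss)  # 370.0 1250.0
```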
This is a lot. However, my confusion comes when you start adding transformers
to the mix.
Case B) Use a 100:1 step-down Xfmr (12000 VAC on the line side, 120 VAC at the
house). At the secondary attached to the slurpy machine you would have 120 VAC,
5 A. At the primary attached to the power line you would have 120 VAC * 100 =
12000 VAC, and 5 A / 100 = 50 mA. In order to compensate for losses, you would
need 12000 + 0.05 A * 50 Ohms = 12002.5 VAC at the genny. Losses = 2.5 VAC *
0.05 A = 0.125 W. Or, calculated using P = I^2*R: (0.05 A)^2 * 50 Ohms = 0.125 W.
Or yet still, using P = E^2/R: (2.5 VAC)^2 / 50 Ohms = 0.125 W.
So all three formulas agree, but only when E is the 2.5 VAC dropped across the
line itself. If I instead plug the full 12002.5 VAC into E^2/R, I get roughly
2.9 MW. How do I make everything work out?
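Redoing the Case B arithmetic in Python, this time insisting that the E in
E^2/R is the I*R drop across the line resistance (not the 12000 VAC riding on
the line):

```python
# Case B: a 100:1 transformer puts 12000 VAC at 50 mA on the line.
I = 5.0 / 100          # 0.05 A line current
R_line = 50.0          # ohms of copper
E_drop = I * R_line    # 2.5 V dropped across the line (Ohm's law)

loss_ei  = E_drop * I          # P = E*I
loss_i2r = I**2 * R_line       # P = I^2 * R
loss_e2r = E_drop**2 / R_line  # P = E^2 / R

print(loss_ei, loss_i2r, loss_e2r)  # all three give 0.125 W
```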
I realize that I do not have a firm understanding of how a transformer works.
What does it mean to have the secondary producing 100 V at 0.5 A into a 33 Ohm load?
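For what it's worth, here is the ideal-transformer model I have been assuming
(turns-ratio relations only, no losses); plugging in my own 100 V / 33 Ohm
numbers suggests the 0.5 A figure can't be right, since the load alone fixes
the secondary current:

```python
# Ideal transformer: Vs = Vp * Ns/Np, Ip = Is * Ns/Np (power in = power out).
def ideal_transformer(Vp, Np, Ns, Z_load):
    Vs = Vp * Ns / Np   # secondary voltage from the turns ratio
    Is = Vs / Z_load    # the load, via Ohm's law, sets the secondary current
    Ip = Is * Ns / Np   # primary current from conservation of power
    Z_in = Vp / Ip      # impedance seen at the primary: (Np/Ns)**2 * Z_load
    return Vs, Is, Ip, Z_in

# My numbers: a secondary at 100 V driving a 33 Ohm load (1:1 for simplicity,
# since only the ratios matter here).
Vs, Is, Ip, Z_in = ideal_transformer(Vp=100.0, Np=1, Ns=1, Z_load=33.0)
print(Is)  # 100/33, about 3.03 A, not 0.5 A
```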
Can you help me? I don't want to be this stupid forever. Please help!
Thanks!
-Remove @_, when replying via email