Xtrchessreal
- Jan 1, 1970
You cannot have a current without something to push the electrons off
of their covalent bonds. That is done in one of two ways: by applying a
magnetomotive force or an electromotive force. I don't know of an
application offhand, but I suppose you can have both at the same time
as well, each applying a portion of the force to make the electrons
move.
On a 20 amp circuit, if you short the line to the neutral, the current
jumps well past 20 amps for a moment before the breaker opens - the
breaker trips precisely because its 20 amp rating is exceeded. At full
rated load the circuit delivers 120 VAC at 20 amps, which is 2400 watts.
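The arithmetic above is just Ohm's-law power math. A minimal sketch, using only the 120 V and 20 A figures from the post:

```python
# Power available from the branch circuit described above.
# P = V * I; values (120 V, 20 A) are the ones given in the post.
volts = 120.0        # nominal line-to-neutral voltage
breaker_amps = 20.0  # breaker trip rating

max_power = volts * breaker_amps  # watts the circuit can deliver
print(max_power)                  # 2400.0
```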
If you wanted to keep the breaker from opening, you could stick a
resistor across the line and neutral. The resistor would need to be
able to dissipate many watts without burning out. The idea is to lower
the current so that you don't open the breaker at its rated value of 20
amps.
Say you want to reduce the current to only 10 amps: you would need a 12
ohm resistor capable of handling 1200 watts of dissipation. Or you
could place 1200 one-watt resistors (14.4 k ohms each) in parallel,
which are equivalent to one 12 ohm resistor. On the neutral side of the
resistor you would measure 0 volts, and on the line side you would
measure 120 volts.
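The resistor sizing above can be checked with the same two formulas, R = V / I and P = V * I. A small sketch of that calculation, including the parallel bank of one-watt resistors:

```python
# Sizing a current-limiting resistor across 120 V to hold 10 A.
volts = 120.0
target_amps = 10.0

r = volts / target_amps  # required resistance: 12 ohms
p = volts * target_amps  # dissipation: 1200 watts (equivalently V**2 / r)

# Parallel bank of 1 W resistors: n identical resistors in parallel
# divide the resistance by n, so each one must be n * r ohms.
n = int(p / 1.0)                     # 1200 one-watt resistors
each_ohms = r * n                    # 14400 ohms (14.4 k) apiece
each_watts = volts ** 2 / each_ohms  # 1.0 W per resistor
```

Each 14.4 k resistor carries about 8.3 mA, so it dissipates exactly its one-watt rating; in practice you would derate and use more resistors.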
Of course, if the 120 VAC 20 amp circuit were not able to deliver 2400
watts, then the supply would be out of design spec. IOW, the supply
could not deliver the 20 amp current it was specified for.
When designing a new electronic device you need to know the total
power dissipation of the device, especially if you intend to build a
DC power supply for the delicate circuitry within. The DC power supply
needs to be designed for the total dissipation, and you also need to
know the maximum current that the power supply can deliver.
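One way to picture that design step is to total up the per-device currents on a rail and convert to watts. A hedged sketch - the rail voltage, device currents, and 50% headroom figure below are made-up example numbers, not from the post:

```python
# Sizing a DC supply from the loads it must feed (hypothetical numbers).
rail_volts = 12.0
device_amps = [0.150, 0.075, 0.300]  # assumed per-device draws on the rail

total_amps = sum(device_amps)          # total continuous current
total_watts = rail_volts * total_amps  # total dissipation the supply feeds
rating_amps = total_amps * 1.5         # 50% headroom (a design choice)

print(total_amps, total_watts)
```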
I am just writing out my thoughts as I try to understand these things.
I am not going in any particular direction except for one.
The resistor is a device that limits current in a circuit; that is the
way I understand it. When its function is to create a voltage inside a
circuit, I get confused. I get confused because the terminology is
twisted and goes against normal thinking. Specifically, current is not
possible without voltage or induction, so how can a voltage be created
by a resistor that is designed to limit current?
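The usual resolution is that the resistor does not create voltage; it develops a voltage *drop* (V = I * R) when the source pushes current through it. A two-resistor divider across a hypothetical 12 V source shows both roles at once - limiting the current and "creating" a voltage at the midpoint:

```python
# Voltage divider: the same resistors that limit the current also set
# the midpoint voltage via their I*R drops. 12 V source is hypothetical.
source = 12.0
r1, r2 = 6.0, 6.0  # ohms, in series across the source

i = source / (r1 + r2)  # 1.0 A flows through both resistors
v_r2 = i * r2           # 6.0 V appears across r2 - the "created" voltage
```

So the divider output is not a new source; it is the drop the existing source produces across r2, which is why it sags as soon as a load draws extra current through r1.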
I know this is done in many ways and is done all the time. Maybe it
would help to know what the power dissipation is for some of the
devices in an amplifier circuit using an op amp, both in its linear
operation and at its maximum dissipation. Then I could think of it the
way I think of the 120 VAC in my house on a 20 amp breaker: there is a
limit to the current and the voltage, and it is easy to figure the
circuitry.
Somehow the books miss explaining things that should be easy to
understand. When an op amp is shown with its supply voltage inputs,
they should also state what the maximum available current is. That way
you can use either current or voltage to help you solve a circuit
problem.