Hi,
I (think) I understand the concept of a voltage drop across a resistor: it is the voltage needed to drive the current flowing in the circuit through that resistor?
But when I was powering a motor, I set the power supply to produce 220 volts. Yet when I increased the torque load on the motor, the supply voltage decreased. I thought the current should have changed, not the voltage, since a power supply is meant to deliver a set voltage. Why does this happen?
Many thanks.