Well, what about a constant voltage source that is independent of the load's resistance?
I understand it's basic Ohm's law, but some circuit designs confuse me as to how more voltage can be output into a circuit with lower current. Using a converter, for example.
But then again... is a converter even relevant to Ohm's law?
V = IR.
Typically, one of these values is predetermined, like the resistance of a circuit, and cannot be changed.
That leaves I and V to manipulate... but not independently, or the formula above would be violated.
If you increase the voltage, the current will increase as well. The only way around that is to somehow increase the circuit's resistance at the same time.
Same thing with manipulating current... if you push an extra 200 mA through a fixed resistance, the voltage will go up. Once more, the only way around this is to change the circuit's resistance.
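The relationship above can be sketched in a few lines of Python. The resistance and voltage values here are made up purely for illustration:

```python
# Ohm's law: with R fixed, V and I cannot be set independently.

R = 6.0  # ohms, fixed by the circuit (illustrative value)

def current(v, r):
    """I = V / R, rearranged from V = I * R."""
    return v / r

print(current(12.0, R))       # 2.0 A
print(current(24.0, R))       # doubling V doubles I: 4.0 A
print(current(24.0, 2 * R))   # raising R with V keeps I at 2.0 A
```

Note that the only way to raise V without raising I is to raise R along with it, as the last line shows.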
Here's the catch with a converter... where do you connect it to the circuit?
If you give a converter 12V, it spits out (say) 6V. The circuit now pulls its current from the 6V side of the converter, which has an inverse effect on the 12V side: pulling 1A from the 6V side draws only 0.5A into the 12V side, because power in equals power out (ignoring converter losses).
This requires two formulas (or one, if you merge them together):
V=IR on the circuit side. (Voltage = Current x Resistance)
P=IV to determine the current draw on the 12V side of the converter. (Power = Current x Voltage)
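Merging the two formulas gives the converter's input current directly. A minimal sketch, assuming an ideal (100% efficient) converter; real converters lose a few percent to heat, so the actual input current would be slightly higher:

```python
# Ideal converter: power is conserved across the two sides,
# so P = I_in * V_in = I_out * V_out (assumes 100% efficiency).

V_IN, V_OUT = 12.0, 6.0  # volts, matching the example above

def input_current(i_out, v_in=V_IN, v_out=V_OUT):
    """Rearranged: I_in = (V_out * I_out) / V_in."""
    return (v_out * i_out) / v_in

print(input_current(1.0))  # pulling 1 A at 6 V draws 0.5 A at 12 V
```

This is exactly the 1A-becomes-0.5A behavior described above: halve the voltage and an ideal converter halves the current drawn on the input side.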