EasyGoing1
- Nov 27, 2014
Hello,
My skills in electronics are mediocre at best. I received my CET back in 1993 and haven't really used the knowledge since. My bread and butter comes from computer networking.
I am trying to build a fairly simple power supply for charging devices such as iPhones, iPads, etc. I am aware of the requirements on pins 2 and 3 of the USB cable, and how they relate to 'instructing' an Apple device how much current it may draw from the power supply. So I was basically considering using one of those DC-to-DC converter modules sold on Amazon for a few bucks to take a stable 12 volt source and bring it down to 5 volts, then make two division points using resistors to provide 2.8 and 2 volts respectively from the 5, so I could place 5 volts on pin 4, 2 volts on pin 3, 2.8 volts on pin 2, and ground on pin 1 (that pin order might actually be reversed, but you get the point, I hope).
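To make the plan concrete, here is the arithmetic I'm working from, written out as a quick script. The 2.8 V / 2.0 V targets are just the ones I mentioned above, not values I've verified against a real Apple charger:

```python
# The plan in numbers: a regulated 5 V rail feeding a three-resistor chain,
# with taps at 2.8 V and 2.0 V for the two data pins. Whatever resistor
# values I end up with, the drops across the chain have to come out like this:
V_SUPPLY = 5.0    # volts, from the 12 V -> 5 V module
V_TAP_HIGH = 2.8  # volts, one data pin
V_TAP_LOW = 2.0   # volts, the other data pin

v_top = V_SUPPLY - V_TAP_HIGH   # drop across the top resistor: 2.2 V
v_mid = V_TAP_HIGH - V_TAP_LOW  # drop across the middle resistor: 0.8 V
v_bot = V_TAP_LOW               # drop across the bottom resistor: 2.0 V

# The same current flows through all three (unloaded), so the resistors
# must be in the ratio of their drops: 2.2 : 0.8 : 2.0.
print(f"drops: {v_top:.1f} / {v_mid:.1f} / {v_bot:.1f} V")
```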
What dawned on me today is that I still have to choose the resistance values. I've been playing with the formula Vi = Vt(Ri/Rt), where Ri is the resistance between the tap and ground and Rt is the total resistance of the chain, but I haven't been able to make it work, so I'm obviously doing something wrong. As I was reading a breakdown of voltage divider circuits on allaboutcircuits.com, I began to think that the resistances I choose should be extremely small, because these devices need to draw up to 2.1 amps of current: if the total resistance is more than a couple of ohms, the current in the circuit (before connecting a device to it) will end up being well under 2.1 amps, and that won't do for my purposes.
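Here is the formula written out the way I understand it, along with the part that worries me: the unloaded current through the chain depends only on the total resistance, by Ohm's law. The particular ohm values are just ones I made up to exercise the formula:

```python
# The divider formula I've been playing with: Vi = Vt * (Ri / Rt),
# where Ri is the resistance between the tap and ground, Rt the whole chain.
def tap_voltage(v_total, r_below_tap, r_total):
    return v_total * (r_below_tap / r_total)

# Made-up values in the 2.2 : 0.8 : 2.0 ratio, scaled so Rt = 5 ohms.
r1, r2, r3 = 2.2, 0.8, 2.0
r_total = r1 + r2 + r3

print(f"high tap: {tap_voltage(5.0, r2 + r3, r_total):.2f} V")  # 2.80 V
print(f"low tap:  {tap_voltage(5.0, r3, r_total):.2f} V")       # 2.00 V

# What worries me: the unloaded chain current is plain Ohm's law, so small
# resistances burn current all the time and large ones barely any.
for scale in (1, 10, 1_000, 10_000):
    rt = r_total * scale
    print(f"Rt = {rt:>8.1f} ohms -> idle current = {5.0 / rt * 1000:9.3f} mA")
```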
So my question is: is my thought process correct? Or will the current in, let's say, a three-resistor series circuit change significantly when a device is attached across a single resistor, since the net resistance across that resistor is reduced by the parallel-resistance effect of the load? And if that is the case, wouldn't it be better to choose resistances in the thousands of ohms, so as not to burn a ton of energy while the power supply has power applied to the input but no device plugged in to charge?
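And here is the parallel-loading effect I'm asking about, in numbers. I have no idea what resistance the device actually presents on its data pin, so the 100 k load below is a made-up stand-in just to see the direction of the effect:

```python
# The loading effect I'm asking about: a load across the bottom resistor
# appears in parallel with it, lowering the tap voltage.
def parallel(r_a, r_b):
    return (r_a * r_b) / (r_a + r_b)

def loaded_tap(v_total, r_above, r_below, r_load):
    r_eff = parallel(r_below, r_load)   # net resistance at the tap
    return v_total * r_eff / (r_above + r_eff)

R_LOAD = 100_000.0  # ohms; made-up stand-in for the device's pin impedance

# The same 2.0 V tap built two ways: a stiff (low-resistance) divider
# versus a weak (high-resistance) one.
for r_above, r_below in ((3.0, 2.0), (30_000.0, 20_000.0)):
    v = loaded_tap(5.0, r_above, r_below, R_LOAD)
    print(f"divider {r_above:.0f}/{r_below:.0f} ohms -> tap = {v:.3f} V under load")
```

If I've got that right, the low-resistance divider barely moves under load while the high-resistance one sags noticeably, so the trade-off is idle power versus stiffness of the tap voltage. That's the part I'd like confirmed.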
Any feedback or comments on this would be greatly appreciated.
Sincerely,
Michael Sims