Anyway, here's my question. One thing I read was that the output
voltage of the supply is fed back to the PWM which changes its duty
cycle accordingly to keep the output voltage constant. But I thought
that the input-to-output ratio of a transformer is fixed. If the PWM
is outputting 100V at 20kHz to a 10:1 transformer, you get out 10V at
20kHz. What does it matter what the duty cycle is? It's still 100V at
20kHz. What am I missing?
Yes, the _instantaneous_ output voltage of the transformer is still 10V.
The output of the transformer is fed to some form of filtering -- usually a
series inductor and parallel capacitor -- and the voltage found across the
capacitor is the _average_ of the filter's input voltage. So... 10V
instantaneous output from the transformer at 100% duty cycle gets you 10V
across the capacitor... 50% duty cycle would get you 10V instantaneous output
from the transformer for half the time and 0V for the other half of the time,
so this averages to 5V across the capacitor... etc...
(This is somewhat simplified; in actuality the voltage changes a little due to
diode drops, active device losses, etc. -- the feedback loop jiggles the duty
cycle until the right regulated output voltage appears, though.)
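If it helps, here's that arithmetic in code form -- a little Python sketch of
the ideal case, with made-up numbers, ignoring the diode drops and losses
just mentioned:

    V_IN = 100.0        # volts into the primary
    TURNS_RATIO = 10.0  # 10:1 step-down transformer

    def avg_output(duty_cycle: float) -> float:
        """Average voltage across the filter capacitor, ideal components."""
        v_sec = V_IN / TURNS_RATIO   # 10V instantaneous on the secondary
        return v_sec * duty_cycle    # the LC filter does the averaging

    print(avg_output(1.0))  # 10.0 -- 100% duty cycle
    print(avg_output(0.5))  # 5.0  -- 50% duty cycle, as above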
Second, why does a switching power supply break without a load?
Because the designers are either (1) ignorant or (2) cheap.
OK, actually, not all switchers have no-load problems. What happens -- and
this is for the most basic example you could come up with, something like a
simple buck converter -- is that during the time that the transformer (or
inductor) is being fed current, flux (current) builds up in it. When the
switch controlling this current is turned off, the current (flux) in the core
starts dropping. The rate at which it drops is proportional to the load...
bigger load (lower impedance), faster current drop. With a "big enough" load,
the current goes all the way to zero. On the next switching cycle, the
current ramps up again to some peak, then ramps down to zero, etc. -- this is
what's called 'discontinuous' mode operation.
Now, with a small load, while the current does drop, it doesn't go all the way
to zero before the next switching cycle. The current ramps up again and -- if
one isn't careful in design -- the current ends up higher than it was at the
turn-off point of the last cycle. It drops a little again (but not to zero),
and now at the next turn on the current is driven even higher. Sooner or
later, the inductor saturates, which tends to look almost (but not quite) like
a short circuit to the driver. That driver starts having massive current run
through it, heats up, and sooner or later dies.
Hence, there's some minimum load that causes the switcher to change from
discontinuous to a continuous mode of operation, and for loads lighter than
this you can get into trouble.
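If you want to watch the walk-up happen, here's a toy model in Python -- just
the per-cycle bookkeeping described above, not a circuit simulation, and all
the numbers are made up:

    I_SAT = 10.0          # hypothetical core saturation current, amps
    RISE_PER_CYCLE = 1.0  # amps gained during each on-time

    def final_current(cycles: int, fall_per_cycle: float) -> float:
        """Inductor current after N switching cycles.

        fall_per_cycle models the off-time decay into the load: heavier
        load (lower impedance) means faster decay.  The diode keeps the
        current from going below zero.
        """
        i = 0.0
        for n in range(cycles):
            i += RISE_PER_CYCLE                   # switch on: flux builds
            if i >= I_SAT:
                print(f"saturated on cycle {n}")  # driver now sees ~a short
                break
            i = max(0.0, i - fall_per_cycle)      # switch off: flux decays
        return i

    print(final_current(100, 1.5))  # heavy load: current hits zero every
                                    # cycle -- discontinuous mode, no trouble
    print(final_current(100, 0.4))  # light load: current walks up each
                                    # cycle and saturates the core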
If you think about it for a moment, it's clear you could simply detect the
current in the core and quit driving it when you start to approach
saturation -- this simple solution is what "current mode" switchers do, and
they usually don't have no-load problems. In fact, if you think about it even
more, even in the case with regular "voltage mode" feedback, since the current
in the core is increasing, the output voltage will as well, so the feedback
regulator should keep clamping down on the duty cycle so as to avoid
saturating the core. The problem here is that the 'transfer function' of the
power supply from input to output is different when it's operating in this
'continuous' mode rather than 'discontinuous' mode, and it takes more effort
to build a feedback network that can keep the entire 'loop' stable in both
modes (and unless you get somewhat sophisticated with your feedback network,
making a power supply no-load stable often degrades the step response, which
isn't desirable). Hence, whether for the sake of cost (the extra feedback
network circuitry) or merely out of ignorance on the part of the designer,
some power supplies turn over and die when run without a minimum load.
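Continuing the toy model from above, the 'current mode' fix looks something
like this -- again just a sketch with made-up numbers, where the on-time is
terminated once the current approaches a limit:

    I_LIMIT = 8.0         # hypothetical peak-current limit, below saturation
    RISE_PER_CYCLE = 1.0  # amps gained during a full, untruncated on-time

    def final_current_cm(cycles: int, fall_per_cycle: float) -> float:
        """Same bookkeeping as before, but the on-time ends at I_LIMIT."""
        i = 0.0
        for _ in range(cycles):
            i = min(i + RISE_PER_CYCLE, I_LIMIT)  # on: ramp, cut at limit
            i = max(0.0, i - fall_per_cycle)      # off: decay into the load
        return i

    print(final_current_cm(100, 0.4))  # light load: plateaus around 7.6A
                                       # and never saturates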
Third, in all my years in electronics, I have never used a choke, now I
see them all over these power supplies. Can someone clue me in about
what they do, and why they are in these things?
Ummm... you know what capacitors do, right? You 'feed' them a current and
this causes charge to accumulate within them such that a voltage appears
across them? Inductors are their 'dual' -- you 'feed' them a voltage and this
causes flux to accumulate within them such that a current flows through
them. Hence, just like a capacitor, inductors can be used to store energy.
With an inductor, by varying the duty cycle of a voltage across it, you can
'charge up' the inductor to some arbitrary average current and then 'dump'
this into a load to get a corresponding voltage. This makes it easy to make
regulated _voltage_ power supplies, whereas a similar approach with capacitors
would get you a regulated _current_ power supply.
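In code, the duality looks like this (made-up component values):

    C = 100e-6  # farads
    L = 100e-6  # henries

    def cap_voltage(charge: float) -> float:
        return charge / C        # integrated current (charge): V = Q/C

    def ind_current(flux: float) -> float:
        return flux / L          # integrated voltage (flux): I = phi/L

    def cap_energy(v: float) -> float:
        return 0.5 * C * v**2    # E = C*V^2/2

    def ind_energy(i: float) -> float:
        return 0.5 * L * i**2    # E = L*I^2/2

    print(cap_energy(cap_voltage(1e-3)))  # energy after 1mC of charge
    print(ind_energy(ind_current(1e-3)))  # energy after 1mWb of volt-seconds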
I'd suggest checking out Abraham Pressman's switching power supply book. It's
not cheap, but it's written by a guy who seemed far more intent on building
working power supplies than doing theoretical research. It does of course
have some math in it, but even someone with one semester of calculus will
probably be able to follow it.