Why does the efficiency of my low-voltage transformer decrease so drastically (to e.g. 10%) when I increase the frequency to around 40 kHz? Why does this increase in frequency affect the efficiency so much?
Core loss is a part of it, and core loss goes up with frequency. But at a fixed drive voltage the flux density goes down with increasing frequency, which helps to ameliorate core loss. Whatever the core losses are at a given frequency, they would be much worse if the flux density were maintained at its low-frequency value.
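A minimal Python sketch of that trade-off, using the common Steinmetz core-loss form. All numbers here (drive voltage, turns, core area, Steinmetz constants) are made-up illustrations, not data for any real core:

```python
import math

def flux_density(v_rms, freq, turns, core_area):
    """Peak flux density for sinusoidal drive: B = V / (4.44 * f * N * A)."""
    return v_rms / (4.44 * freq * turns * core_area)

def core_loss_density(freq, b_peak, k=1.0, alpha=1.5, beta=2.5):
    """Steinmetz core loss per unit volume, P = k * f^alpha * B^beta.
    k, alpha, beta are illustrative placeholders, not real core data."""
    return k * (freq ** alpha) * (b_peak ** beta)

# Assumed example winding: 12 V RMS drive, 500 turns, 5 cm^2 core.
b_50 = flux_density(12.0, 50.0, 500, 5e-4)    # flux density at 50 Hz
b_40k = flux_density(12.0, 40e3, 500, 5e-4)   # far lower at 40 kHz

# Loss at 40 kHz with the naturally reduced B, versus the (hypothetical)
# loss if B had somehow stayed at its 50 Hz value:
p_reduced_b = core_loss_density(40e3, b_40k)
p_fixed_b = core_loss_density(40e3, b_50)
```

With these illustrative exponents, `p_fixed_b` comes out orders of magnitude larger than `p_reduced_b`, which is the point: the falling flux density is what keeps the frequency-driven core loss in check.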
Another important loss is the driving of the winding capacitance through the resistance of the windings and the impedance of the leakage inductance. This can be a very important loss factor because the AC resistance of the copper (not impedance) goes up with frequency because of skin effect and may become excessive at 40 kHz. The capacitance was most likely ignored in a line-frequency transformer design but can be significant in a 40 kHz transformer. Furthermore, higher-voltage windings often have higher capacitances and smaller wire with higher resistance, exacerbating the loss effects.
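A rough sketch of the skin-effect part in Python. The skin-depth formula is standard; the AC/DC resistance ratio below is a crude annulus approximation (current assumed confined to one skin depth), not the exact Bessel-function solution, and the wire size is an assumed example:

```python
import math

def skin_depth(freq, rho=1.68e-8, mu_r=1.0):
    """Skin depth: delta = sqrt(rho / (pi * f * mu0 * mu_r)).
    Defaults are for copper at room temperature."""
    mu0 = 4e-7 * math.pi
    return math.sqrt(rho / (math.pi * freq * mu0 * mu_r))

def rac_over_rdc(wire_radius, freq):
    """Crude AC/DC resistance ratio: treat the current as flowing only in
    an outer annulus one skin depth deep (approximation, not exact)."""
    delta = skin_depth(freq)
    if delta >= wire_radius:
        return 1.0  # wire thinner than the skin depth: little change
    full_area = math.pi * wire_radius ** 2
    annulus_area = full_area - math.pi * (wire_radius - delta) ** 2
    return full_area / annulus_area

# Copper skin depth is about 9 mm at 50 Hz but only ~0.33 mm at 40 kHz,
# so wire that was "full copper" at line frequency is not at 40 kHz.
d_50 = skin_depth(50.0)
d_40k = skin_depth(40e3)
ratio = rac_over_rdc(0.5e-3, 40e3)  # assumed 1 mm diameter wire
```

For the assumed 1 mm wire the ratio is modest; for the thicker wire of a power winding, or at still higher frequency, the penalty grows quickly, which is why high-frequency designs use thin strands or litz wire.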
Leakage inductance is the inductance that does not link the primary to the secondary, so no transformer coupling occurs across it. It is just a series impedance in the way of transferring power across the transformer. It drives the winding capacitances as mentioned, but it is also in series with the load, reducing the transfer. Like all inductances, its impedance increases with frequency.
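The series-impedance effect can be sketched as a simple voltage divider between the leakage reactance and a resistive load. The leakage value and load resistance below are assumed examples, and the model ignores winding resistance and capacitance for clarity:

```python
import math

def leakage_reactance(l_leak, freq):
    """Series reactance of the leakage inductance: X = 2 * pi * f * L."""
    return 2 * math.pi * freq * l_leak

def fraction_delivered(l_leak, freq, r_load):
    """Fraction of the source voltage reaching a resistive load through
    the leakage reactance (magnitude of a simple series divider)."""
    x = leakage_reactance(l_leak, freq)
    return r_load / math.hypot(r_load, x)

# Assumed values: 1 mH leakage inductance, 10 ohm load.
at_50 = fraction_delivered(1e-3, 50.0, 10.0)    # nearly all the voltage
at_40k = fraction_delivered(1e-3, 40e3, 10.0)   # only a few percent
```

The same leakage that is negligible at 50 Hz (a fraction of an ohm of reactance) becomes roughly 250 ohms at 40 kHz, swamping a 10 ohm load; this is exactly the "series impedance in the way" described above.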