Larry Brasfield said:

Also agree with 1 and 2. Item 1 is very real, makes a tangible and
significant impact, and should not be ignored. The influence of item 2,
however, is usually quite small for a normal transformer operated within a
normal temperature range. Since most loss and thermal-rise calculations are
somewhat approximate anyway, item 2 can often be ignored.

3. Hysteresis and eddy-current losses in the magnetic core will increase.
Increased output current (which implies increased input current) will
increase the magnetic flux density. As the flux density increases, the
losses in the core will increase, up to magnetic saturation, where you can
effectively get no more current (the maximum energy through a
transformer is limited by the magnetics as well as by the winding limits).

It is not true that magnetizing losses increase with output current. They
actually go down a little. This is because the increased IR drop in the
primary reduces the amount of flux change necessary in the core to provide
enough induced voltage to equal the applied primary voltage adjusted by the
IR drop.

Agreed.

For magnetization, you can think of the primary as a more or less pure
inductor [1] in series with the primary resistance and in parallel with
some real impedance representing the eddy-current, hysteresis, (and
radiated) losses. As the voltage across that inductor drops, so do its
losses.

[1. The inductor is usually non-linear, but its inductance is a monotonic
function of current.]
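A quick numeric sketch of that equivalent-circuit argument. All component values below are assumed for illustration, not taken from any particular transformer:

```python
# Magnetizing-branch model: series primary resistance, with core losses
# scaling roughly as V^2 across the magnetizing branch.
V_applied = 120.0    # RMS primary voltage, volts (assumed)
R_primary = 1.5      # primary winding resistance, ohms (assumed)
I_no_load = 0.05     # magnetizing current at no load, amps (assumed)
I_full_load = 2.0    # primary current at full load, amps (assumed)

# Voltage remaining across the magnetizing inductance after the IR drop:
V_core_no_load = V_applied - I_no_load * R_primary
V_core_full_load = V_applied - I_full_load * R_primary

# Core (hysteresis + eddy) losses go roughly as V^2 across that branch,
# so the ratio shows the small reduction under load:
ratio = (V_core_full_load / V_core_no_load) ** 2
print(f"core-loss ratio, full load vs no load: {ratio:.3f}")
```

With these numbers the core loss at full load is a few percent below its no-load value, which is the "go down a little" effect described above.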

I suppose that is one way to think about it. I think of it a little
differently. Ampere's Law would have you believe that the magnetic flux
density B is proportional to the number of turns times the current flowing
through those turns in any given magnetic device. Since a transformer is a
magnetic device, it would seem logical that as the output current increases,
the magnetic flux density in the core also increases. This would suggest
that at some current level the transformer's core would saturate.

This is not the case, however, for a regular transformer (i.e., not a
flyback transformer; those are different). What one must realize is that a
transformer has two or more independent windings on a single core. Ampere's
Law applies to the primary winding, and it also applies to the secondary.

As the load current on the secondary increases, you would tend to get more
flux generated in the core. However, as the primary current increases to
supply that secondary current, the primary winding also generates its own
flux. If you studiously apply the right-hand rule, you will find that the
flux contributions are in opposite directions for primary and secondary,
so they largely cancel each other out. As a consequence, the flux density
in the core is essentially independent of the load current of the
transformer.
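The ampere-turn bookkeeping above can be sketched numerically. The turns counts and magnetizing current below are assumed values for illustration:

```python
# Flux-cancellation sketch: net MMF (ampere-turns) driving the core flux
# stays constant as the load current varies.
N_p, N_s = 200, 20    # primary and secondary turns, a 10:1 ratio (assumed)
I_mag = 0.05          # magnetizing current, amps (assumed)

for I_load in [0.0, 1.0, 5.0, 10.0]:   # secondary load current, amps
    # Primary current = magnetizing current + load current referred
    # through the turns ratio:
    I_p = I_mag + I_load * N_s / N_p
    # Net ampere-turns: primary and secondary contributions oppose,
    # so the load terms cancel:
    net_mmf = N_p * I_p - N_s * I_load
    print(f"I_load = {I_load:5.1f} A  ->  net MMF = {net_mmf:.2f} A-turns")
```

Whatever the load, the net MMF stays at N_p * I_mag (10 ampere-turns here), which is why the core flux, and hence saturation, does not track the load current in a regular transformer.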

In effect, the maximum power output rating of a transformer is limited by
the winding resistances, or by the total thermal dissipation allowed for a
given temperature rise. Practical transformers are thermal-dissipation
limited long before they are winding-resistance limited. In theory, a
transformer made with superconducting windings could be made very small and
output an outrageously huge amount of power (efficiently, at that).

Flyback transformers are different, and have properties more like those of
inductors than regular transformers do. In a flyback transformer, current
does not normally flow simultaneously through the primary and secondary
windings; at any given time only one of them conducts. As a consequence,
you don't get the flux-canceling effect mentioned above for regular
transformers. If you keep increasing the load on a flyback transformer, it
will eventually saturate the core.
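For contrast, a minimal sketch of the flyback case, where only one winding's ampere-turns drive the core at any instant (turns count and currents are assumed values):

```python
# Flyback sketch: with only one winding conducting at a time, the core MMF
# is simply that winding's ampere-turns, so it grows with load.
N_p = 100                        # primary turns (assumed)
for I_peak in [0.5, 1.0, 2.0]:   # peak primary current per cycle, amps
    mmf = N_p * I_peak           # ampere-turns driving the core flux
    print(f"I_peak = {I_peak:.1f} A  ->  MMF = {mmf:.0f} A-turns")
```

Unlike the regular-transformer case, there is no opposing winding to cancel this MMF, so heavier loads push the core toward saturation.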

As for the OP's original question, whether the efficiency improves with
increasing or decreasing load current: obviously at zero output current the
efficiency is zero, since any transformer wastes some idle power, primarily
due to core hysteresis and eddy-current loss. As you apply a heavier and
heavier load, the efficiency continues to improve, up to a point. At some
point the I^2*R loss will start to dominate, and the efficiency will start
to decrease again. In practice, where this efficiency peak occurs depends
on the design of the transformer. Typical transformers are often designed
to have maximum efficiency somewhat near (though often a little below)
their maximum rated continuous output current. The efficiency peak is
relatively broad.
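A rough sketch of that efficiency curve, modeling a fixed core loss plus I^2*R copper loss (all numbers assumed). For this simple model the peak falls where copper loss equals core loss, i.e. at I = sqrt(P_core / R_total):

```python
import math

# Assumed transformer parameters, for illustration only:
V_out = 12.0     # output voltage, volts
P_core = 2.0     # fixed core (hysteresis + eddy) loss, watts
R_total = 0.5    # total winding resistance referred to secondary, ohms

def efficiency(I_load):
    """Efficiency = P_out / (P_out + fixed core loss + I^2*R copper loss)."""
    p_out = V_out * I_load
    p_loss = P_core + I_load ** 2 * R_total
    return p_out / (p_out + p_loss)

# Peak efficiency occurs where copper loss equals core loss:
I_peak = math.sqrt(P_core / R_total)   # 2 A with these numbers
for i in [0.5, 1.0, I_peak, 4.0, 8.0]:
    print(f"I = {i:.1f} A  ->  efficiency = {efficiency(i):.3f}")
```

Printing the curve shows efficiency rising from low load, peaking at I_peak, and falling off slowly at heavy load, consistent with the broad peak described above.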