A PWM controller is like a very fast switch: it applies either full power to the load or none at all, switching very fast and with a controlled ratio between the time power is applied and the time it is not.

Ignoring resistance in the controller itself, the current through the load (and for simplicity I'm assuming the load is purely resistive) will either be a value determined by the supply voltage divided by the load resistance, or zero.

If you placed an analog voltmeter across the load, its reading would move between zero and your supply voltage as the ratio between the on and off times (the duty cycle) changes. The reading varies not because the voltage itself is changing in a linear manner, but because the meter cannot swing from zero to the supply voltage and back to zero again as fast as the PWM controller switches. The voltage read on the voltmeter will be some sort of average determined by the physical characteristics of the meter. Incidentally, if the switching frequency just happens to be resonant with something in the meter, the meter can destroy itself.

An analog ammeter placed in series with the load will behave similarly.

Often the readings given by analog meters will be close to the arithmetic mean of the voltage or current. Let's assume that your load is operating from 19V, at which voltage it draws 4.5A and is being operated at a 50% duty cycle (it is on half of the time). The voltmeter will probably read close to 9.5V and the ammeter close to 2.25A. So, what is the power?

Normally, to determine the power, you multiply the voltage by the current. So, is 9.5 * 2.25 correct? Is the device dissipating 21.375W?

Thinking about it, when the load is on, 19V is across the load and 4.5A is flowing through it, so the power being dissipated is 85.5W. When the load is off, it dissipates 0W. At a 50% duty cycle it is thus dissipating 85.5W half of the time and 0W the other half of the time. Because power is defined as joules per second, it is fairly simple to show that the average dissipation is the peak power x the duty cycle, or 85.5W x 50%. Thus the calculated power is 42.75W.
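As a sanity check, the arithmetic above can be sketched in a few lines of Python (the numbers are just the example values from this answer):

```python
supply_v = 19.0    # volts across the load while the PWM switch is on
on_current = 4.5   # amps through the load while the switch is on
duty = 0.5         # 50% duty cycle

peak_power = supply_v * on_current   # 85.5 W while on, 0 W while off
true_power = peak_power * duty       # 42.75 W average dissipation

# What the two averaging meters would report, and their naive product:
avg_v = supply_v * duty              # ~9.5 V on the voltmeter
avg_i = on_current * duty            # ~2.25 A on the ammeter
naive_power = avg_v * avg_i          # 21.375 W -- half the true value
```

Multiplying the two averages applies the duty-cycle factor twice, which is why the naive answer comes out low by exactly that factor.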

So, measuring the average voltage and current and multiplying them gives an erroneous value!

But wait: if you take either the measured average voltage or the measured average current and multiply it by the opposite peak (the average voltage 9.5V x the peak current 4.5A, or the peak voltage 19V x the average current 2.25A), you get the correct figure. That allows you to mark a voltmeter or an ammeter as a power meter!
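To see why this works: each averaging meter reads its peak value x the duty cycle, so multiplying one average by the opposite peak reinserts exactly one factor of the duty cycle. A quick check with the example numbers:

```python
peak_v, peak_i, duty = 19.0, 4.5, 0.5
avg_v = peak_v * duty   # 9.5 V, as read on the voltmeter
avg_i = peak_i * duty   # 2.25 A, as read on the ammeter

# Both combinations give the true average power (peak power x duty):
p_from_voltmeter = avg_v * peak_i   # 42.75 W
p_from_ammeter = peak_v * avg_i     # 42.75 W
```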

Is the use of a meter like this to measure power useful? Actually, it's not. Firstly, if the supply voltage changes, any markings on the meter (power or % of power) will no longer be correct. Secondly, if you have a single-turn rotary potentiometer controlling the duty cycle, markings on that will be just as good.

If you use a digital meter for either voltage or current, things may be more complex. You may get an average reading, you may get an RMS reading (which, for a square wave like this, is not the same as the average), you may get a reading that jumps all over the place, or you might get something else. Unless you know, or can reliably test, how the meter responds, you won't know which. The best case is that you are no worse off than with analog meters.

If you are convinced that you want meters, then measuring the supply voltage (before the PWM controller) and the current through the load (and assuming it to be the average current) is your best bet.

The only reason I would consider a meter (and it would be an ammeter) is if I were designing my hot wire cutter to use different types and lengths of wire and I wanted to monitor the average current (because failure due to overheating is related to current).