Hi, all
I want to measure a DC output precisely. Do I use a multimeter or an
oscilloscope?
I've found there are always some minor voltage differences between the
two instruments.
Which one is better for DC output voltage measurement?
Any thoughts are welcome.
BTW: papers or web links are also OK.
Thank you.
A meter is infinitely better. On a scope, you have to adjust the input
level and then count markings on the graticule. Unless you are measuring
a very minuscule voltage, the markings will represent a significant jump
in voltage, so you'll always be wondering whether the deflection is closer
to the halfway point between graticule lines or to the next one up or
down.
A multimeter is intended to measure voltage. You get a much finer scale
with lots of markings. And given that most multimeters these days are
digital, there's no parallax error as you try to decide where the needle
is resting.
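To put rough numbers on that, here's a quick sketch. The settings and digit
counts below are just illustrative assumptions, not figures for any
particular scope or meter:

# Back-of-the-envelope resolution comparison (illustrative numbers only)
volts_per_div = 1.0           # assumed scope vertical setting, 1 V/div
eyeball_uncertainty = 0.1     # assume you can read to about a tenth of a division
scope_resolution = volts_per_div * eyeball_uncertainty   # roughly 0.1 V

dmm_range = 20.0              # assumed 20 V range on a 3.5-digit meter
dmm_resolution = dmm_range / 2000                        # about 10 mV per count

print(scope_resolution, dmm_resolution)   # 0.1 V vs 0.01 V

Roughly an order of magnitude or two in favor of even a cheap digital meter,
before you even get to accuracy specs.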
The only place where an oscilloscope is better is when measuring complex
waveforms, because those will likely throw off the readings on a multimeter.
On a scope, you see the waveform and can take it all in. Of course,
nowadays lots of scopes have better frequency response than a multimeter,
so they are generally better than the average multimeter for measuring AC
above a fairly low frequency. But in both cases, you'll have to puzzle
over which marking on the graticule the waveform reaches.
For DC, neither of those exceptions comes into play. There is no reason
to use a scope to measure DC.
The reasons the same voltage reads differently on two pieces of equipment
can vary quite a bit. Sometimes it's parallax: you can't read exactly where
the needle is pointing. Sometimes the units are not properly calibrated, or
were calibrated against different voltage "standards". Sometimes a unit
doesn't have a high-impedance input, so it loads the circuit down and thus
changes the reading. And sometimes one piece of test equipment simply can't
be as accurate as another.
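If you want to see how much the loading alone can matter, here's a quick
voltage-divider sketch. The 100k source resistance and the 1 M / 10 M input
impedances are assumed round numbers, not specs of any particular instrument:

# Loading effect on a hypothetical high-impedance node
v_source = 10.0                  # open-circuit voltage at the node, in volts
r_source = 100e3                 # assumed 100 kohm source resistance
for r_input in (1e6, 10e6):      # assumed 1 Mohm vs 10 Mohm meter inputs
    v_read = v_source * r_input / (r_source + r_input)   # divider formula
    print(r_input, round(v_read, 2))
# 1 Mohm input reads about 9.09 V; 10 Mohm reads about 9.90 V.

Same node, two different answers, and neither instrument is "broken" --
the lower-impedance input is just pulling the node down more.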
Michael