Maker Pro

Multimeter or Oscilloscope

jason
Hi, all
I want to measure a DC output precisely. Do I use a multimeter or an
oscilloscope?
I have found there are always some minor voltage differences between the
two instruments.
Which one is better for DC output voltage measurement?
Any thoughts are welcome.
BTW: papers or web links are also OK.
Thank you.
 
Michael Black
jason said:
Hi, all
I want to measure a DC output precisely. Do I use a multimeter or an
oscilloscope?
I have found there are always some minor voltage differences between the
two instruments.
Which one is better for DC output voltage measurement?
Any thoughts are welcome.
BTW: papers or web links are also OK.
Thank you.
A meter is infinitely better. On a scope, you have to adjust the input
level and then count markings on the graticule. Unless you are measuring
a very small voltage, the markings will represent a significant jump
in voltage, so you'll always be wondering whether the deflection is closer
to the halfway point between graticule lines or to the next one up or
down.

A multimeter is intended to measure voltage. You get a much nicer scale
with lots of markings. And given that most multimeters these days are
digital, there's no parallax error as you try to decide where a needle is
resting.
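
To put a rough number on the difference, here is a back-of-the-envelope
sketch (the volts-per-division setting and the meter range are assumed
figures for illustration, not specs of any particular instrument):

```python
# Back-of-the-envelope reading resolution. All figures are assumptions
# chosen for illustration, not specs of any particular instrument.

volts_per_div = 1.0        # assumed scope vertical setting
eyeball_fraction = 0.2     # estimating trace position to 1/5 of a division
scope_resolution = volts_per_div * eyeball_fraction

dmm_resolution = 0.01      # last digit of a 3.5-digit DMM on its 20 V range

print(f"Scope reading resolution: about {scope_resolution:.2f} V")
print(f"DMM reading resolution:   about {dmm_resolution:.2f} V")
print(f"The meter resolves roughly {scope_resolution / dmm_resolution:.0f}x finer.")
```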

The only place where an oscilloscope is better is when measuring complex
waveforms, because those will likely mess up the readings on a multimeter.
On a scope, you see the waveform and can take it all in. Of course,
nowadays lots of scopes have better frequency response than a multimeter,
so they are generally better than the average multimeter for measuring AC
above a fairly low frequency. But in both cases, you'll have to puzzle
over which marking on the graticule the waveform reaches.

For DC, neither of those exceptions comes into play. There is no reason
to use a scope to measure DC.

Why the same voltage reads differently on two pieces of equipment can
have several causes. Sometimes it's parallax: you can't read exactly where
the needle is pointing. Sometimes it's because the units are not properly
calibrated, or are calibrated against unequal voltage "standards".
Sometimes it's because the unit doesn't have a high-impedance input, so it
loads the circuit down and thus changes the reading. Sometimes it's
because a given piece of test equipment simply can't be as accurate as
another.
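
To make the loading point concrete, here is a rough sketch (the source
impedance and the input resistances below are assumed figures, not specs
of any particular instruments):

```python
# How an instrument's input resistance loads the circuit being measured.
# All figures are assumptions for illustration.

def loaded_reading(v_source, r_source, r_input):
    """Voltage seen at the instrument's input.

    The source resistance and the input resistance form a voltage
    divider, so a low input resistance pulls the reading down.
    """
    return v_source * r_input / (r_source + r_input)

v_source = 5.0     # true open-circuit DC voltage (assumed)
r_source = 10e3    # 10 kohm source impedance (assumed)

for name, r_in in [("10 Mohm DMM input", 10e6),
                   ("1 Mohm scope input", 1e6),
                   ("20 kohm/V analog meter on its 10 V range", 200e3)]:
    v = loaded_reading(v_source, r_source, r_in)
    print(f"{name}: reads {v:.3f} V ({100 * (v - v_source) / v_source:+.2f}%)")
```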

Michael
 
Phil Allison
"jason"
I want to measure a DC output precisely. Do I use a multimeter or an
oscilloscope?


** What "oscilloscope" IS that then ??

A real one with a CRT ??

A virtual one in software ???

A digital one with on screen voltage and frequency display ????


Do try to be more careful when you throw terms around.





...... Phil
 
jason
Michael Black said:
A meter is infinitely better. On a scope, you have to adjust the input
level and then count markings on the graticule. Unless you are measuring
a very small voltage, the markings will represent a significant jump
in voltage, so you'll always be wondering whether the deflection is closer
to the halfway point between graticule lines or to the next one up or
down.

A multimeter is intended to measure voltage. You get a much nicer scale
with lots of markings. And given that most multimeters these days are
digital, there's no parallax error as you try to decide where a needle is
resting.

The only place where an oscilloscope is better is when measuring complex
waveforms, because those will likely mess up the readings on a multimeter.
On a scope, you see the waveform and can take it all in. Of course,
nowadays lots of scopes have better frequency response than a multimeter,
so they are generally better than the average multimeter for measuring AC
above a fairly low frequency. But in both cases, you'll have to puzzle
over which marking on the graticule the waveform reaches.

For DC, neither of those exceptions comes into play. There is no reason
to use a scope to measure DC.

Why the same voltage reads differently on two pieces of equipment can
have several causes. Sometimes it's parallax: you can't read exactly where
the needle is pointing. Sometimes it's because the units are not properly
calibrated, or are calibrated against unequal voltage "standards".
Sometimes it's because the unit doesn't have a high-impedance input, so it
loads the circuit down and thus changes the reading. Sometimes it's
because a given piece of test equipment simply can't be as accurate as
another.

   Michael

Thank you very much!
 
David L. Jones
jason said:
Hi, all
I want to measure a DC output precisely. Do I use a multimeter or an
oscilloscope?
I have found there are always some minor voltage differences between the
two instruments.
Which one is better for DC output voltage measurement?
Any thoughts are welcome.
BTW: papers or web links are also OK.

The multimeter is much more accurate, by around an order of magnitude (10
times). It's the proper tool for the job here.

A typical low-end digital multimeter will have, say, 0.5% or better basic
DC volts accuracy.
An oscilloscope, on the other hand (even the really expensive ones), will
typically only have a few percent basic accuracy on the vertical channel at
best.
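
As a rough worked example (applying those tolerances to an assumed 5 V
reading, and ignoring the extra "plus counts" term real DMM specs include):

```python
# Error bounds for an assumed 5 V measurement, using the figures above.
v_true = 5.0

instruments = [("Multimeter (0.5% basic DC accuracy)", 0.005),
               ("Oscilloscope (~3% vertical accuracy)", 0.03)]

for name, tolerance in instruments:
    err = v_true * tolerance
    print(f"{name}: {v_true:.2f} V +/- {err * 1000:.0f} mV "
          f"(somewhere between {v_true - err:.3f} and {v_true + err:.3f} V)")
```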

The exception to this is oscilloscopes that have a dedicated digital
multimeter module built into them.

The product specs will tell you everything you need to know.

Dave.
 
Phil Allison
"Michael Black"
For DC, neither of those exceptions comes into play. There is no reason
to use a scope to measure DC.


** Except for the VERY important reason of establishing that the voltage IS
in fact pure DC and does not have an AC signal superimposed.

Cos you often cannot easily tell that with a multimeter.
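
A quick sketch of the idea (the rail voltage, ripple amplitude, and
frequency below are assumed numbers): a DC meter essentially averages, so
a rail with ripple on it can still read as clean DC, while a scope shows
the ripple directly.

```python
import numpy as np

# Assumed example: a "5 V" rail with 0.5 V peak-to-peak of 100 Hz ripple.
t = np.linspace(0.0, 0.1, 10_000, endpoint=False)      # 100 ms of signal
rail = 5.0 + 0.25 * np.sin(2 * np.pi * 100.0 * t)      # DC + superimposed AC

print(f"What a DC multimeter reports (average): {rail.mean():.3f} V")
print(f"What a scope shows (ripple):            {rail.max() - rail.min():.3f} V peak-to-peak")
```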




........ Phil
 