Maker Pro

Multimeter accuracy -- 3d, 5d, 10d?

Justinicus
Joined: Jan 17, 2014 · Messages: 5
Howdy folks,

I've been searching for hours, and can't find out what these terms mean (3D, 5D, 10D). I figure it must be something so basic, no one ever bothers to define it.

I just picked up another cheapo 3.5-digit DMM to quickly check battery voltage and continuity without having to get out my "good" model. I was reading through the manual (I've never looked at a DMM manual before!) and came across a new term in the accuracy specs. Between 200mV and 200VDC, its accuracy is ±0.5% ±5D. At 200VAC, its accuracy is ±1.2% ±10D.

I thought it might be significant digits, but that doesn't seem to hold up.

Any help for the noob?
 

duke37
Joined: Jan 9, 2011 · Messages: 5,364
My interpretation is that at 200V AC, it should read 200.0
It could read 1.2% high or low, i.e. 197.6 to 202.4
Add or subtract 10 digits gives 195.6 to 203.4

Check it on any critical range with your posh meter.
 

KrisBlueNZ
Sadly passed away in 2015
Joined: Nov 28, 2011 · Messages: 8,393
"±10D" means ± ten times the value of the least significant digit. So it's like Duke said except that the final range should be 196.6~203.4.
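For concreteness, that "percentage plus counts" arithmetic can be sketched in a few lines (a hypothetical helper, not from the manual or any post):

```python
def dmm_error_band(reading, pct, digits, lsd):
    """Worst-case bounds for a '±pct% ±(digits)D' accuracy spec.

    reading: the displayed value
    pct:     percentage term (1.2 means ±1.2%)
    digits:  the 'D' count (10 for ±10D)
    lsd:     value of one least-significant digit on this range
    """
    err = reading * pct / 100 + digits * lsd
    return reading - err, reading + err

# 200.0 V reading on a 3.5-digit 200 V AC range: LSD = 0.1 V, spec ±1.2% ±10D
low, high = dmm_error_band(200.0, 1.2, 10, 0.1)
print(round(low, 1), round(high, 1))  # 196.6 203.4
```

The two error terms are simply summed, which matches the corrected range above.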
 

jpanhalt
Joined: Nov 12, 2013 · Messages: 426
Interesting how confusing such specifications can be, isn't it?

If we are talking about a 3-1/2-digit meter, it cannot display "202.4." It should read "out of range". 199.9 is the largest 3-1/2-digit value. I am assuming the calculation that was shown was just hypothetical to make the point. :)

As for the other part, e.g., a % error and a ± digit error, I agree, but thought there was an added caveat that whichever error calculation gave a larger error was applied. Some way to arbitrate the two notations is needed, because they will not always agree.

John
 

KrisBlueNZ
Sadly passed away in 2015
Joined: Nov 28, 2011 · Messages: 8,393
jpanhalt said: "As for the other part, e.g., a % error and a ± digit error, I agree, but thought there was an added caveat that whichever error calculation gave a larger error was applied. Some way to arbitrate the two notations is needed, because they will not always agree."
AFAIK the maximum error is specified as the SUM of the percentage and least-significant-digit errors. I haven't seen it specified as "whichever is higher". This is what duke37 showed in his calculations.
 

jpanhalt
Joined: Nov 12, 2013 · Messages: 426
I need to read about it some more. A reading of 003.0 V with a 2%/10D meter could in theory be 3.0 V ± 0.06 V ± 1.0 V, which doesn't make sense if the errors are taken additively (i.e., it could read as high as 004.1 V or as low as 001.9 V). Even taking the greater of the two errors doesn't make sense.
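That example works out like this (a sketch assuming, as the 003.0 display implies, that the least-significant digit on this range is 0.1 V):

```python
# Hypothetical 2% / 10D spec, reading 003.0 on a range where the LSD = 0.1 V
reading = 3.0
err = reading * 2 / 100 + 10 * 0.1   # 0.06 (percentage) + 1.0 (ten counts)
low, high = reading - err, reading + err
print(round(low, 2), round(high, 2))  # 1.94 4.06
```

The ±10D term dominates at the bottom of a range, which is exactly why the resulting band looks absurdly wide here.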

Is it auto ranging so that 003.0 is always read as 03.00?

John

EDIT:
I stand corrected. Here is the statement from Fluke:
[Attached image: Capture1.PNG — Fluke's statement on accuracy specifications]

I guess I would have second thoughts about buying a ±10D meter.

John
 

(*steve*)
¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd
Moderator
Joined: Jan 21, 2010 · Messages: 25,505
jpanhalt said: "I guess I would have second thoughts about buying a ±10D meter."

It depends on the source of the error.

If it is random, then the last digit is almost meaningless. However, if this is a limit on accuracy but not on precision, then whilst you can have no confidence in that last digit being correct, you can still rely on a change in that last digit reflecting a real change in the reading.

As an example, I have a very old Fluke meter (Nixie tube display!) that allows me to set the interval over which the output is averaged. At high refresh rates the 5th digit is all over the place; as you slow it down, the variation in the readings gets smaller and smaller. Whilst the reading may be no more accurate, it is a lot more precise.
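That accuracy-versus-precision distinction can be illustrated with a toy simulation (entirely made-up numbers, not the behaviour of any real meter): averaging tightens the spread of the readings but cannot remove a fixed calibration offset.

```python
import random
import statistics

random.seed(1)
TRUE_V = 5.000
OFFSET = 0.020   # fixed calibration error: hurts accuracy, not precision
NOISE = 0.010    # random noise per sample: hurts precision

def sample():
    """One raw measurement: true value + fixed offset + random noise."""
    return TRUE_V + OFFSET + random.gauss(0, NOISE)

fast = [sample() for _ in range(200)]                 # single raw samples
slow = [statistics.mean(sample() for _ in range(50))  # 50-sample averages
        for _ in range(200)]

# Averaging shrinks the spread (better precision)...
print(statistics.stdev(fast) > statistics.stdev(slow))  # True
# ...but the fixed offset remains (accuracy unchanged), close to 0.02
print(round(statistics.mean(slow) - TRUE_V, 3))
```

Slowing the refresh rate on the meter plays the same role as increasing the number of samples in each average here.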
 

jpanhalt
Joined: Nov 12, 2013 · Messages: 426
Your example is not quite the same thing. I thought we were discussing a device where you do not have that option of trading off response time for accuracy.

I, too, can name instruments whose accuracy, compared to the majority of other instruments on the market, was off by more than 100% as a function of the technology, but whose precision was the best on the market. Their precision and speed were what made them useful. Those instruments were for measuring enzymes for which universally accepted standards did not exist.

So far as I know, volts are volts. There is insignificant disagreement on the definition. :)

John
 

(*steve*)
¡sǝpodᴉʇuɐ ǝɥʇ ɹɐǝɥd
Moderator
Joined: Jan 21, 2010 · Messages: 25,505
It doesn't matter. What I was talking about is repeatability.

Sure, the meter may be relatively inaccurate, but it may be highly precise.
 