Maker Pro

24-bit A-D converter - meaningless?


Peter

Jan 1, 1970
0
I've been doing analog stuff since the 1970s, used to design HV power
supplies stable to a few ppm, but it is still amazing to see what can
be achieved e.g.

http://www.analog.com/en/press-release/12_04_12_ADI_24bit_Sigma_Delta_AD_Converter/press.html

How can this possibly work? 24 bits of noise-free performance,
never mind linearity, never mind any sort of absolute accuracy, seems
completely unachievable in the analog world.

I find that a 12-bit ADC inside a microcontroller might just about
give you 12 bits, though usually you get 10 useful bits and to get the
other 2 you have to take say 30 readings and average them.
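The arithmetic behind that averaging trick: the RMS noise of an average of n readings falls as sqrt(n), so recovering 2 bits (a 4x noise reduction) needs at least 16 readings; 30 gives some margin. A minimal simulation sketch, with made-up noise figures (2 LSB RMS assumed, not from any particular part):

```python
import random
import statistics

random.seed(0)  # reproducible noise for the demonstration

def read_adc(true_code, noise_lsb=2.0):
    """Simulated 12-bit ADC reading: the true code plus a couple of
    LSBs of Gaussian noise, roughly what a microcontroller ADC shows."""
    return true_code + random.gauss(0.0, noise_lsb)

def averaged_read(true_code, n):
    """Average n raw readings; RMS noise falls as sqrt(n)."""
    return sum(read_adc(true_code) for _ in range(n)) / n

single = [read_adc(2048) for _ in range(10000)]
avg30 = [averaged_read(2048, 30) for _ in range(1000)]
print(statistics.stdev(single))  # ~2 LSB of noise per raw reading
print(statistics.stdev(avg30))   # ~2/sqrt(30) ≈ 0.37 LSB, i.e. 2+ extra bits
```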

Once away from a microcontroller but still on the same PCB, with
careful use of grounding and ground planes, and perhaps even a shield
over the relevant bits (and a ground plane on the back of the PCB) one
can use a 16-bit ADC and end up with only 1 or 2 bits of noise.

I would imagine that to get better than 16 bits one would need to put
the ADC in a shielded box, well away from any logic etc, but even the
clock+data lines are going to radiate noise inside. Maybe one is
supposed to bring them in over fibre. Maybe the timing is such that no
transitions on the control signals are necessary during the actual
conversion cycle?

At work we make a product which uses a 12-bit ADC (ADS7828) which we
calibrate to < 0.05% by using a precision voltage source, 0.1% 15ppm
0805 resistors and storing calibration coefficients in an EEPROM (I
saw the other thread here on 0.01% resistors) and we get pretty well
the full 12 bits out of that. I'd like to go to a 16-bit ADC one day
but I am very sure it won't give us more than maybe 2 extra bits that
mean anything...
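The stored-coefficient scheme above amounts to a two-point gain/offset fit against the precision source. A sketch of the idea, with hypothetical reference voltages and codes (not the actual product's calibration values):

```python
def calibrate(raw_codes, ref_low_v, ref_high_v):
    """Derive gain/offset from two known reference voltages.
    raw_codes: (code_at_low, code_at_high) measured during calibration.
    Returns (gain, offset) such that volts = gain * code + offset.
    These two floats are what would be stored in the EEPROM."""
    code_lo, code_hi = raw_codes
    gain = (ref_high_v - ref_low_v) / (code_hi - code_lo)
    offset = ref_low_v - gain * code_lo
    return gain, offset

def code_to_volts(code, gain, offset):
    """Apply the stored coefficients to a raw reading."""
    return gain * code + offset

# Calibrate against a (hypothetical) 0.500 V / 4.500 V precision source:
gain, offset = calibrate((410, 3690), 0.500, 4.500)
print(code_to_volts(2048, gain, offset))  # ≈ 2.498 V
```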
 
I've been doing analog stuff since the 1970s, used to design HV power
supplies stable to a few ppm, but it is still amazing to see what can
be achieved e.g.

http://www.analog.com/en/press-release/12_04_12_ADI_24bit_Sigma_Delta_AD_Converter/press.html

How can this possibly work? 24 bits of noise-free performance,
never mind linearity, never mind any sort of absolute accuracy, seems
completely unachievable in the analog world.

Where does that article talk about 24 noise-free bits? What it actually says is:

"Measured on a single channel, the AD7176-2 delivers as many as 17.2 noise-free bits at 250 kSPS and 22 noise-free bits at 5 SPS."

Low sampling rates, and hence long integration times and low noise
measurement bandwidths, seem to give figures the marketing people
like. :)

Audio codecs are usually specified as 24-bit (apparently to be SPDIF
"compatible") at sampling rates of 48, 96 and even 192 kHz, with a
3 Hz - 20 kHz noise bandwidth. At higher sampling rates, much of the
quantization noise moves above this frequency band, giving
nicer-looking figures. In addition, the low-frequency limit nicely
cuts out any 1/f noise issues.

Still the SNR at 20 kHz bandwidth is about 120 dB, corresponding to
about 20 correct bits.
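That 120 dB → ~20 bits conversion follows from the standard ideal-quantizer relation SNR = 6.02*N + 1.76 dB. A one-liner to check it:

```python
def enob_from_snr(snr_db):
    """Effective number of bits from SNR in dB, using the
    ideal-quantizer relation SNR = 6.02*N + 1.76 dB."""
    return (snr_db - 1.76) / 6.02

print(round(enob_from_snr(120.0), 1))  # 19.6 bits, i.e. "about 20"
```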

Put a 16-channel analog multiplexer in front of the ADC, with each
channel sampled at 15 kHz, and you get fewer than 16 clean bits, not
so much better than the 12-bit A/D converters used in industry for
decades.

At 5 SPS (typically used in scales), 20 - 24 bit claims have been made
for a long time.

You really have to be careful when reading data sheets written by
marketing people :)
 

John Devereux

Jan 1, 1970
0
Peter said:
I've been doing analog stuff since the 1970s, used to design HV power
supplies stable to a few ppm, but it is still amazing to see what can
be achieved e.g.

http://www.analog.com/en/press-release/12_04_12_ADI_24bit_Sigma_Delta_AD_Converter/press.html

How can this possibly work? 24 bits of noise-free performance,
never mind linearity, never mind any sort of absolute accuracy, seems
completely unachievable in the analog world.

An HP 3458A can do this. (Well, absolute accuracy would depend on the
calibration.)
I find that a 12-bit ADC inside a microcontroller might just about
give you 12 bits, though usually you get 10 useful bits and to get the
other 2 you have to take say 30 readings and average them.

Once away from a microcontroller but still on the same PCB, with
careful use of grounding and ground planes, and perhaps even a shield
over the relevant bits (and a ground plane on the back of the PCB) one
can use a 16-bit ADC and end up with only 1 or 2 bits of noise.

I would imagine that to get better than 16 bits one would need to put
the ADC in a shielded box, well away from any logic etc, but even the
clock+data lines are going to radiate noise inside. Maybe one is
supposed to bring them in over fibre. Maybe the timing is such that no
transitions on the control signals are necessary during the actual
conversion cycle?

At work we make a product which uses a 12-bit ADC (ADS7828) which we
calibrate to < 0.05% by using a precision voltage source, 0.1% 15ppm
0805 resistors and storing calibration coefficients in an EEPROM (I
saw the other thread here on 0.01% resistors) and we get pretty well
the full 12 bits out of that. I'd like to go to a 16-bit ADC one day
but I am very sure it won't give us more than maybe 2 extra bits that
mean anything...

The delta-sigma ADCs seem a lot easier to use for slow signals, due to
the way they average the signal noise I think (rather than sampling
it). The number of "noise free" bits goes as the square root of the
measurement time or number of samples averaged. I.e., each extra bit
takes 4x the time. You can even get to 24 noise free bits if you don't
mind the measurement taking a few seconds. The linearity can then be
that of the device, typically 1-2ppm for a good one.
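That square-root rule can be written down directly: averaging n conversions gains 0.5*log2(n) noise-free bits, so inverting it shows each extra bit costs a factor of 4 in samples (or measurement time). A small sketch of the scaling:

```python
import math

def extra_bits(n_samples):
    """Noise-free bits gained by averaging n_samples conversions:
    RMS noise falls as sqrt(n), and each bit is a factor of 2."""
    return 0.5 * math.log2(n_samples)

def samples_for_bits(bits):
    """Inverse: samples (hence time) needed for a given bit gain,
    so each extra bit quadruples the measurement time."""
    return 4 ** bits

print(extra_bits(16))       # 2.0 bits from 16x averaging
print(samples_for_bits(8))  # 65536x the time for 8 extra bits
```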

TI claim a "32 bit" delta-sigma! :)

<http://www.ti.com/product/ads1282>
 

hamilton

Jan 1, 1970
0
I've been doing analog stuff since the 1970s, used to design HV power
supplies stable to a few ppm, but it is still amazing to see what can
be achieved e.g.

http://www.analog.com/en/press-release/12_04_12_ADI_24bit_Sigma_Delta_AD_Converter/press.html

How can this possibly work? 24 bits of noise-free performance,
never mind linearity, never mind any sort of absolute accuracy, seems
completely unachievable in the analog world.

Where did you read that!?

AD7176-2 24-bit Sigma-Delta A/D Converter Key Features:

5 SPS to 250 kSPS output rate for fast and flexible updates

Up to 90 dB 50 Hz and 60 Hz line frequency rejection using enhanced
50 Hz and 60 Hz rejection filters

7.8 mA total current consumption
 

Peter

Jan 1, 1970
0
John Devereux said:
A HP 3458A can do this. (Well, absolute accuracy would depend on the
calibration).

Interesting.

I have the 3458A here. It is c. 20-25 years old and, checking it
against fresh stuff, it seems less than 0.01% out.

But these don't have to read fast. A 1 sec conversion time is fine.

Great engineers in those days... I have the service manual for it too.

And same for the 3314A, though the reed switches keep packing up.
Fantastic circuit design, and the TMS9900 micro :)
 
Absolute accuracy is a different story. Calibration could help against
thermocouple effects in the short term; but how about the aging of the
board and components?

If there is going to be a multiplexer at the ADC input anyway, why not
reserve one channel for 0 V and another for Vmax? This way you can
calibrate the ADC during each scan.
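That per-scan self-calibration idea can be sketched as follows; the channel assignment, reference voltage and codes are hypothetical, but the point is that gain and offset drift cancel on every scan:

```python
def correct_scan(codes, code_zero, code_ref, v_ref):
    """Correct one multiplexer scan using the two reserved channels.
    codes: raw ADC codes for the signal channels.
    code_zero, code_ref: codes read on the 0 V and Vref channels
    during the same scan, so drift is removed scan by scan."""
    gain = v_ref / (code_ref - code_zero)
    return [gain * (c - code_zero) for c in codes]

# Example scan: the 0 V channel reads 12 counts and the (assumed)
# 2.500 V reference channel reads 4002 counts:
volts = correct_scan([2010, 995], 12, 4002, 2.500)
print([round(v, 4) for v in volts])  # [1.2519, 0.6159]
```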
 

John Devereux

Jan 1, 1970
0
Peter said:
Interesting.

I have the 3458A here. It is c. 20-25 years old and, checking it
against fresh stuff, it seems less than 0.01% out.

But these don't have to read fast. A 1 sec conversion time is fine.

Great engineers in those days... I have the service manual for it too.

Yep. In fact I am starting to suspect they can no longer do it. The
25-year-old 3458A is still Agilent's premier DMM, their only 8.5-digit
one, and still pretty much the state of the art.
And same for the 3314A, though the reed switches keep packing up.
Fantastic circuit design, and the TMS9900 micro :)

The older 3458As are actually better than the new ones, since the
references are aged, and the drift tends to go as sqrt(t). They still
go for $4000 on eBay, not bad for a quarter-century-old meter.
 

Peter

Jan 1, 1970
0
John Devereux said:
Yep. In fact I am starting to suspect they can no longer do it. The 25
year old 3458A is still Agilent's premier DMM. Their only 8.5 digit one,
and still pretty much the state of the art.

There are NO analog engineers coming out of anywhere these days.

Engineering/electronics education is crap in the UK and presumably
also crap in the USA. My son was halfway through an electronics course
(2yrs) and they just reached bridge rectifiers, but without a
functional explanation!

All the good analog engineers I know of are in their 50s, plus.

When they retire, it will be fun :) It will be like the 1970s ... if
you are good you can make a good living.
 

miso

Jan 1, 1970
0
There are applications where you don't need the result good to the last
bit in terms of a DVM, i.e. a little scaling error and offset are OK.
Rather you want to capture the shape of the signal.

Also note in some applications if you don't have a high dynamic range
ADC, you end up with some sort of programable gain amplifier, which is
probably worse than having the high dynamic range ADC.

Wide dynamic range ADCs are used in seismic applications. If you are
looking for the timing information on the echo return, a little
scaling error is tolerable.

You have heard the expression that it is an analog world. If you take
that one step further, it is an AC world, in the sense that
time-varying signals are generally more important than DC. Of course
there are cases where DC matters a lot, like in a scale.

When you start to take contact potentials into account, DC accuracy at
high bit levels is really tough too.
 