Maker Pro

Audio oversampling


George

Jan 1, 1970
0
I'm curious about how high sampling rates go nowadays with standard ADC
chips.

I'm starting a project that requires sampling of 15 kHz analog audio and I
need as high a sampling rate as possible without spending lots of money on
the ADC chip.

BTW, this application has nothing to do with audio quality. The digital
audio stream will be used for other purposes. It won't be converted back to
analog. But the sampling rate needs to be as high as possible.

Thanks for any input.
 

It can go into the gigasamples per second. Please be more vague.
 

Eeyore

Jan 1, 1970
0
George said:
I'm curious about how high sampling rates go nowadays with standard ADC
chips.

I'm starting a project that requires sampling of 15 kHz analog audio and I
need as high a sampling rate as possible without spending lots of money on
the ADC chip.

BTW, this application has nothing to do with audio quality. The digital
audio stream will be used for other purposes. It won't be converted back to
analog. But the sampling rate needs to be as high as possible.

You mean with standard off-the-shelf audio ADCs?

They typically operate with 44.1k or 48k clocks, but the internal working is
highly oversampled, and they also have an internal digital anti-alias filter
that can be very flat up to ~20 kHz.
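
The decimation idea in toy form, if that helps (the boxcar average here is
only a cartoon of the internal filter; real converters use multi-stage sinc
and FIR filters):

#include <stdio.h>

#define OSR 64   /* oversampling ratio -- e.g. 64x a 48 kHz output rate */

int main(void)
{
    /* Toy stand-in for the oversampled modulator output. */
    enum { NOUT = 8 };
    double fast[OSR * NOUT];
    for (int i = 0; i < OSR * NOUT; i++)
        fast[i] = ((i / OSR) % 2) ? 1.0 : -1.0;   /* crude test signal */

    /* Average each block of OSR samples down to one output sample:
       a (very) crude decimating low-pass filter. */
    for (int k = 0; k < NOUT; k++) {
        double acc = 0.0;
        for (int i = 0; i < OSR; i++)
            acc += fast[k * OSR + i];
        printf("out[%d] = %+.3f\n", k, acc / OSR);
    }
    return 0;
}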

Graham
 

neon

Oct 21, 2006
1,325
Gigasamples per second? In your dreams.
 

Tim Wescott

Jan 1, 1970
0

Define "ordinary". Figure that you can easily get 100 ksps from a 16-bit
SAR ADC (but maybe not 16 _good_ bits); there are plenty of good audio
chips out there; and you can go up to insane sampling rates with 8-bit
devices.

Start looking. If you get specific with what you really want, someone
who's recently done a search may suggest some manufacturers.

--
Tim Wescott
Control systems and communications consulting
http://www.wescottdesign.com

Need to learn how to apply control theory in your embedded system?
"Applied Control Theory for Embedded Systems" by Tim Wescott
Elsevier/Newnes, http://www.wescottdesign.com/actfes/actfes.html
 

George

Jan 1, 1970
0
OK, let me be more vague ... :o)

I'm taking a 15 kHz random audio source (music) and converting it to digital
in an ADC. The streaming output from the ADC is tested on-the-fly against a
specific data pattern of a few (less than a dozen) samples. If there's a
match, the time it occurs is reported. I'm assuming 16-bit accuracy.

This same process is occurring in TWO devices looking at the same analog
source. The devices are operating asynchronously with no common clock,
though the clock frequencies are reasonably close to one another.

Bottom line, the accuracy of the time-of-occurrence measurement in each
device has to be such that the two independent measurements agree closely in
time with one another. This depends on several factors including ADC sample
rate. Less time between samples means less uncertainty in the time that
each device recognizes the desired data pattern in the common source.

I'm looking for <0.5 usec difference in detection times between the devices.
This implies fast sampling, which is the reason for my inquiry about what
sample rates are available in production ADC chips without breaking the
bank.

The digital audio will be thrown away after testing against the desired data
pattern, so there are no issues about quality of reconstruction, noise,
Nyquist limits etc.
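
For concreteness, the on-the-fly test is just a sliding window compared
against the stored pattern, along these lines in C (the pattern values,
tolerance, and the simulated input are placeholders, not the real design):

#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

#define PAT_LEN 8     /* "a few (less than a dozen) samples" */
#define TOL     64    /* match tolerance in 16-bit LSBs -- a placeholder */

static const int16_t pattern[PAT_LEN] = {
    120, -340, 560, -780, 910, -1020, 1130, -1240   /* made-up pattern */
};

int main(void)
{
    /* Simulated ADC stream: silence with the pattern embedded once.
       In the real device these samples would come from the ADC. */
    enum { N = 1000, POS = 500 };
    static int16_t stream[N];
    for (int i = 0; i < PAT_LEN; i++)
        stream[POS + i] = pattern[i];

    int16_t window[PAT_LEN] = {0};

    for (int n = 0; n < N; n++) {
        /* slide the window and append the newest sample */
        for (int i = 0; i < PAT_LEN - 1; i++)
            window[i] = window[i + 1];
        window[PAT_LEN - 1] = stream[n];

        if (n + 1 < PAT_LEN)
            continue;

        /* compare the window against the pattern, within tolerance */
        int match = 1;
        for (int i = 0; i < PAT_LEN; i++)
            if (abs(window[i] - pattern[i]) > TOL) { match = 0; break; }

        if (match)   /* the reported time is this index / sample rate */
            printf("match starting at sample %d\n", n - PAT_LEN + 1);
    }
    return 0;
}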

Thanks for any help.
 

It sounds like you are going to synchronize these streams by tossing
out samples. [It is a form of phase locking.] So the worst case is the
two systems sampling half a sample period apart. Holding that offset to
0.25 µs, half of your 0.5 µs budget, means a 0.5 µs sample period, or a
2 MHz sample rate. This is the minimum. So a video ADC would do the trick.
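
Spelled out as arithmetic (the 0.5 µs figure is from the post above; the
even split of the budget is my assumption):

#include <stdio.h>

int main(void)
{
    double budget = 0.5e-6;       /* allowed difference in detection times, s */
    double offset = budget / 2.0; /* worst-case offset: half a sample period  */
    double period = 2.0 * offset; /* so the sample period is twice the offset */
    printf("minimum sample rate: %.1f MHz\n", 1.0 / (period * 1e6));
    return 0;
}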

Hopefully I haven't oversimplified the problem.
 
V

Vladimir Vassilevsky

Jan 1, 1970
0
George said:
OK, let me be more vague ... :o)

I'm taking a 15 kHz random audio source (music) and converting it to digital
in an ADC. The streaming output from the ADC is tested on-the-fly against a
specific data pattern of a few (less than a dozen) samples. If there's a
match, the time it occurs is reported. I'm assuming 16-bit accuracy.

This is not a good idea for a number of reasons. It is not going to work
this way.
This same process is occurring in TWO devices looking at the same analog
source. The devices are operating asynchronously with no common clock,
though the clock frequencies are reasonably close to one another.

Bottom line, the accuracy of the time-of-occurrence measurement in each
device has to be such that the two independent measurements agree closely in
time with one another. This depends on several factors including ADC sample
rate. Less time between samples means less uncertainty in the time that
each device recognizes the desired data pattern in the common source.
I'm looking for <0.5 usec difference in detection times between the devices.
This implies fast sampling, which is the reason for my inquiry about what
sample rates are available in production ADC chips without breaking the
bank.

That does not imply fast sampling. If your signal is bandlimited to 15 kHz,
sampling faster than 30 kHz doesn't buy any extra accuracy. BTW, time
measurement of a 15 kHz signal to 0.5 µs accuracy may require quite a high
SNR and sophisticated processing.
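
One standard way to get sub-sample timing from a bandlimited signal is to
cross-correlate against the known pattern and interpolate the correlation
peak. A sketch with invented data (the pulse shape, lengths and delay are
made up for illustration):

#include <stdio.h>
#include <math.h>

#define LEN 256   /* input length -- invented   */
#define PAT 16    /* pattern length -- invented */

/* A smooth test pulse standing in for a bandlimited waveform. */
static double pulse(double t)
{
    return exp(-t * t / 18.0) * cos(0.7 * t);
}

int main(void)
{
    double true_delay = 100.3;   /* fractional delay, in samples */
    double x[LEN], p[PAT], c[LEN - PAT + 1];

    for (int n = 0; n < LEN; n++) x[n] = pulse(n - true_delay - PAT / 2);
    for (int n = 0; n < PAT; n++) p[n] = pulse(n - PAT / 2);

    /* Brute-force cross-correlation of the known pattern with the input. */
    int k_best = 0;
    for (int k = 0; k <= LEN - PAT; k++) {
        double s = 0.0;
        for (int n = 0; n < PAT; n++) s += x[k + n] * p[n];
        c[k] = s;
        if (c[k] > c[k_best]) k_best = k;
    }

    /* A parabola through the peak and its two neighbours gives a
       sub-sample estimate of the alignment. */
    double y0 = c[k_best - 1], y1 = c[k_best], y2 = c[k_best + 1];
    double frac = 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2);
    printf("estimated delay: %.2f samples (true: %.2f)\n",
           k_best + frac, true_delay);
    return 0;
}
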
However, if you insist on your mistakes, there are a number of reasonable
16-bit ADCs which can sample at up to 1 MHz or so. Check the PulSAR line
from AD, and TI/BB too.


Vladimir Vassilevsky
DSP and Mixed Signal Consultant
www.abvolt.com
 

joseph2k

Jan 1, 1970
0
George said:
OK, let me be more vague ... :o)

I'm taking a 15 kHz random audio source (music) and converting it to digital
in an ADC. The streaming output from the ADC is tested on-the-fly against a
specific data pattern of a few (less than a dozen) samples. If there's a
match, the time it occurs is reported. I'm assuming 16-bit accuracy.

This same process is occurring in TWO devices looking at the same analog
source. The devices are operating asynchronously with no common clock,
though the clock frequencies are reasonably close to one another.

Bottom line, the accuracy of the time-of-occurrence measurement in each
device has to be such that the two independent measurements agree closely in
time with one another. This depends on several factors including ADC sample
rate. Less time between samples means less uncertainty in the time that
each device recognizes the desired data pattern in the common source.

I'm looking for <0.5 usec difference in detection times between the devices.
This implies fast sampling, which is the reason for my inquiry about what
sample rates are available in production ADC chips without breaking the
bank.

The digital audio will be thrown away after testing against the desired data
pattern, so there are no issues about quality of reconstruction, noise,
Nyquist limits etc.

Thanks for any help.

Less than microsecond match detection? That will nominally require 10 Msps.
Hard to get 16 bits at that speed.

Reasonably priced equipment (under US$500) will do 24 bits at 192 ksps,
about 1/50 of the speed you are talking about.
 