RobertMacy
I'm having a bit of difficulty sorting out the effect on signals when there is
a bit of clock jitter at the sampling point.
It appears that clock jitter DESTROYS the signal!
And it is NOT a simple relationship. For example, low-frequency signals are
barely affected, but high-frequency signals are hit hard.
And when both are present it's another matter. In other words, the error the
jitter imposes depends on the signal's slew rate at the sampling instant, so
the jitter 'stamps' itself onto the ADC's performance in proportion to the
input frequency.
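To convince myself of the frequency dependence, I put together the little
numerical experiment below (my own sketch; the 200 MS/s rate and 1 ps RMS
jitter are assumed purely for illustration). It samples a unit sine at
Gaussian-jittered instants and estimates the resulting SNR, which should track
the classic aperture-jitter limit SNR = -20*log10(2*pi*f_in*t_j):

import numpy as np

def jitter_snr(f_in, t_jitter_rms, fs=200e6, n=1 << 16, seed=0):
    # Estimate the SNR of a sine sampled with Gaussian clock jitter.
    # f_in: input frequency (Hz), t_jitter_rms: RMS jitter (s), fs: sample rate (Hz).
    rng = np.random.default_rng(seed)
    t_ideal = np.arange(n) / fs
    t_actual = t_ideal + rng.normal(0.0, t_jitter_rms, n)  # jittered sampling instants
    ideal = np.sin(2 * np.pi * f_in * t_ideal)
    actual = np.sin(2 * np.pi * f_in * t_actual)
    noise = actual - ideal  # error contributed by the timing jitter alone
    return 10 * np.log10(np.mean(ideal**2) / np.mean(noise**2))

# With 1 ps RMS jitter, a 1 MHz input barely notices; a 90 MHz input suffers.
for f in (1e6, 10e6, 90e6):
    measured = jitter_snr(f, 1e-12)
    theory = -20 * np.log10(2 * np.pi * f * 1e-12)
    print(f"f_in = {f/1e6:5.0f} MHz: SNR ~ {measured:5.1f} dB (theory {theory:5.1f} dB)")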
So how does one determine the allowable clock jitter?
And what is the minimum clock jitter I could expect in a well-designed
200 MS/s system? Is there some way to do better?
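My own back-of-envelope approach so far is to invert that same limit: pick the
highest input frequency I care about and the SNR (or ENOB) I want to preserve,
then solve for t_j. Again just a sketch, with the 90 MHz near-Nyquist input
assumed for a 200 MS/s system:

import numpy as np

def max_jitter(f_in, enob):
    # Largest RMS aperture jitter that still supports `enob` effective bits
    # at input frequency f_in, from t_j = 1 / (2*pi*f_in*10^(SNR/20))
    # with SNR = 6.02*enob + 1.76 dB (ideal quantization-limited SNR).
    snr_db = 6.02 * enob + 1.76
    return 1.0 / (2 * np.pi * f_in * 10 ** (snr_db / 20))

for bits in (8, 10, 12):
    tj = max_jitter(90e6, bits)
    print(f"{bits}-bit ENOB at 90 MHz needs t_j < {tj * 1e12:.2f} ps RMS")

If I've done that right, holding 12 effective bits near Nyquist already demands
something in the neighborhood of a third of a picosecond, which is why I'm
asking what a well-designed clock tree can realistically deliver.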