thejim said:
Suppose we have a radar that works on the Doppler principle. As an object
comes closer to the radar there will be a change in frequency (a Doppler
shift). What I am asking is: if the object comes closer at a greater
speed, will the magnitude of the frequency shift be different than if it
were coming slower, or will just the rate of change of that magnitude be
faster?
Thank you.
Doppler shift is dependent on the relative velocity between the target
and the receiver, and the frequency shift is proportional to the
relative velocity.
Doppler shift is given as (for a generating source moving relative to
an independent receiver):
f' = f( 1 / [ 1 + (v / Co) ] ), where f' is the received frequency, f the
transmitted frequency, v the relative velocity between source and
receiver (positive when receding), and Co the propagation speed in the
medium. For electromagnetic radiation in free space, that's 3 x 10^8 m/s
to a close approximation.
For the target of a radar gun, the Doppler effect occurs on both the
outbound and return trips, so it is the frequency *shift* that doubles,
not the frequency itself:
f' = f( [ 1 - (v / Co) ] / [ 1 + (v / Co) ] ), which for v << Co gives a
shift of df = f' - f ~= -2 f v / Co.
These equations are for targets moving much more slowly than light
(beyond that, relativistic effects become significant).
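The formulas above can be sketched numerically. This is a minimal Python
illustration, not anything specific to a real radar; the transmit
frequency and target speed are assumed example values:

```python
# Sketch of the one-way and round-trip Doppler formulas above.
# Sign convention: v > 0 means the target is receding from the radar.
C0 = 3e8          # approximate free-space propagation speed, m/s

def one_way(f, v):
    """Received frequency for a source receding at v from the receiver."""
    return f / (1 + v / C0)

def radar_gun(f, v):
    """Round-trip received frequency: the shift occurs on both legs."""
    return f * (1 - v / C0) / (1 + v / C0)

f_tx = 10.525e9   # an assumed K-band transmit frequency, Hz
v = -30.0         # target approaching at 30 m/s

df = radar_gun(f_tx, v) - f_tx
print(f"Round-trip Doppler shift: {df:.1f} Hz")
# For v << Co this is close to -2*f*v/Co, i.e. roughly +2.1 kHz here,
# double the one-way shift, and it grows in proportion to speed.
```

Running it for a faster target (say v = -60.0) simply doubles the shift,
which is the direct answer to the question: the magnitude of the shift
itself changes with speed.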
Note that the effective relative velocity is |v| x cos(theta), where |v|
is the speed of the target relative to the receiver and theta the angle
between the target's motion and the direct line between target and
receiver. For a stationary radar gun, it is the target's speed times the
cosine of the angle involved.
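That cosine projection is easy to sketch; the speed and angle below are
hypothetical example numbers:

```python
import math

def radial_speed(speed, theta_deg):
    """Component of the target's speed along the radar-target line."""
    return speed * math.cos(math.radians(theta_deg))

# A car doing 40 m/s at 20 degrees off the radar's line of sight only
# shows its radial component to the gun:
v_r = radial_speed(40.0, 20.0)
print(f"Radial speed seen by the radar: {v_r:.2f} m/s")
# A target crossing at exactly 90 degrees shows no Doppler shift at all.
```

This is why a radar gun aimed at an angle to the road always reads low,
never high.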
A rate of change of the frequency shift will occur if the target
accelerates relative to the receiver. (The acceleration may be positive
or negative.)
Cheers
PeteS