Why is phase noise generally specified at a 100 Hz offset in RADAR applications?
Is there any particular reason for measuring at this frequency offset in RADAR applications?
It depends on what is important in each application. Some time-and-frequency applications require tight specifications at a 1 Hz offset.
The 100 Hz offset is crucial for ground-based RADAR applications, since in some cases they must also detect slow-moving, nearly stationary objects. The Doppler shift for such a target near the RADAR falls in the 30 to 100 Hz range, so to detect these reflected waves, the phase noise of the local oscillator at that offset needs to be very low.
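As a rough sanity check of that range, here is a small Python sketch that converts a 30 to 100 Hz two-way (monostatic) Doppler shift back into target radial speed; the 3 GHz S-band carrier is an assumed example frequency, not something stated in this answer:

```python
C = 3.0e8  # speed of light, m/s

def radial_speed(doppler_hz, radar_freq_hz):
    # Invert the two-way (monostatic) Doppler relation f_d = 2*v*f/c.
    return doppler_hz * C / (2.0 * radar_freq_hz)

f_radar = 3.0e9  # assumed S-band carrier, 3 GHz (hypothetical example)
for f_d in (30.0, 100.0):
    print(f"{f_d:5.0f} Hz Doppler -> {radial_speed(f_d, f_radar):.2f} m/s")
# 30 Hz -> 1.50 m/s, 100 Hz -> 5.00 m/s: slow-moving targets indeed
```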
You must be talking about weather radars, where very small Doppler speeds (i.e., wind speeds) are being measured. Any other moving object will have a much bigger Doppler frequency shift, in which case the phase noise at 100 Hz is not very important.
The Doppler shift for a monostatic RADAR (two-way path) is given by:
Doppler shift = 2 × target velocity × RADAR frequency / speed of light
For instance, at X-band (10 GHz), a 100 Hz Doppler shift corresponds to a radial velocity of only 1.5 m/s (a person walking).
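A minimal Python sketch of that relation, checking the X-band figure (the factor of 2 is the two-way path, and the 10 GHz carrier is the example frequency used above):

```python
C = 3.0e8  # speed of light, m/s

def doppler_shift(velocity_mps, radar_freq_hz):
    # Two-way Doppler shift for a monostatic RADAR: f_d = 2*v*f/c.
    return 2.0 * velocity_mps * radar_freq_hz / C

f_xband = 10.0e9  # 10 GHz, the X-band example above
print(doppler_shift(1.5, f_xband))  # -> 100.0 Hz for a 1.5 m/s walker
```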
