Help with VCO phase noise simulation after a bandpass filter using a functional block
I am simulating a VCO followed by an ideal bandpass filter. The phase noise of the original VCO is about -122 dBc/Hz with a 3 GHz carrier.
Then I added a bandpass filter using a Cadence functional block, with Q = 50, centered at 3 GHz. However, the phase noise after filtering
is only -75 dBc/Hz.
I am not sure where I went wrong.
I hope someone can help me with this problem.
Thanks!
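Edit: as a quick sanity check, I modeled the filter as an idealized second-order bandpass (an assumption; I don't know the exact transfer function inside the functional block) and computed how much it attenuates a sideband at a few example offsets (the offsets are examples too, since I did not state one above):

import numpy as np

# Idealized second-order bandpass: H(s) = (w0/Q)*s / (s^2 + (w0/Q)*s + w0^2)
# Assumption: the Cadence functional block behaves like this single pole pair.
f0 = 3e9     # center frequency, Hz
Q = 50.0     # loaded Q

def gain_db(f):
    """Filter magnitude response at frequency f, in dB."""
    s = 2j * np.pi * f
    w0 = 2 * np.pi * f0
    h = (w0 / Q) * s / (s**2 + (w0 / Q) * s + w0**2)
    return 20 * np.log10(abs(h))

# Sideband level relative to the carrier for a few example offsets
for df in (100e3, 1e6, 10e6, 100e6):
    rel = gain_db(f0 + df) - gain_db(f0)
    print(f"offset {df/1e6:7.1f} MHz: {rel:8.4f} dB")

At a 1 MHz offset the ideal filter changes the sideband by less than 0.01 dB, so unless the filter is interacting with the VCO core itself, the close-in phase noise should pass through almost unchanged. That makes the -75 dBc/Hz result look even stranger to me.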
My first guess would be some sort of simulation convergence glitch in your simulator.
BUT, it IS possible that the bandpass filter is reflecting harmonic energy back to the active device in the VCO and actually degrading the phase noise. Try inserting an ideal 50 ohm transmission line of various electrical lengths (phases) between the VCO and the bandpass filter, and see if the predicted phase noise varies all over the place.
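To illustrate the line-length experiment, here is a rough numerical sketch (the 50 ohm line is lossless and the load value is made up, just a stand-in for whatever impedance the filter presents at a VCO harmonic):

import numpy as np

Z0 = 50.0            # characteristic impedance of the inserted line, ohms
ZL = 20.0 + 30.0j    # hypothetical filter impedance at a harmonic (made-up value)

# Standard lossless transmission line impedance transformation:
#   Zin = Z0 * (ZL + j*Z0*tan(theta)) / (Z0 + j*ZL*tan(theta))
for theta_deg in range(0, 180, 30):          # electrical length sweep
    t = np.tan(np.deg2rad(theta_deg))
    zin = Z0 * (ZL + 1j * Z0 * t) / (Z0 + 1j * ZL * t)
    print(f"theta = {theta_deg:3d} deg: Zin = {zin.real:6.1f} {zin.imag:+6.1f}j ohms")

The load the VCO sees at its harmonics rotates around the Smith chart as the electrical length changes. If the simulated phase noise swings with it, the filter really is pulling the oscillator through reflected harmonics; if it stays put, suspect the simulation setup instead.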
What about the input/output impedance of this ideal filter?
According to Leeson's phase noise equation, doubling the loaded Q makes the phase noise 6 dB better (see the equation below).
Try a Q of 100 instead of 50, and see what happens. At least you will find out whether the low-Q filter added to the circuit is the reason.
Beware that the Leeson equation applies only between the 1/f flicker corner and the corner where the white (flat) noise floor starts.
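For reference, one common form of Leeson's equation (exact prefactors vary between references; symbols: f_0 carrier frequency, f_m offset frequency, Q_L loaded Q, F noise factor, P_s signal power, f_c flicker corner):

\[
  \mathcal{L}(f_m) = 10\log_{10}\!\left[
    \frac{F k T}{2 P_s}
    \left(1 + \frac{f_0^{2}}{\left(2 Q_L f_m\right)^{2}}\right)
    \left(1 + \frac{f_c}{f_m}\right)
  \right]
\]

In the offset range where the f_0^2/(2 Q_L f_m)^2 term dominates, doubling Q_L divides that term by four, which is exactly the 6 dB improvement mentioned above.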