Effect of leakage current on phase noise
How does the leakage current of an active device (such as a CMOS device) affect the phase noise in the different slope regions predicted by Leeson's formula?
Since the leakage current is "white-noise"-like, I am speculating that it won't change the slopes of the 1/f^3, 1/f^2, and flat regions, but will only raise the noise level uniformly across all of them. Is this claim correct?
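For reference, I am thinking of one common form of Leeson's formula (with F the device noise factor, P_sig the oscillation power, Q_L the loaded Q, f_0 the carrier frequency, and f_c the flicker corner):

\mathcal{L}(\Delta f) = 10\log_{10}\!\left[\frac{2FkT}{P_{\mathrm{sig}}}\left(1+\left(\frac{f_0}{2Q_L\,\Delta f}\right)^{2}\right)\left(1+\frac{f_c}{\Delta f}\right)\right]

so (assuming f_c < f_0/(2Q_L)) the 1/f^3 region sits below f_c, the 1/f^2 region lies between f_c and f_0/(2Q_L), and the flat floor is beyond f_0/(2Q_L).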
Yes.
Since white noise has effectively infinite bandwidth, it raises the average noise floor across the whole offset range.
Thanks for the reply, BigBoss.
There is one thing I am not so sure about: although the leakage current will certainly raise the noise, the upconversion factor may differ between offset-frequency regions, so the resulting noise increase may not be the same at all offset frequencies.
On the other hand, if we can assume the upconversion factor is nearly constant across this offset-frequency range, then the increase will be the same at all offsets.
This assumption still needs to be verified, though I am not sure how.
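To make the question concrete, here is a minimal numerical sketch (all values below are made up for illustration, not from a real device): I add an extra white-noise term, scaled by an offset-dependent upconversion factor K(Δf), to a Leeson-style expression and check how much the curve moves in each region.

```python
import numpy as np

# Minimal sketch (hypothetical values): Leeson-style model with an added
# white-noise term scaled by an offset-dependent upconversion factor K.
f0 = 1e9                    # carrier frequency, Hz (assumed)
QL = 20                     # loaded Q (assumed)
fc = 100e3                  # flicker corner, Hz (assumed)
F, k, T, Psig = 4.0, 1.38e-23, 290.0, 1e-3  # noise factor, Boltzmann, K, W (assumed)

def L_dBc(df, extra_white=0.0, K=lambda df: 1.0):
    """SSB phase noise L(df) in dBc/Hz at offset df (Hz).

    extra_white : additional white PSD (same units as 2FkT/Psig), standing in
                  for a leakage-current contribution.
    K           : offset-dependent upconversion factor applied to extra_white.
    """
    floor = 2 * F * k * T / Psig
    psd = (floor + K(df) * extra_white) \
          * (1 + (f0 / (2 * QL * df)) ** 2) * (1 + fc / df)
    return 10 * np.log10(psd)

offsets = np.array([1e3, 1e4, 1e5, 1e6, 1e7])   # spans 1/f^3, 1/f^2, flat regions
clean = L_dBc(offsets)
leaky = L_dBc(offsets, extra_white=2 * F * k * T / Psig)  # leakage doubles the floor

# Constant K: the increase is ~3 dB at every offset (slopes unchanged).
print(np.round(leaky - clean, 2))

# Offset-dependent K (e.g. stronger upconversion close to the carrier):
leaky_var = L_dBc(offsets, extra_white=2 * F * k * T / Psig,
                  K=lambda df: 1.0 + 1e5 / df)
print(np.round(leaky_var - clean, 2))   # increase now varies with offset
```

With a constant K the shift is identical in every region (about 3 dB here, since the floor is doubled); only if K itself depends on offset does the increase differ between regions, which is exactly the assumption I would like to verify.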
Before you generalize about leakage current and broadband phase noise, you need to look at the device's actual physics. If the leakage current arises from defects in the semiconductor structure, such as traps that capture and randomly release electrons, then it will have a much stronger effect on phase noise.
Thank you for the comments.
I am guessing what you are referring to is flicker noise (1/f noise), which is related to traps and surface defects. Usually that is not called leakage current. Leakage current is more "white-noise"-like, often related to reverse-bias carrier transport and recombination in BJTs, and to soft pinch-off in FET devices. Please correct me if I have this wrong.
My friend said it has no effect at all, which is quite surprising; I just could not agree. I need to find proof of this.
