
Convert Phase Noise to Time Jitter in sample period

I have phase noise data in [dBc/Hz] on a number of candidate clocks. I need to convert that into time jitter in a certain sample period. For example, I need to find the rms time jitter during a 10 millisecond sample period. I do not know how to do that!

I can find all sorts of programs out there to convert phase noise into rms time jitter with an infinite sample period, such as on Wenzel's or Raltron's websites. But for a 10 ms sample period, the lower-frequency offset phase noise terms can obviously be reduced in the calculation. Phase noise at a 100 Hz offset, for instance, is not going to cause much time jitter if I only look at the clock for a 10 ms sample period!
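As a rough illustration of that idea (a sketch of one common approach, not something stated in this thread): for the jitter accumulated between two clock edges separated by an interval T, the single-sideband phase noise can be weighted by 4*sin^2(pi*f*T) before integrating, which rolls off contributions from offsets well below 1/T (about 100 Hz for T = 10 ms). The function below is assumed example scaffolding in Python; the phase noise table and clock frequency you pass in would be your candidate clock's own data.

import numpy as np

def rms_jitter_over_interval(f_pts, L_dBc, f0, T, n_grid=20000):
    # f_pts : offset frequencies [Hz], ascending; L_dBc : SSB phase noise [dBc/Hz]
    # f0    : clock frequency [Hz];  T : observation interval [s], e.g. 10e-3
    f_pts = np.asarray(f_pts, dtype=float)
    L_dBc = np.asarray(L_dBc, dtype=float)
    # interpolate the measured curve onto a dense log-spaced grid before integrating
    fg = np.logspace(np.log10(f_pts[0]), np.log10(f_pts[-1]), n_grid)
    Lg = np.interp(np.log10(fg), np.log10(f_pts), L_dBc)
    S_ssb = 10.0 ** (Lg / 10.0)                      # dBc/Hz -> linear [1/Hz]
    # 4*sin^2(pi*f*T) suppresses offsets well below 1/T; above ~10/T it is
    # replaced by its average value of 2 so the fast ripple need not be resolved
    w = np.where(fg * T < 10.0, 4.0 * np.sin(np.pi * fg * T) ** 2, 2.0)
    integrand = S_ssb * w
    phase_var = 2.0 * np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(fg))
    return np.sqrt(phase_var) / (2.0 * np.pi * f0)   # rms jitter in seconds

A cruder shortcut is simply to raise the lower integration limit to roughly 1/T. Note that this edge-to-edge definition counts the broadband (high-offset) noise twice in variance compared with the single-edge number the standard calculators give, so it is worth checking which jitter definition the sampled system actually needs.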

Does anyone know of a spreadsheet to calculate this, or a correction factor for the phase noise data to input into the standard spreadsheets?

Thanx

The rms jitter is calculated by integrating the measured phase noise curve over the desired frequency band. In sampled systems the upper integration limit should be fs/2; the lower limit depends on your FFT bin size. Usually half the FFT bin width is a suitable lower integration limit.
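A minimal numerical sketch of that integration in Python, assuming a hypothetical phase noise table, clock frequency, and sample rate (the numbers below are made-up examples, not data from this thread):

import numpy as np

f_pts = np.array([100.0, 1e3, 10e3, 100e3, 1e6, 20e6])              # offsets [Hz]
L_dBc = np.array([-90.0, -110.0, -125.0, -135.0, -145.0, -150.0])   # SSB phase noise [dBc/Hz]
f0 = 100e6        # clock frequency [Hz] (example value)
fs = 40e6         # sample rate of the sampled system [Hz] (example value)
f_lo, f_hi = 50.0, fs / 2.0      # e.g. half an FFT bin width up to fs/2

# interpolate L(f) onto a dense log-spaced grid so the trapezoidal sum is accurate
fg = np.logspace(np.log10(f_lo), np.log10(f_hi), 2000)
Lg = np.interp(np.log10(fg), np.log10(f_pts), L_dBc)
S_ssb = 10.0 ** (Lg / 10.0)                                  # dBc/Hz -> linear [1/Hz]

# phase variance = 2 * integral of the SSB curve (both sidebands), then rad -> s
phase_var = 2.0 * np.sum(0.5 * (S_ssb[1:] + S_ssb[:-1]) * np.diff(fg))
rms_jitter = np.sqrt(phase_var) / (2.0 * np.pi * f0)
print("rms jitter from %.0f Hz to %.0f MHz: %.1f fs" % (f_lo, f_hi / 1e6, rms_jitter * 1e15))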

Common signal source analyzers (SSAs), such as the Agilent E5052A or R&S FSUP, can measure phase noise curves up to 40 MHz offsets. You can of course calculate the rms jitter with these instruments; they perform the integration with no extra effort.
