Phase Noise and Output Power Clarification
I would just like to ask if my analysis is correct:
Regarding output power: with an input attenuator value of 20 dB, a reference level of 10 dBm, span = 1 MHz, VBW = 1 kHz and RBW = 1 kHz, if the marker value at the frequency of interest is -5.5 dBm, is the actual output power equal to 4.5 dBm?
Regarding phase noise: with the same settings (input attenuator = 20 dB, reference level = 10 dBm, span = 1 MHz, VBW = 1 kHz, RBW = 1 kHz), if the marker delta value at the offset frequency of interest is -7.8 dB, is the actual phase noise equal to -37.8 dBc/Hz?
Thanks in advance for the clarification.
The 20 dB attenuator is part of the spectrum analyser and is already factored into the +10 dBm reference level. The input attenuation only affects the ultimate sensitivity of the spectrum analyser, which typically has a 15 to 25 dB noise figure with no attenuation.
With a 1 kHz RBW, the correction to a 1 Hz bandwidth is -30 dB (10*log10(1000)). So the sideband noise (SBN) would be -37.8 dBc/Hz. This assumes there is no significant slope across the 1 kHz bandwidth: you should see no more than a couple of dB of drop or rise at +/- 1 kHz from the measurement offset frequency. If you do, you need to use a narrower RBW.
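To make that normalization explicit, here is a minimal Python sketch; the function name is mine and the numbers are just the ones from this thread, not anything instrument-specific:

    import math

    def phase_noise_dbc_per_hz(delta_db, rbw_hz):
        # Marker delta is dB below the carrier, measured in the RBW;
        # subtract 10*log10(RBW) to normalize to a 1 Hz bandwidth.
        return delta_db - 10 * math.log10(rbw_hz)

    print(phase_noise_dbc_per_hz(-7.8, 1e3))  # -37.8 dBc/Hz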
The actual output power is 4.5 dBm plus the cable loss. Also note that a spectrum analyser is not a precise instrument for measuring absolute power.
The absolute level of the carrier is only relevant to the maximum signal-handling range of the spectrum analyser.
For a delta of -7.8 dB from the carrier using a 1 kHz RBW, it would be -37.8 dBc/Hz.
In the first part, a carrier of +4.5 dBm against an SBN offset reading of -5.5 dBm in a 1 kHz RBW would be -10 dB, or -40 dBc/Hz. The +4.5 dBm carrier will remain at +4.5 dBm regardless of the RBW used (assuming the carrier's low-frequency wobble/drift is not greater than the RBW). The SBN level in dBm will drop as the RBW is reduced.
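The same arithmetic starting from the absolute marker readings, again as a rough Python sketch (function name and values are only illustrative, taken from the numbers above):

    import math

    def sbn_dbc_per_hz(carrier_dbm, sideband_dbm, rbw_hz):
        # Delta from the carrier in dB, then normalize the RBW to 1 Hz.
        return (sideband_dbm - carrier_dbm) - 10 * math.log10(rbw_hz)

    print(sbn_dbc_per_hz(4.5, -5.5, 1e3))  # -40.0 dBc/Hz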
Thanks all for your replies. You all have been a big help.
