how to measure input impedance of a power detector (diode detector) circuit
I have a zero-bias Schottky diode based power detector. It gives a DC output for input power down to -30 dBm.
I want to measure input impedance of this circuit.
I just connected the input of the circuit to VNA port 1; it shows about -3 dB on the S11 magnitude plot.
1. Is the method I used to measure input impedance correct? If not, what is the proper method?
2. What can I do to improve the input matching?
If I use an impedance matching circuit, will it improve the sensitivity (will it detect signals below -30 dBm)?
cheers,
per_lube
Since you wrote that your detector should operate with -30 dBm input and the VNA generates -3 dBm, you should use a 27 dB attenuator to get the required input power.
Then the detector will be well matched (the VNA will "see" only the attenuator).
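For a rough sanity check, a matched pad of A dB in front of a mismatched detector improves the reflection the VNA sees by about 2·A dB, because the reflected wave passes through the pad twice. A minimal Python sketch (the detector's raw 3 dB return loss is just an assumed example; the 27 dB pad and -3 dBm source level are the numbers above):

```python
def input_return_loss_dB(detector_rl_dB, atten_dB):
    """Return loss seen at the pad input when an ideal matched A-dB attenuator
    precedes a detector with the given raw return loss."""
    # The reflected wave is attenuated by the pad in both directions: 2 * A dB.
    return detector_rl_dB + 2 * atten_dB

# Badly matched detector (3 dB return loss, assumed) behind a 27 dB pad:
print(input_return_loss_dB(3.0, 27.0))   # -> 57 dB, essentially a perfect match
# Power actually reaching the detector with a -3 dBm VNA source:
print(-3.0 - 27.0)                        # -> -30 dBm
```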
Detectors without input attenuators have a variable impedance as a function of input power.
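If you want the actual input impedance rather than just the |S11| magnitude, take the complex reflection coefficient from the VNA (ideally at several source power levels, since the impedance is power dependent) and convert it with Z = Z0·(1 + Γ)/(1 - Γ). A minimal sketch assuming a 50 Ohm reference; the S11 values are hypothetical examples only:

```python
Z0 = 50.0  # reference impedance of the VNA port

def s11_to_z(s11):
    """Convert a complex reflection coefficient to input impedance."""
    return Z0 * (1 + s11) / (1 - s11)

# Hypothetical readings at two different VNA source powers (illustrative values)
for power_dBm, s11 in [(-30, 0.6 - 0.3j), (-10, 0.4 - 0.2j)]:
    z = s11_to_z(s11)
    print(f"{power_dBm} dBm: Zin = {z.real:.1f} {z.imag:+.1f}j Ohm")
```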
Many detectors are used as square-law ("quadratic response") devices, with a DC output voltage proportional to input power for levels below about -20 dBm. Their output load should be >10 kOhm, and their video-frequency response is then slow, often below ~10 kHz. A numeric illustration of the square-law output follows after the next paragraph.
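As a feel for the numbers, here is a sketch of the square-law estimate Vout ≈ K·Pin with an assumed sensitivity of 2.5 V/mW (an illustrative high-impedance-load value, not a measured one):

```python
K_V_per_mW = 2.5  # assumed open-circuit sensitivity, for illustration only

def vout_mV(pin_dBm, k=K_V_per_mW):
    """Estimated DC output in millivolts in the square-law region."""
    pin_mW = 10 ** (pin_dBm / 10.0)
    return k * pin_mW * 1e3

for p in (-20, -30, -40):
    print(f"{p} dBm -> {vout_mV(p):.3f} mV")
# -30 dBm (1 uW) gives only about 2.5 mV of DC output with these assumptions.
```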
For a fast response, a low-impedance load is needed, and the detector sensitivity (K-factor, typically 2-3 V per 1 mW input) may drop to about 0.2 V/mW for a <100 Ohm load. But the video-frequency response may then exceed 1 GHz, depending on the video-output capacitor value.
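The speed trade-off follows from the RC video filter formed by the load resistance and the video-output capacitor, f(3 dB) ≈ 1/(2·pi·R·C). A sketch with assumed component values:

```python
import math

def video_bandwidth_Hz(r_load_ohm, c_video_F):
    """3 dB video bandwidth of the detector's output RC network."""
    return 1.0 / (2 * math.pi * r_load_ohm * c_video_F)

# Assumed values for illustration:
print(video_bandwidth_Hz(10e3, 1.5e-9))   # 10 kOhm load, 1.5 nF -> ~10.6 kHz
print(video_bandwidth_Hz(100, 1.5e-12))   # 100 Ohm load, 1.5 pF -> ~1.06 GHz
```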
A simple diode detector can be quite a complex device, depending on the application.
