Power Detector Working Principle
I am currently trying to add a power detector to my receiver chain for temperature compensation, and I am curious about how exactly a power detector detects power.
Does it take the peak voltage through a diode bridge of some kind, and how does it set the bandwidth of the power detection? I assume there are types with digital capabilities as well as simpler ones; any guidance is appreciated.
Best Regards,
ktr
For RF power detectors, look at www.analog.com
You'll find a lot of valuable information about them there.
I agree, the Analog Devices parts are high quality and cost-effective.
You can build your own, but a good one uses a pair of Schottky diodes: one gets the DC bias plus the RF, the other only the same DC bias. You then amplify the difference between the two diode voltages, which gives you a pretty good idea of the RF power. A thermistor is often added to the op-amp circuit to further temperature-compensate the detector.
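To see why the difference between the two diodes tracks RF power, here is a minimal numerical sketch of that topology. It uses an ideal exponential diode model with hypothetical parameters (saturation current, ideality factor, bias voltage are all assumptions, not values from this thread), averages the RF-driven diode's current over one carrier cycle, and subtracts the bias-only reference diode. In the small-signal (square-law) region, doubling the RF amplitude roughly quadruples the difference, i.e. the detector output follows RF power rather than voltage:

```python
import math

# Hypothetical diode/bias parameters (assumptions for illustration only)
I_S = 1e-7      # saturation current [A]
N = 1.05        # ideality factor
V_T = 0.02585   # thermal voltage at ~27 C [V]
V_BIAS = 0.2    # shared DC bias applied to both diodes [V]

def avg_diode_current(v_rf_peak, n_samples=10_000):
    """Average current of an ideal exponential diode over one RF cycle."""
    total = 0.0
    for k in range(n_samples):
        theta = 2 * math.pi * k / n_samples
        v = V_BIAS + v_rf_peak * math.sin(theta)
        total += I_S * (math.exp(v / (N * V_T)) - 1)
    return total / n_samples

def detector_delta(v_rf_peak):
    """Difference between the RF-driven diode and the bias-only reference."""
    return avg_diode_current(v_rf_peak) - avg_diode_current(0.0)

d1 = detector_delta(0.005)   # 5 mV peak RF
d2 = detector_delta(0.010)   # 10 mV peak RF (4x the power)
print(d2 / d1)  # ~4 in the square-law region: delta tracks RF *power*
```

The subtraction is also what gives the temperature compensation mentioned above: both diodes drift with temperature in the same way, so the difference amplifier largely cancels that drift.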
In the early days, the very first power detectors used tubes or crystal (cat-whisker) detectors.
Amazingly, point-contact diodes (close relatives of the cat-whisker detectors) still have SOME advantages over Schottky-barrier power detectors:
they begin detecting at a lower RF input level (without any DC bias) and deliver a higher detected output voltage than Schottky-barrier diodes.
A good document covering the principles of all existing power detectors is the one from Boonton. Search the net for "Principles of Power Measurement - Boonton".
You can also look up RSSI (Received Signal Strength Indicator) circuits, which work on the same principles as power detection/measurement.
Such a circuit supplies a DC voltage that is linear with respect to the input power level in dBm (i.e. a logarithmic, linear-in-dB response).
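A linear-in-dB detector like that is usually characterized by just two calibration numbers, a slope (volts per dB) and an intercept (the dBm value where the fitted line crosses zero volts). The values below are hypothetical placeholders, roughly in the range typical of integrated log detectors; real parts must be calibrated from their datasheet or by measurement:

```python
# Hypothetical calibration constants (assumed, not from any specific part)
SLOPE_V_PER_DB = 0.025    # e.g. ~25 mV/dB
INTERCEPT_DBM = -84.0     # input level where the fit line crosses 0 V

def vout_to_dbm(v_out):
    """Map detector output voltage to input power in dBm (linear-in-dB fit)."""
    return v_out / SLOPE_V_PER_DB + INTERCEPT_DBM

def dbm_to_vout(p_dbm):
    """Inverse mapping: expected output voltage for a given input power."""
    return (p_dbm - INTERCEPT_DBM) * SLOPE_V_PER_DB

print(vout_to_dbm(1.0))   # 1.0 V -> -44 dBm with these assumed constants
```

In a receiver, this conversion typically runs on the MCU after the ADC reads the detector output, and the slope/intercept pair is where the temperature compensation correction can be applied.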