Input power at transistor port
This may be a really rookie question. I am designing a mixer and was wondering how I should choose the input powers in dBm if I know the desired input voltage magnitude.

The input impedance at the drain or gate of a FET is not 50 ohms, so I guess I should calculate the apparent power from the known voltage and the input impedance (calculated from the S-parameters). My guess is that this holds even if I match the port to 50 ohms. So if the voltage amplitude is 1 V and the input impedance is (100 - j*50) ohms (a totally made-up value for the sake of the question), then the power is (1 V)^2 / (2 * 111.8 ohm) = 4.47 mW -> 6.5 dBm.

Am I right? Or, for example, if the voltage amplitude is 1 V, is the power simply (1 V)^2 / (2 * 50 ohm) = 10 mW -> 10 dBm, and it is as simple as that?
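For reference, here is a quick sketch of the arithmetic in the question (plain Python, using the made-up impedance above). Note it also shows the real average power, P = |V|^2 * Re(Z) / (2 * |Z|^2), which differs slightly from the apparent power computed with |Z|:

```python
import math

def dbm(power_watts):
    """Convert power in watts to dBm."""
    return 10 * math.log10(power_watts / 1e-3)

V = 1.0                # voltage amplitude, volts
Z = complex(100, -50)  # made-up input impedance from the question, ohms

# Apparent power using |Z| (the calculation in the question)
S = V**2 / (2 * abs(Z))
# Real (average) power actually delivered into the complex impedance
P = V**2 * Z.real / (2 * abs(Z)**2)
# Power the same amplitude would correspond to in a plain 50 ohm load
P50 = V**2 / (2 * 50)

print(f"|Z| = {abs(Z):.1f} ohm")
print(f"apparent: {S * 1e3:.2f} mVA -> {dbm(S):.1f} dBm")
print(f"real:     {P * 1e3:.2f} mW  -> {dbm(P):.1f} dBm")
print(f"into 50R: {P50 * 1e3:.2f} mW -> {dbm(P50):.1f} dBm")
```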
Thanks!
Mixers are generally driven hard at the LO port, and impedance matching is not necessary there because the input impedance varies a lot under the hard-driven, rail-to-rail signal swing. Matching can be considered for the RF/IF ports, where the small-signal input impedance can be taken into account. However, matching is not a "must" even for those ports, except in some critical applications.
Very interesting! I did not know that before (about LO port matching). It makes sense.
Use a pad attenuator at the mixer LO port (as most RF mixers do) and this will solve all the impedance matching issues at that port.
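Just to illustrate the pad suggestion (my own sketch, not part of the answer): a symmetric pi attenuator matched to Z0 has its resistor values given by the standard formulas K = 10^(dB/20), R_shunt = Z0*(K+1)/(K-1), R_series = Z0*(K^2-1)/(2K):

```python
def pi_pad(atten_db, z0=50.0):
    """Resistor values for a symmetric pi attenuator matched to z0.

    K = 10^(dB/20); shunt arms R1 = z0*(K+1)/(K-1),
    series arm R2 = z0*(K^2-1)/(2K).
    """
    k = 10 ** (atten_db / 20)
    r_shunt = z0 * (k + 1) / (k - 1)
    r_series = z0 * (k**2 - 1) / (2 * k)
    return r_shunt, r_series

r1, r2 = pi_pad(3.0)
print(f"3 dB pad: shunt arms = {r1:.1f} ohm, series arm = {r2:.1f} ohm")
```

Because both ports of such a pad see close to Z0 regardless of what is on the far side (the attenuation dilutes the mismatch), the LO source sees a reasonably stable load even though the mixer's LO-port impedance swings around.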
Thank you! It really helped. But may I ask why this works so well? As far as I knew, a pi- or T-attenuator presents a 50 ohm impedance at both ports.
However, the original question still stands. If, for example, I bias the gate of the FET at -1 V and want to apply a voltage with a 1 V amplitude, how much power in dBm should I apply? Can it be calculated with a varying input impedance, or should I simulate the time-domain voltage waveform to find out?