Why does the return loss of an antenna decrease at higher frequencies?
Is there any mathematical proof for this?
Could you provide some more info (frequency range, size and type of antenna, etc.)?
@WimRFP
It is a dual-frequency antenna: at the first resonance the return loss is high, and at the second resonance the return loss is very low.
What do you think is the reason behind this?
Have you checked the gain (dB) vs. frequency plot? What is the gain at the second frequency? Is it positive or negative in dB? Are you sure that it is a new mode? If the gain is also low, the second frequency may just be a harmonic.
Just curious: high return loss means a big mismatch and low return loss means a good match, right? At different resonant frequencies the input impedance is different as well. So if the source impedance is fixed, say 50 or 75 Ohm, the return loss should be different at each resonant frequency, am I correct? Please help me understand.
It's the other way around: high return loss means a good match, i.e. the reflected signal is strongly attenuated relative to the incident signal.
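To make the convention concrete: return loss is RL = -20*log10|Gamma|, with Gamma = (Zin - Z0)/(Zin + Z0), so an input impedance close to the reference impedance gives a small reflection and a high RL in dB. Here is a minimal Python sketch of that relation; the 50-Ohm reference and the two example input impedances are made-up illustrative values, not taken from the poster's actual design:

```python
import numpy as np

def return_loss_db(z_in, z0=50.0):
    """Return loss in dB of an input impedance z_in against reference z0.

    RL = -20*log10(|Gamma|), Gamma = (z_in - z0) / (z_in + z0).
    High RL (in dB) means small reflection, i.e. a good match.
    """
    gamma = (z_in - z0) / (z_in + z0)
    return -20.0 * np.log10(abs(gamma))

# Hypothetical input impedances at the two resonances of a dual-band antenna:
z_first = 48 + 2j    # close to 50 Ohm -> good match, high return loss
z_second = 20 - 30j  # far from 50 Ohm -> poor match, low return loss

print(f"RL at first resonance:  {return_loss_db(z_first):.1f} dB")
print(f"RL at second resonance: {return_loss_db(z_second):.1f} dB")
```

With these assumed values the first resonance gives roughly 31 dB return loss and the second only about 5 dB, which is exactly the pattern described above: the further the input impedance at a given resonance sits from the fixed source impedance, the lower the return loss there.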
This is not a general rule. It may happen in your particular design, but the opposite can occur as well.
Regards
Z
Thank you all for your suggestions. Actually, I'm trying to devise a mathematical proof for this condition in my design.
@Zorro: I agree with you. That was the reason for asking for more info, but based on the info provided I can't give any useful feedback.