return loss at 0 frequency?
My inset-fed patch antenna simulation shows a return loss of -12.5 dB at 0 frequency. The attached picture shows the problem. I have changed almost all parameters (gap, feed width, feed length, etc.) to bring it up to 0 dB, but it is not happening.
How can I bring it up?
Looking forward to your advice.
thanks,
ramesh
Well, what is your bandwidth at 0 Hz?
0 Hz ~ DC, ?
FYI, your attachment is flagged by the forum admin and we can't open it. You might want to fix it.
Hello,
The picture is attached properly now.
Yes, the return loss is high at DC; the bandwidth can be seen in the figure.
If somebody has had similar experience, please share.
thanks
What does return loss at 0 Hz matter at all?
You didn't tell us how you obtained the results; we can only assume that the loss curve is plausible for your antenna design and/or simulation setup.
Every EM simulator has a lower frequency limit; DC is not included in the simulated range.
It's normal: the simulator extrapolated the curve down to DC, even though that extrapolation is wrong.
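To see why the DC point is meaningless: at DC a microstrip patch is an open circuit at the feed, so |S11| should be 1, i.e. 0 dB. A minimal Python sketch, for illustration only (the 0.237 magnitude is simply the value that would produce your -12.5 dB reading, not anything taken from your model):

import math

# S11 in dB from the reflection coefficient magnitude (what the simulator plots)
def s11_db(s11_mag):
    return 20 * math.log10(s11_mag)

print(s11_db(1.0))    # ideal DC open circuit: 0.0 dB, the physically expected value
print(s11_db(0.237))  # about -12.5 dB, like the extrapolated point in your plot

Anything the curve shows below the lowest solved frequency is that kind of extrapolation, not a solved result.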
thanks bigboss, FvM....
Can we bring this up by changing the value of the gap between the microstrip feed and the antenna?
Is there a formula for calculating the gap? Mine was a random guess.
thanks
You do not seem to understand that we are making fun of your post. There is no such thing as a "return loss" at 0 frequency.
Initially, I thought it was the gap between the microstrip and the patch, because I had put in a guess value for the gap since I didn't find any good supporting theory. But as all of you pointed out, especially bigboss, the curve seen at DC is not much of a concern; I understand that now.
I would still like to confirm the gap: what is the best gap value, if there is one?
thanks
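There is no exact closed-form formula for the inset notch gap; it is usually chosen empirically (a common starting point is somewhere around the feed-line width) and then swept in the simulator until the match is acceptable. What you can calculate from the standard transmission-line model are the patch dimensions and the inset depth for a 50-ohm feed. A minimal Python sketch, assuming example values for frequency, permittivity and substrate height (none of them taken from your design) and ignoring the mutual conductance between the radiating slots:

import math

c  = 3e8       # speed of light, m/s
f0 = 2.4e9     # design frequency, Hz (example value)
er = 4.4       # substrate relative permittivity (assumed FR-4)
h  = 1.6e-3    # substrate height, m (assumed)
z0 = 50.0      # target feed impedance, ohms

W    = c / (2 * f0) * math.sqrt(2 / (er + 1))                   # patch width
eeff = (er + 1) / 2 + (er - 1) / 2 / math.sqrt(1 + 12 * h / W)  # effective permittivity
dL   = 0.412 * h * (eeff + 0.3) * (W / h + 0.264) / ((eeff - 0.258) * (W / h + 0.8))
L    = c / (2 * f0 * math.sqrt(eeff)) - 2 * dL                  # patch length

lam0   = c / f0
G1     = (1 / 90) * (W / lam0) ** 2       # slot conductance (narrow-slot approximation)
R_edge = 1 / (2 * G1)                     # edge resistance, mutual conductance ignored
y0     = (L / math.pi) * math.acos(math.sqrt(z0 / R_edge))      # inset depth for ~50 ohm

print(f"W = {W*1e3:.1f} mm, L = {L*1e3:.1f} mm, inset depth y0 = {y0*1e3:.1f} mm")

Treat these numbers only as a starting point; the notch gap itself mainly perturbs the match, so parameter-sweep it around the feed-line width in your simulator rather than looking for an exact formula.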