Patch Efficiency and Substrate Thickness
I would like to clarify a question that has been haunting me lately: what is the relation between substrate thickness and radiation efficiency for a patch antenna designed on a substrate with Dk = 6?
During my simulations I have observed that as I increase the substrate thickness, the radiation efficiency increases as well (until surface-wave excitation kicks in). Could someone please explain to me why?
Intuitively I understand that a large Dk and a thin substrate lead to a higher electric field density in the substrate and weaker fringing fields... however, if the energy that the patch receives is not being radiated, where is it being lost? In the dielectric?
If I have not made myself clear, please let me know.
Thank you for your time.
With best regards,
Rui Gomes
The patch antenna efficiency increases with substrate thickness mainly because of the increase in radiated power.
Beyond a certain thickness, however, the efficiency starts decreasing due to end-fire radiation, a higher cross-polar level, and the excitation of surface waves.
The power lost to surface waves increases with the normalized thickness h/λo of the substrate.
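To see why a thin substrate hurts efficiency, it helps to look at the loss budget in terms of quality factors, as in the cavity model: the dielectric Q is roughly 1/tanδ, the conductor Q is roughly h/δs (skin depth δs), and the radiation Q grows as the substrate gets thinner. The sketch below is only illustrative: the frequency, loss tangent, and especially the Q_rad scaling constant are assumptions, and surface waves are left out; it is the trend with h that matters.

```python
# Rough cavity-model loss-budget sketch for a patch on a Dk = 6 substrate.
# f, tan_d and the Q_rad constant are assumed/illustrative values, not design data.
import math

f = 2.45e9                 # operating frequency [Hz] (assumed)
eps_r = 6.0                # substrate Dk
tan_d = 0.002              # dielectric loss tangent (assumed)
sigma = 5.8e7              # copper conductivity [S/m]
mu0 = 4e-7 * math.pi
lam0 = 3e8 / f

for h_mm in (0.2, 0.5, 1.0, 1.6, 3.2):
    h = h_mm * 1e-3
    delta_s = 1.0 / math.sqrt(math.pi * f * mu0 * sigma)   # skin depth (~1.3 um for Cu at 2.45 GHz)
    Q_c = h / delta_s                                      # conductor Q grows with thickness
    Q_d = 1.0 / tan_d                                      # dielectric Q, independent of thickness
    Q_rad = 50.0 * (1e-3 / h)                              # illustrative: radiation Q ~ 1/h
    # Surface waves are ignored here; they add a further 1/Q_sw term that grows with h/lam0.
    Q_total = 1.0 / (1.0/Q_c + 1.0/Q_d + 1.0/Q_rad)
    eff = Q_total / Q_rad                                  # radiation efficiency = P_rad / P_accepted
    print(f"h = {h_mm:4.1f} mm   h/lam0 = {h/lam0:5.3f}   efficiency ~ {100*eff:4.1f} %")
```

With these numbers the efficiency climbs from roughly 30 % at 0.2 mm toward 90 % at 1.6 mm: for a thin substrate the radiation Q is high (little power escapes) while the conductor and dielectric losses stay, so they take a larger share of the accepted power.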
Thank you very much. But when the substrate thickness is very small, where is the power lost/dissipated, since it is not radiated? In the conductor or in the substrate?
Much of the dissipation in patch antennas comes from substrate (dielectric) losses.
You can easily find this out for your geometry with two test cases (see the sketch after the list):
1) set conductor loss to zero (only dielectric loss)
2) set dielectric loss to zero (only conductor loss)
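Once you have the two simulated radiation efficiencies, a quick way to turn them into a loss breakdown is sketched below. The function name and the numbers are placeholders, not part of any solver API; for small losses the two mechanisms are roughly additive.

```python
# Hypothetical post-processing of the two test cases suggested above.
# e_diel_only : simulated radiation efficiency with conductor loss set to zero
# e_cond_only : simulated radiation efficiency with dielectric loss set to zero
def loss_breakdown(e_diel_only: float, e_cond_only: float) -> None:
    p_diel = 1.0 - e_diel_only      # fraction of accepted power dissipated in the dielectric
    p_cond = 1.0 - e_cond_only      # fraction of accepted power dissipated in the conductors
    e_total_est = 1.0 - p_diel - p_cond   # rough estimate for the fully lossy case
    print(f"dielectric loss            : {100*p_diel:5.1f} % of accepted power")
    print(f"conductor loss             : {100*p_cond:5.1f} % of accepted power")
    print(f"estimated total efficiency : {100*e_total_est:5.1f} %")

loss_breakdown(e_diel_only=0.82, e_cond_only=0.90)   # placeholder values, use your own results
```

Comparing the two contributions for a thin, high-Dk substrate should show the dielectric term dominating, which answers where the non-radiated power goes.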