Far-field distance
To determine the far-field distance, one first assumes a parallel-ray approximation for the fields and then does a binomial expansion of the distance from the observation point to the source point. The far-field distance is taken to be the value for which the path-length deviation caused by neglecting the non-linear term is a 16th of a wavelength. Doing so, one ends up with
r_ff = 2D²/λ.
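For reference, here is a sketch of the expansion I mean (the standard textbook setup, with the worst-case source point a distance D/2 off axis, where D is the largest antenna dimension):

$$R = \sqrt{r^2 + r'^2 - 2 r r' \cos\psi} \approx r - r'\cos\psi + \frac{r'^2 \sin^2\psi}{2r} + \dots$$

The far-field phase approximation keeps only the first two terms; the neglected quadratic term is largest for r' = D/2 and sin ψ = 1, where it equals D²/(8r). Setting that maximum error to a 16th of a wavelength,

$$\frac{D^2}{8\,r_\mathrm{ff}} = \frac{\lambda}{16} \quad\Rightarrow\quad r_\mathrm{ff} = \frac{2D^2}{\lambda}.$$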
My question is: why did we choose a 16th of a wavelength instead of something else?
I have always viewed this as somewhat arbitrary. The 1/16-wavelength error does not disturb the radiation pattern too much, and at this distance (or beyond) one sees a pattern that is close to what one would see at a very large distance.
For some antennas, particularly where low sidelobe levels are of interest, measurements are made at distances greater than 2D²/λ. One may see 4 or 8 used as the multiplier instead of 2, with a smaller phase error resulting.
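To make the multipliers concrete, here is a minimal sketch in Python (the 0.5 m aperture and 10 GHz frequency are just assumed example numbers) computing the test distance and the worst-case path-length and phase errors for multipliers of 2, 4 and 8:

import math

# Hypothetical example: a 0.5 m aperture measured at 10 GHz.
D = 0.5                          # largest aperture dimension, metres (assumed)
f = 10e9                         # frequency, Hz (assumed)
lam = 299_792_458.0 / f          # wavelength, metres

for k in (2, 4, 8):              # multiplier in r = k * D**2 / lam
    r = k * D**2 / lam           # measurement distance
    dpath = D**2 / (8 * r)       # worst-case neglected path-length term
    dphase = 360.0 * dpath / lam # corresponding phase error, degrees
    print(f"k={k}: r = {r:6.1f} m, path error = lambda/{lam/dpath:.0f}, "
          f"phase error = {dphase:4.1f} deg")

A multiplier of 2 reproduces the familiar λ/16 (22.5 degree) worst-case phase error, 4 gives λ/32 (about 11.3 degrees), and 8 gives λ/64 (about 5.6 degrees), which is why the larger multipliers show up in low-sidelobe work.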
Silver has a series of interesting plots showing how a pattern changes with distance. IEEE Antennas and Propagation Magazine also had an article relating measurement distance to sidelobe disturbance. I don't recall the issue, but you might want to search for it to see if you can re-locate the reference.
Why not, for example, 1/15 instead of 1/16? It is not a big thing to know, but I still wonder whether there is some motivated reason behind it.
@jone
In my opinion there is no special reason. In a broader sense, one can choose the distance beyond which the spherical wave is locally close to a plane wave (the wave shape no longer changes with distance, only its power).
I would rather reckon this distance by checking that the Pt/Pr ratio varies linearly.
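As a rough illustration of what I mean (a minimal sketch assuming ideal free-space Friis conditions; the gains, frequency, and antenna size are just made-up example values), one can compute the expected Pr/Pt trend with distance and compare it against measurements, checking from which distance onward the measured ratio follows it:

import math

# Hypothetical example values; in practice use your own measured data.
Gt, Gr = 10.0, 10.0              # transmit/receive gains, linear (assumed)
f = 2.4e9                        # frequency, Hz (assumed)
lam = 299_792_458.0 / f          # wavelength, metres
D = 0.3                          # largest antenna dimension, metres (assumed)
r_ff = 2 * D**2 / lam            # conventional far-field distance

for r in (0.5, 1.0, 2.0, 5.0, 10.0):
    # Friis free-space ratio; only expected to hold for r >= r_ff.
    pr_over_pt = Gt * Gr * (lam / (4 * math.pi * r))**2
    note = "far field" if r >= r_ff else "near field (Friis not reliable)"
    print(f"r = {r:5.1f} m: Pr/Pt = {10*math.log10(pr_over_pt):6.1f} dB  [{note}]")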
@Azulykit
I am always a bit confused by the phase-center concept, but one thing I am sure of is that it mainly matters for directive UWB antennas such as horns, and depends on the type of modulation scheme. For example, OFDM can make it easier to neglect the phase center.
