received power
Thanks
It is possible. This is the reason for using high-gain antennas: to compensate for the path loss.
Now:
- A 30 dBi antenna gain can usually be obtained with a dish antenna (e.g. 1.8 m at 2.4 GHz).
- A 20 dB free-space path loss at 2.4 GHz corresponds to a distance of only about 10 cm.
What would be the reason to place two 1.8 m dishes face-to-face at a 10 cm distance?
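For anyone who wants to check, here is a minimal Python sketch of the budget implied by those numbers (the 0 dBm transmit power, the two 30 dBi gains, and the 2.4 GHz carrier are the assumed values from this example):

```python
# Minimal sketch of the link budget above: 0 dBm transmit power,
# two 30 dBi dishes, 2.4 GHz carrier (assumed example values).
import math

lam = 3e8 / 2.4e9                              # wavelength, ~0.125 m

def fspl_db(d_m):
    """Free-space path loss in dB: 20*log10(4*pi*d/lambda)."""
    return 20 * math.log10(4 * math.pi * d_m / lam)

# Distance at which the free-space path loss is only 20 dB:
d = 10 ** (20 / 20) * lam / (4 * math.pi)      # FSPL formula inverted
print(f"20 dB FSPL distance: {d * 100:.1f} cm")        # ~9.9 cm

# Link budget in dB terms: Pr = Pt + Gt + Gr - PL
pr_dbm = 0 + 30 + 30 - fspl_db(d)
print(f"Predicted received power: {pr_dbm:+.1f} dBm")  # +40 dBm -- unphysical
```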
Thanks Vfone. It's not a practical problem. I just don't think I can receive +40 dBm of power from a 0 dBm source using a passive component (an antenna). Maybe I'm wrong?
It is normal to wonder how power can get so high using only the gain of the antennas.
This is the reason why many RF certification specs place stringent requirements not only on the maximum conducted output power of the transmitter, but also on the maximum antenna gain of the system.
By the law of conservation of energy it is of course impossible to receive more power from an antenna than the sender supplies to it. It would make a nice perpetuum mobile otherwise. :D One possible systematic error could be ignoring near-field conditions.
If the antenna were amplifying the signal according to the antenna gain then yes, it could receive more power, but common sense tells you that doesn't seem right. The antenna itself cannot amplify a signal; all it can do is try to direct most of the energy in a specific direction. An isotropic antenna is a theoretical antenna that radiates power equally in all directions. The gain of an antenna is the maximum power it can radiate in a specific direction compared to the power an isotropic antenna would radiate in any given direction (since they're all equal).
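One common way to write that definition (notation assumed here: U_max is the peak radiation intensity in W/sr, P_in the accepted input power):

$$ G_{\text{dBi}} = 10\log_{10}\!\left(\frac{U_{\max}}{P_{\text{in}}/4\pi}\right) $$

where P_in/4π is the radiation intensity an ideal lossless isotropic antenna would produce from the same input power.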
Your calculations forgot to include the free-space loss, which is considerably more than the gain of the antennas and is proportional to r², r being the distance between the two antennas.
Look up or google the Friis transmission equation for more info on the subject.
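In its usual free-space, far-field form (d is the antenna separation, λ the wavelength):

$$ \frac{P_r}{P_t} = G_t\,G_r\left(\frac{\lambda}{4\pi d}\right)^{2} \quad\Longleftrightarrow\quad P_r\,[\mathrm{dBm}] = P_t\,[\mathrm{dBm}] + G_t\,[\mathrm{dBi}] + G_r\,[\mathrm{dBi}] - 20\log_{10}\frac{4\pi d}{\lambda} $$

The last term is the free-space path loss.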
I hope this helps.
The above discussion did claim to consider free-space loss; if you read it thoroughly, it's designated PL (path loss). I didn't check the calculation, but the assumed distance is far below 10λ, so that alone can be reason enough for a wrong result.
The initial post didn't specify any working frequency or distance between the antennas.
I gave a possible example myself (at 2.4 GHz), which is certainly unrealistic, mainly because the distance falls inside the radiating near-field region, R = 2D²/λ.
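Plugging the numbers from my example into that formula (D = 1.8 m dish diameter, λ ≈ 0.125 m at 2.4 GHz):

$$ R = \frac{2D^{2}}{\lambda} = \frac{2\,(1.8\ \mathrm{m})^{2}}{0.125\ \mathrm{m}} \approx 51.8\ \mathrm{m}, $$

so a 10 cm separation is deep inside the near field (and also far below the 10λ ≈ 1.25 m rule of thumb mentioned above), where the far-field Friis equation does not apply.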
You are not wrong in the calculations.
The wrong guess is "path loss is 20 dB". In practice it is much more.
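As a worked example (the 100 m separation is an assumed figure, continuing the 2.4 GHz case with two 30 dBi dishes):

$$ PL = 20\log_{10}\frac{4\pi\cdot 100\ \mathrm{m}}{0.125\ \mathrm{m}} \approx 80\ \mathrm{dB}, \qquad P_r = 0 + 30 + 30 - 80 = -20\ \mathrm{dBm}, $$

well below the transmitted power, as expected.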
As was stated earlier, one can never get more power out of a passive device than one puts in.
A pair of high gain antennas can make a very efficient transmission line. Look up "beam waveguide".
I like to think of this from the viewpoint that the full gain of an antenna is not realized until one is somewhat distant from that antenna. Once the pattern has solidified, the space loss is greater than the gain.
Another interesting line of study is a comparison of cable or waveguide transmission-line loss vs. antennas over long distances. It turns out that the antenna-based channels are less lossy, as the sketch below illustrates.
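A small Python sketch of that comparison (the ~0.22 dB/m coax attenuation and the 30 dBi antenna gains are assumed, illustrative figures, not measured data; the far-field formula is applied throughout for simplicity):

```python
# Compare a coax run against an antenna-to-antenna link at 2.4 GHz.
import math

lam = 3e8 / 2.4e9          # wavelength at 2.4 GHz, ~0.125 m
cable_db_per_m = 0.22      # assumed low-loss coax attenuation (dB per meter)
ant_gain_dbi = 30          # assumed gain of each dish antenna

for d in (10, 100, 1000):  # link distances in meters
    cable_loss = cable_db_per_m * d                  # grows linearly with distance
    fspl = 20 * math.log10(4 * math.pi * d / lam)    # grows only logarithmically
    link_loss = fspl - 2 * ant_gain_dbi              # net loss including both gains
    print(f"{d:>5} m: cable {cable_loss:6.1f} dB | antenna link {link_loss:6.1f} dB")
```

Cable loss in dB scales linearly with length while free-space loss grows only logarithmically with distance, which is why long-haul channels use antennas.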
Thanks for all posts. I think I understand now that the condition above cannot happen in real life, so I can use the equation confidently for my real problem.
