Antenna factor

In electromagnetics, the antenna factor is defined as the ratio of the incident electric field strength E to the voltage V (units: V or µV) induced across the terminals of an antenna. The voltage measured at the output terminals of an antenna is not a direct measure of the field intensity, because it depends on the antenna's gain, aperture characteristics, and loading effects.[1]

For an electric field antenna, the field strength E is in units of V/m or µV/m and the resulting antenna factor AF is in units of 1/m:

$$AF = \frac{E}{V}$$

If all quantities are expressed logarithmically in decibels instead of SI units, the above equation becomes

$$AF_{\mathrm{dB/m}} = E_{\mathrm{dB\mu V/m}} - V_{\mathrm{dB\mu V}}$$
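As a worked example (with illustrative values, not taken from the source): if a field of E = 90 dBµV/m produces V = 70 dBµV at the antenna terminals, the antenna factor is

$$AF = 90\ \mathrm{dB\mu V/m} - 70\ \mathrm{dB\mu V} = 20\ \mathrm{dB/m}$$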

For a magnetic field antenna, the field strength H is in units of A/m and the resulting antenna factor $AF_H = H/V$ is in units of A/(V·m). For the relationship between the electric and magnetic fields, see the impedance of free space.
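In free space the two antenna factors are linked through the impedance of free space $Z_0 \approx 377\ \Omega$, since $E = Z_0 H$ (a standard relation, not spelled out in the source):

$$AF_E = \frac{E}{V} = \frac{Z_0 H}{V} = Z_0\,AF_H$$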

For a 50 Ω load, knowing that $P_D A_e = P_r = V^2/R$ and $E^2 = 377\,P_D$, the antenna factor is developed as:

$$AF = \frac{E}{V} = \sqrt{\frac{377\,P_D}{P_D\,A_e\,R}} = \sqrt{\frac{377}{50\,A_e}} \approx \frac{2.75}{\sqrt{A_e}} = \frac{9.73}{\lambda\sqrt{G}}$$

where $P_D$ is the power density of the incident wave, $P_r$ is the power delivered to the receiver load $R = 50\ \Omega$, $A_e = \lambda^2 G/(4\pi)$ is the antenna's effective aperture, $G$ is the antenna gain, and $\lambda$ is the wavelength.
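As an illustrative calculation (values assumed here for concreteness): a half-wave dipole with gain $G \approx 1.64$ at 300 MHz ($\lambda = 1\ \mathrm{m}$) into a 50 Ω load gives

$$AF \approx \frac{9.73}{1\ \mathrm{m} \times \sqrt{1.64}} \approx 7.6\ \mathrm{m^{-1}},$$

or about $20\log_{10}(7.6) \approx 17.6\ \mathrm{dB/m}$.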

For antennas which are not defined by a physical area, such as monopoles and dipoles consisting of thin rod conductors, the effective length is used instead to relate the field strength E to the terminal voltage V.
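Under this convention (a standard relation, assuming the terminal voltage is taken as the open-circuit value $V = E\,\ell_{\mathrm{eff}}$), the antenna factor reduces to the reciprocal of the effective length:

$$AF = \frac{E}{V} = \frac{1}{\ell_{\mathrm{eff}}}$$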

Notes

  1. Electronic Warfare and Radar Systems - Engineering Handbook (4th ed.). US Naval Air Warfare Center Weapons Division. 2013. p. 192.
