Peter wrote:
In the electric power industry there is increasing public concern regarding
fields around power lines. The general public often refers to these fields
as electromagnetic radiation, which they are not. We are generally concerned
with magnetic fields and less often with electric fields. The two
are treated as separate issues.
I believe the same is true of antennas: that is, the electric and magnetic
fields are separate when you are close in to the antenna in terms of
wavelength.
Question:
If this assumption is correct, at what point or distance do the two
relatively independent fields become the one all-important electromagnetic
wave?
by "electromagnetic wave" you really mean freely propagating wave... and
by definition it's at the point where the ratio of E/H = 377 ohms, aka
"the far field".
Closer in, the ratio between E and H may not be 377 ohms, and energy is
moving back and forth between the fields and the antenna (or between the
E and H fields, depending on your conceptual model)... a region also
called the "reactive near field" or just the "near field".
Lots of folks also talk about a "transition zone" where there's both
significant propagating wave and stored energy.
None of these boundaries are hard and fast... there are some conventions
used in connection with things like antenna ranges. The conventions are
derived from an underlying assumption that the idealized behavior isn't
significantly different (in a measurement sense) from what you actually measure.
e.g. the "far field" assumption for antenna range distance
(2*D^2/lambda) is where the deviation of the spherical wavefront from
the assumed perfectly flat is small enough that the error in the
boresight gain measurement is "small" compared to other effects. (It's
comparable to the Rayleigh criterion for optical reflectors)
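(Putting a number on that convention, with illustrative values of my own
choosing rather than anything from the post:

    # conventional far-field (Fraunhofer) distance 2*D^2/lambda
    c = 299_792_458.0                  # speed of light, m/s

    def far_field_distance(diameter_m, freq_hz):
        lam = c / freq_hz              # wavelength, m
        return 2 * diameter_m**2 / lam

    print(far_field_distance(3.0, 10e9))    # 3 m dish at 10 GHz -> ~600 m
    print(far_field_distance(0.5, 2.4e9))   # 0.5 m dish at 2.4 GHz -> ~4 m

which shows why the far-field distance gets inconveniently long for
electrically large antennas.)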
Peter VK6YSF
http://members.optushome.com.au/vk6ysf/vk6ysf/main.htm