Dr. Slick wrote:
Roy Lewallen wrote in message
...
Here's the problem with that transformer concept again. A field is not a
voltage. So you can't measure it with a voltmeter. You can convert the
fields to voltages and currents by use of a transducer -- an antenna --
then you can measure the voltage and current from the antenna with
ordinary meters.
I agree with you that the field is first converted by the antenna
before it can be measured.
But by definition, the E field is definitely related to voltage
potential.
Well, yes, speed (meters per second) is related to distance. Force
(Newtons) is related to work (Newton-meters). But speed isn't distance,
and force isn't work. The mass of the Earth is related to its orbital
velocity, and mass certainly isn't velocity. Worse yet, the impedance of
free space isn't a measure of the same thing as the characteristic
impedance of a transmission line. What I'm trying to illustrate is that
because two things are related doesn't make them the same thing, or
necessarily even close to the same thing.
Hugh Skilling's _Fundamentals of Electric Waves_: "Voltage from point 1 to
point 2 is the line integral of the electric field along any path from
point 1 to point 2. This is the amount by which point 1 is at a
higher potential than point 2."
Yes, that relates voltage and electric field. Don't overlook the bit
about the path, though.
Say PD = E^2/Z0 = H^2 * Z0. If you say the power density =
V^2/(R*m^2), and R = Z0, then these will cancel, giving you E in
V/meter, which are the correct units. So here we are equating the
impedance of free space with a resistive impedance or load.
No, all you're doing is showing that they have the same dimensions. It
just doesn't seem to be sinking in that having the same dimensions
doesn't make two quantities the same thing. I've tried with the example
of torque and work, but that doesn't seem to be having any effect. Maybe
someone else can present some other examples, and maybe, just maybe,
with enough examples the concept will sink in.
Roy, what do you think 1uV/meter really means in terms of how you
measure it? I mean, what conditions must you have in order to measure
this 1uV/meter?
As I mentioned before, it's usually measured with a short probe.
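Just to sketch the arithmetic behind such a probe reading (the probe
length, the open-circuit assumption, and the factor of one-half for the
effective length of an electrically short dipole are all illustrative
assumptions here, not a calibration procedure):

    # Rough sketch: field strength from a short-dipole probe reading.
    # Assumes an electrically short dipole whose effective length is
    # about half its physical length, and an open-circuit (very high
    # impedance) voltage measurement.
    def field_from_probe(v_open_uV, probe_length_m):
        l_eff = probe_length_m / 2.0      # effective length, meters
        return v_open_uV / l_eff          # field strength, uV/meter

    # Example: 0.05 uV open-circuit across a 0.1 m probe -> 1 uV/meter
    print(field_from_probe(0.05, 0.1))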
But electric field is actually defined in terms of the force on a
charge. You'll find an explanation in any basic physics text, as well as
many places on the Web. In Weidner and Sells, _Elementary Classical
Physics_, Vol. 2, the authors define electric field E as F/q, or the
force that would be exerted on a (sufficiently small) charge at the
point at which the field is being measured. They explain that the units
of electric field are newtons per coulomb which, it turns out, has the
same dimensions as volts per meter. So to your argument that electric
field is "related" to voltage, it's equally related to distance, force,
and charge. You can, in fact, find a whole bundle of other products and
quotients of units that are dimensionally equivalent.
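If you want a quick sanity check on that, here's one way to do the
bookkeeping (just dimensional arithmetic in SI base units, nothing more):

    # Express newton/coulomb and volt/meter in SI base units
    # (kg, m, s, A) and confirm the exponents come out the same.
    newton  = {'kg': 1, 'm': 1, 's': -2}            # kg*m/s^2
    coulomb = {'A': 1, 's': 1}                      # A*s
    volt    = {'kg': 1, 'm': 2, 's': -3, 'A': -1}   # kg*m^2/(s^3*A)
    meter   = {'m': 1}

    def divide(a, b):
        # Subtract exponents; drop anything that cancels to zero.
        return {k: a.get(k, 0) - b.get(k, 0)
                for k in sorted(set(a) | set(b))
                if a.get(k, 0) - b.get(k, 0) != 0}

    print(divide(newton, coulomb))   # force per charge
    print(divide(volt, meter))       # prints the same exponents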
I'm starting to think that what this really means is that an
exploring particle with a unit positive charge, when placed in an
electric field of 1uV/meter, will experience a change of voltage
potential of +1uV when it is moved directly towards an isotropic
radiator ("the potential of a point in space is the work required to
move to that point a unit positive charge, starting an infinite
distance away...potential increases as one positive charge is moved
closer to another positive charge" - Skilling).
Here we are again. Potential and voltage have the same dimensions, but
aren't necessarily equal. And as far as I can tell, "voltage potential"
is meaningless. To quote from Holt, _Electromagnetic Fields and Waves_,
"When the electromagnetic fields are static, as we shall see, the
voltage drop along a path equals the potential drop between the end
points of the path. Furthermore, these quantities [voltage and electric
potential] are also equal in *idealized* electric circuit diagrams, and
they are approximately equal in physical circuits, provided voltmeter
leads do not encircle appreciable time-changing magnetic flux." Pay
particular attention to the last qualification. When a time-changing
magnetic field is present, the voltage drop between two points depends
on the path taken, while the potential drop is simply the difference in
potential between the two points. So the voltage between two points in
an electromagnetic field can be just about anything you'd like it to be.
Good thing, too. Otherwise we'd all get electrocuted by the Earth's 100
volt/meter field. (And that's on a day with no storm nearby.)
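(If you want to see that path dependence in numbers, here's a toy
example with assumed values: a uniform magnetic field changing inside a
circular region, and E integrated from one side of the region to the
other along the two semicircular halves of its boundary.)

    import math

    # Illustrative numbers only.
    R     = 0.1        # radius of region with changing B, meters
    dB_dt = 1.0        # rate of change of B, tesla per second

    # The induced E at the boundary is azimuthal, magnitude (R/2)*dB/dt.
    E_phi = (R / 2.0) * dB_dt            # volts per meter

    # Integrate E from point A to the diametrically opposite point B
    # along the two semicircles (each of length pi*R). The induced
    # field circulates one way around, so the two paths give opposite
    # signs -- two different "voltages" between the same two points.
    path_1 = +E_phi * math.pi * R        # going with the induced field
    path_2 = -E_phi * math.pi * R        # going against it

    print(path_1, path_2)                           # they differ
    print(path_1 - path_2, math.pi * R**2 * dB_dt)  # difference = dPhi/dt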
But that's a static field, so we don't have to worry about
touching metallic objects that aren't grounded.
Sorry, I don't see how that's relevant. If field and voltage were the
same, we'd be in trouble, static or not.
Perhaps the far-field measurements would require too sensitive a
field-strength meter? Or maybe it's just more convenient to measure
up close.
No, it's far field measurements that are more common. One problem with
making near field measurements is that the near field varies all over
the map with the type of antenna and the exact spot where you're making
the measurement. And it's of no importance to anything very far
away. I've only seen near field probing done to locate the source
of a problem emission. Compliance measurements are usually done with
far-field techniques, in or at least at the fringes of the far field.
The "within the near field" measurements I'm referring to are HF
measurements done at distances that aren't firmly in the far field. (The
far field boundary depends on the nature of the radiating structure, and
is nebulous anyway.) The FCC addresses this issue for Part 15 somewhat
in section 15.31(f).
I was going to ask you to define "far-field", and I thought maybe
people defined this as a number of wavelengths away, but if it's
nebulous like a lot of RF topics, then I would certainly understand.
Actually, "far field" is often defined as a distance of 2L^2/lambda,
where L is the length of the antenna and lambda the wavelength, although
the authors generally admit to its being quite arbitrary (cf. Kraus,
_Antennas_). In Jordan and Balmain's _Electromagnetic Waves and
Radiating Systems_, they explain that the far field approximation is
valid at distances large compared to a wavelength and to the largest
dimension of the source. This is a somewhat conservative definition.
True far field wave characteristics occur only at an infinite distance
from a source, and how close you can come and still have the
characteristics be close enough to true far field depends on the
application as well as the antenna. So there's no strict definition. For
a lot of antennas and applications, field characteristics are close
enough to being far field at a distance of well under a wavelength. For
others, many wavelengths are required. Ian posted a good summary of
salient far field characteristics several days ago.
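(For a feel for how much the 2L^2/lambda figure moves around, here's a
trivial calculation for a couple of assumed antennas; the sizes and
frequencies below are just examples.)

    # Conventional far-field distance 2*L^2/lambda, per Kraus.
    c = 3.0e8                              # speed of light, m/s

    def far_field_distance(length_m, freq_hz):
        wavelength = c / freq_hz
        return 2.0 * length_m ** 2 / wavelength

    # 10 m dipole at 14 MHz (lambda about 21.4 m): roughly 9.3 m,
    # well under a wavelength.
    print(far_field_distance(10.0, 14.0e6))

    # 1 m dish at 10 GHz (lambda = 0.03 m): roughly 67 m,
    # over two thousand wavelengths.
    print(far_field_distance(1.0, 10.0e9))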
RF isn't any more nebulous than any other aspect of engineering.
Engineering is a practical discipline, so compromises and trade-offs are
universally necessary. Because we deal with real, physical objects and
are stuck with real measurements, the absolute precision of mathematics
and the pure sciences is never attainable. This is as true for using an
I-beam as it is for RF design. But the principles of RF are at least as
well known as the properties of I-beams. In fact, a good argument could
be made that RF is better known.
Roy Lewallen, W7EL