Thread: 50 Ohms "Real Resistive" impedance a Misnomer?
#9, July 15th 03, 06:57 PM
Tom Bruhns
(Dr. Slick) wrote in message . com...
....
Why do I ask all this? Well, if you believe that complex
impedance measurements (series equivalent) by MFJ antenna analyzers
are not completely inaccurate, then it appears that two 1/4 watt 100
Ohm resistors in parallel (lead lengths short) are a much more
consistent 50 Ohms over the VHF band than almost all the higher power
dummy loads we have tested.
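[Editor's note: a quick sketch, not part of the original post, of the parallel combination the analyzer should see. The point of the two-resistor load is that the DC-equivalent value is exactly 50 ohms and the short 1/4 W leads add very little parasitic reactance at VHF.]

```python
# Illustration only (not from the original post): equivalent resistance
# of resistors connected in parallel.  Two 100-ohm parts give 50 ohms.
def parallel(*resistances):
    """Equivalent resistance of resistors connected in parallel."""
    return 1.0 / sum(1.0 / r for r in resistances)

print(parallel(100.0, 100.0))  # -> 50.0
```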
Problem is, the high power dummy loads will vary from 52 to 45
"real" ohms depending on the frequency, with the "real" part of the
impedance getting lower with increasing frequency, so it doesn't seem
to be a "skin effect". The spread gets much worse when you put a 3'
jumper coax in between, and even worse when you add a power/SWR
meter. Then the "real" ohms will range from 65 to 35 ohms, with the
maxima and minima not correlating with frequency at all, and the stray
reactances will be much larger too, but just as varied with frequency.
So much for "50 ohm" jumper cables! I suppose they are as close as
they can get them for a particular price.
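[Editor's note: the "doesn't seem to be a skin effect" remark can be checked with the standard skin-depth formula. Skin depth shrinks with frequency, so conductor resistance rises roughly as the square root of frequency; it cannot explain a real part that falls as frequency goes up. A minimal sketch, assuming copper resistivity; this is my illustration, not from the post.]

```python
import math

def skin_depth_m(f_hz, resistivity=1.68e-8, mu_r=1.0):
    """Skin depth in meters; defaults assume copper at room temperature."""
    mu = mu_r * 4.0e-7 * math.pi  # permeability, H/m
    return math.sqrt(resistivity / (math.pi * f_hz * mu))

# Depth shrinks with frequency, so AC resistance *rises* roughly ~ sqrt(f):
for f in (10e6, 50e6, 150e6):
    print(f"{f / 1e6:5.0f} MHz: {skin_depth_m(f) * 1e6:5.1f} um")
```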
My theory is that the "real" part of the impedance is mainly the
truly resistive 50 ohms of the dummy load at low frequencies around 10
MHz or so...but as you go up in frequency, the parasitics of the dummy
load and the coax jumper cable will cause "radiation" resistance to be
mixed in with this truly real 50 ohms, giving us readings all over the
map.
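[Editor's note: one mechanism that reproduces this kind of spread without invoking radiation resistance is ordinary transmission-line transformation: a jumper whose characteristic impedance is slightly off 50 ohms rotates the load impedance with frequency, so the measured "real" part drifts even when the load is perfect. A sketch using the standard lossless-line input-impedance formula; the 45-ohm jumper impedance and 0.66 velocity factor are assumed values for illustration, not measurements from the thread.]

```python
import math

C = 299_792_458.0  # speed of light, m/s

def z_in(z_load, z0, f_hz, length_m, vf=0.66):
    """Input impedance of a lossless line of impedance z0 feeding z_load."""
    beta_l = 2.0 * math.pi * f_hz * length_m / (vf * C)  # electrical length
    t = math.tan(beta_l)
    return z0 * (z_load + 1j * z0 * t) / (z0 + 1j * z_load * t)

# A 3 ft (0.914 m) jumper with an assumed Z0 of 45 ohms feeding a
# perfect 50-ohm load: the "real" part drifts with frequency even
# though the load itself never changes.
for f_mhz in (10, 50, 100, 150):
    z = z_in(50.0, 45.0, f_mhz * 1e6, 0.914)
    print(f"{f_mhz:3d} MHz: {z.real:5.1f} {z.imag:+6.1f}j ohms")
```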
What do you folks think?
With just a bit of experience in this area, I think you need to find a
way to control your experiments more carefully so you KNOW what's
going on, to a better approximation. I'm quite sure it's possible to
build fairly high power dummy loads--at least high enough power to
handle legal amateur transmitter outputs--that present much better
matches than you describe. Our most usual problem in calibrating
precision instruments at high frequencies is in finding cables which
are really close to 50 ohms and are also practical for routine tests.
Some of the other parts such as power splitters present problems, too.
But we've gotten pretty good at figuring out just where the errors
creep in, through a combination of analysis and experiments. For
sure, our work builds on a tremendous amount of work that has gone on
in the past in the area of precision RF measurements. We currently
work at frequencies from DC to several GHz, and worry in the GHz range
about errors equivalent to an ohm or so...less down at 100MHz and
below.
But for ham applications, do you really care about all this? There's
probably not a need to, if you're just trying to get power to an
antenna, but if you care because you simply want to learn how to build
a precision system and make precision measurements, that's fine too.
You should be able to find lots of info on the web about that...try,
for example, to find Hewlett-Packard/Agilent app notes on RF
measurements and calibrations.
Cheers,
Tom