August 12th 03, 04:36 PM
Tom Bruhns
 

(Dr. Slick) wrote in message ...
(Tom Bruhns) wrote in message ...

This could be a whole 'nuther thread. For a reference, I don't have
any difficulty making a 50 ohm load with 40dB return loss out to a GHz
or so for less than $10, and most of that is the connector. Does
someone need to document how to do that? But I submit that if you
want accurate SWR measurements on a particular line, you should
calibrate your SWR meter to that line, and that doesn't take any
reference except the line itself.

Cheers,
Tom



But can you make a 50 Ohm dummy load with those specs that can handle
300 Watts? One that stays at 50 Ohms out to at least 200 Megs or so?


50 +/- what? What return loss are you shooting for in this 300W dummy
load? Do you really need 40dB, or is 30dB good enough? I believe
it's possible to bootstrap yourself into measurements that are far
more accurate than you'll need for what you are doing, and do it quite
economically if you don't count your time. But you ought to first ask
yourself just what accuracy you really need, and understand why.
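
Just as a sketch of what those numbers mean (assuming a 50 ohm
reference and purely resistive loads, with values picked only for
illustration), here's how return loss maps to SWR and load resistance:

import math

Z0 = 50.0  # reference impedance, ohms

def gamma_from_rl(rl_db):
    # Reflection-coefficient magnitude for a given return loss in dB.
    return 10.0 ** (-rl_db / 20.0)

def swr_from_gamma(g):
    return (1.0 + g) / (1.0 - g)

for rl in (40.0, 30.0, 20.0):
    g = gamma_from_rl(rl)
    swr = swr_from_gamma(g)
    # Purely resistive loads bracketing this reflection magnitude
    r_hi = Z0 * (1.0 + g) / (1.0 - g)
    r_lo = Z0 * (1.0 - g) / (1.0 + g)
    print(f"RL {rl:2.0f} dB: |gamma| = {g:.4f}  SWR = {swr:.3f}  "
          f"R between {r_lo:.1f} and {r_hi:.1f} ohms")

Roughly, 40 dB return loss is an SWR of about 1.02, 30 dB about 1.07,
and 20 dB about 1.22.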

Probably one can make a quite reasonable broadband power load, at
least to your 200MHz limit. There have been some good construction
articles on making tapered shields for power load resistors, for
example, to get good high frequency performance. But if you can make
just a good low-power one, you can bootstrap your way to accurate
measurements at high power. Use the low-power one to ensure your
directional coupler is good to some tolerance, and use that to tune
your load at whatever frequency you wish to check. Even your cantenna
should be low enough Q when tuned with an L network that it would be
acceptable over the whole of the 2-meter ham band. It's tedious, but
you can re-tune for any spot frequency inside or outside the ham band.
And given one accurate load, you can determine what impedance your
long, lossy coax is and then use that as a dummy load (probably quite
broadband). For example, if your cantenna is good enough through
30MHz, and pretty good at 54MHz, and under 2:1 SWR at 150MHz, then
perhaps enough RG-58 to give you 10dB loss at 150MHz (150 feet or so),
feeding that cantenna, would work fine from 1MHz to 1GHz. Just beware
of power dissipation in the line itself at higher frequencies. You
can even cascade large coax, small coax, and the cantenna, to ensure
power handling.
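
To put numbers on that lossy-line trick (a sketch only, using the 2:1
SWR and 10 dB line-loss figures above and ignoring the coax's own
impedance error): the reflected wave makes two trips through the line,
so the return loss you see at the input improves by twice the one-way
loss.

import math

def swr_to_rl_db(swr):
    # Return loss (dB) of a load with the given SWR.
    g = (swr - 1.0) / (swr + 1.0)
    return -20.0 * math.log10(g)

def rl_to_swr(rl_db):
    g = 10.0 ** (-rl_db / 20.0)
    return (1.0 + g) / (1.0 - g)

def rl_seen_through_line(rl_load_db, line_loss_db):
    # Reflected wave is attenuated on the way down and the way back.
    return rl_load_db + 2.0 * line_loss_db

# Cantenna at 2:1 SWR (about 9.5 dB return loss) seen through 10 dB of
# RG-58 loss at 150 MHz:
rl_load = swr_to_rl_db(2.0)
rl_in = rl_seen_through_line(rl_load, 10.0)
print(f"load RL {rl_load:.1f} dB -> input RL {rl_in:.1f} dB "
      f"(SWR about {rl_to_swr(rl_in):.2f})")

# The price: with 10 dB of one-way loss, about 90% of the power is
# dissipated in the coax rather than in the cantenna.
print("fraction of power dissipated in the line:",
      1.0 - 10.0 ** (-10.0 / 10.0))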

Beware of harmonics messing up your readings!

Also, you'd probably do well to consider how small a change in power
results from a modest load change, for various source impedances.
What's the worst case? What's the best? Is there a reason it might
be nice if an amplifier output was "reasonably close" to 50 ohms, or
doesn't it matter at all? If you think all this through, you may
realize that even if your dummy load has only 20 dB return loss, it will
be just fine for the measurements you need to make. But YOU should
convince YOURSELF of that, or of what you really do need.
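
Here's a sketch of that exercise, using a plain Thevenin model with a
fixed open-circuit voltage and a resistive source (load values picked
for illustration; a real amplifier is not such a tidy linear source):

import math

def delivered_power(r_load, r_source, v_source=1.0):
    # Power into a resistive load from a resistive Thevenin source.
    i = v_source / (r_source + r_load)
    return i * i * r_load

# Resistive loads spanning a 20 dB return-loss window around 50 ohms
loads = (40.9, 50.0, 61.1)

for r_src in (50.0, 5.0):
    p_ref = delivered_power(50.0, r_src)
    changes = ["%+.2f dB" % (10.0 * math.log10(
        delivered_power(r, r_src) / p_ref)) for r in loads]
    print(f"source {r_src:4.1f} ohms, power change vs 50 ohm load:",
          changes)

In that toy model a 50 ohm source barely notices the load change (a
few hundredths of a dB), while a source far from 50 ohms sees
something like +/- 0.7 dB.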

And I'm still not sure what you mean by "calibrate your SWR meter
to the line". All the SWR meters I have seen are for 50 Ohms.

Could you tell us the exact procedure?


Of course not; I know nothing about YOUR SWR meter. If you understand
how yours works, you should be able to see how to adjust it, though it
may not be worthwhile. To a close approximation, practically all of
them work by sampling the line current and voltage at a point. The
current is somehow turned into a voltage, and if you adjust either the
voltage sampling percentage or the voltage produced by a given
current, you will have adjusted the calibration impedance. At the
high frequency end, you may need to worry about reactive or
phase-shift effects. If it's not adjustable, don't you at least want
to know WHAT impedance it's calibrated for? Or at VERY least, what it
reads with a 50 ohm load?
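
To make that concrete, here's an idealized sketch of the V/I sampling
idea (a hypothetical perfect sampler, not any particular meter): the
meter in effect forms (V + Z*I)/2 and (V - Z*I)/2 from its samples, and
whatever Z falls out of the sampling ratios is the calibration
impedance.

def swr_reading(z_load, z_cal=50.0):
    # Idealized sampler: forward and reflected voltages referenced to
    # z_cal, returning the SWR the meter would indicate.
    i = 1.0              # assume 1 A of line current into the load
    v = z_load * i       # so the line voltage is just Z_load
    v_fwd = (v + z_cal * i) / 2.0
    v_ref = (v - z_cal * i) / 2.0
    rho = abs(v_ref) / abs(v_fwd)
    return (1.0 + rho) / (1.0 - rho)

# A 75 ohm load reads about 1.5:1 on a 50-ohm-calibrated sampler,
# and 1:1 if the sampling ratio is adjusted so z_cal = 75.
print(swr_reading(75.0, z_cal=50.0))
print(swr_reading(75.0, z_cal=75.0))

Change the sampling ratio and you've changed the calibration impedance;
that's all "calibrating to the line" amounts to here.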

Cheers,
Tom