January 22nd 05, 06:20 PM
Jim


"Fred Zalupski" none listed wrote in message
...
There is an interesting question in QST this month from a ham who wants
to put an antenna in the attic and re-use some existing TV coax. The
answer was that it was probably OK, depending on a few factors like the
feedpoint impedance of the antenna, the design and band coverage of the
antenna, use of a tuner, and transmitter power (cable TV coax may not be
suited to high power). I'd be more inclined to think that dB loss in coax
related to frequency is more critical than the 50-ohm vs. 75-ohm mismatch.
That is, I wouldn't use RG-58 for UHF applications, but would use RG-6 for
HF. This is a subject about which I am curious, so I'll be interested in
other responses, too.

Good DX!
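The point about frequency mattering more than the impedance mismatch can be made quantitative: conductor (skin-effect) loss in coax grows roughly with the square root of frequency, so a run that is negligible at HF can eat several dB at UHF. Here is a rough sketch of that scaling; the 1.6 dB/100 ft baseline figure is an illustrative assumption, not a published spec for any particular cable:

```python
import math

def scaled_loss_db(loss_db_ref, f_ref_mhz, f_mhz):
    """Estimate matched-line loss at f_mhz from a reference measurement,
    assuming loss scales as sqrt(frequency) (skin-effect dominated)."""
    return loss_db_ref * math.sqrt(f_mhz / f_ref_mhz)

base = 1.6  # assumed dB per 100 ft at 30 MHz -- hypothetical cable
for f in (3.5, 30.0, 146.0, 440.0):
    print(f"{f:6.1f} MHz: ~{scaled_loss_db(base, 30.0, f):.1f} dB/100 ft")
```

Real cables also have dielectric loss that grows linearly with frequency, so published attenuation tables (e.g. for RG-58 vs. RG-6) are the thing to check before committing to a run; the sqrt rule just shows why the UHF numbers are so much worse than the HF ones.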



Assuming the antenna has a 50-ohm feedpoint impedance, the use of 75-ohm
cable will cause a 1.5:1 SWR. That is an acceptable match; if you aren't
transmitting, you'll not even notice the mismatch. There was an article
several years back about using "end runs" of hardline from the cable
company.
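The 1.5:1 figure falls straight out of the reflection-coefficient formula: for a purely resistive load Z_L on a line of characteristic impedance Z_0, |Γ| = |Z_L − Z_0| / (Z_L + Z_0) and SWR = (1 + |Γ|) / (1 − |Γ|). A minimal sketch, assuming resistive impedances only:

```python
import math

def swr_from_mismatch(z_load, z0=50.0):
    """VSWR and mismatch loss for a resistive load z_load on a line of
    characteristic impedance z0 (reactance ignored)."""
    gamma = abs(z_load - z0) / (z_load + z0)       # reflection coefficient magnitude
    swr = (1 + gamma) / (1 - gamma)
    mismatch_loss_db = -10 * math.log10(1 - gamma ** 2)  # power lost to reflection
    return swr, mismatch_loss_db

swr, loss = swr_from_mismatch(75.0, 50.0)
print(f"SWR = {swr:.2f}:1, mismatch loss = {loss:.2f} dB")
# -> SWR = 1.50:1, mismatch loss = 0.18 dB
```

With |Γ| = 25/125 = 0.2, only 4% of the forward power is reflected, under a fifth of a dB, which is why the mismatch itself is a non-issue next to cable attenuation.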