From: Telamon
Newsgroups: rec.radio.shortwave
Date: March 3rd 08, 02:57 AM
Subject: Matching Coax Impedance: To Receiver or To Antenna ?

In article , "Robert11" wrote:

Hello,

Have a Scantenna antenna (receive only) in the attic. It came with 75 ohm
RG-6 coax, so presumably the output has a 75 ohm impedance. How this
varies with frequency, I have no idea.

Have a new scanner that says to use 50 ohm coax.

From old posts, the consensus seems to be that it doesn't matter whether
you use 50 or 75 ohm coax for a run like mine of about 50 feet.

Do you folks agree with this ?

From a somewhat more rigorous and theoretical view, should the coax,
even if it hardly matters, be matched to the antenna output impedance
or to the input impedance of the scanner? Why? (Again, receive only.)

Any thoughts on this would be most appreciated.


Chances are I will be the only person to answer you that it matters.
Everyone else is of the opinion that it doesn't. You will lose received
signal power with the mismatch, and the argument always revolves around
whether that loss is significant for your reception.
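To put a rough number on that mismatch: the standard reflection-coefficient
formula gives the power lost going from 75 ohm antenna/coax into a 50 ohm
input. A minimal Python sketch; the impedances are the ones from the post,
the formula is the usual mismatch-loss expression, and both impedances are
assumed purely resistive:

import math

def mismatch_loss_db(z_load, z_source):
    # Power lost (dB) across a purely resistive impedance step.
    gamma = abs(z_load - z_source) / (z_load + z_source)  # reflection coefficient
    return -10 * math.log10(1 - gamma ** 2)

# 75 ohm antenna/coax feeding a 50 ohm scanner input
print(round(mismatch_loss_db(75, 50), 2), "dB")  # about 0.18 dB

On those numbers it works out to roughly 0.18 dB, which is the figure the
argument usually turns on.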

Since you already have the coax, use it, and if the performance is
satisfactory then I would not worry about it. If the performance is not
good enough, then go buy the right impedance coax. For scanners at
higher frequencies the most important factor is loss per foot: you want
the lowest loss per foot in the frequency range you want to receive on.

For shortwave, in the 3 to 30 MHz range, the loss per foot is not as
significant. Coax loss per foot goes up with frequency.
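As a rough illustration of how that loss scales over a 50 foot run, here is
a Python sketch. The 2.0 dB per 100 feet at 100 MHz reference is an assumed
ballpark for RG-6 style cable, not a datasheet value, and the
square-root-of-frequency scaling is the usual approximation for
conductor-dominated loss:

import math

# Assumed ballpark reference for RG-6 style coax; check the actual datasheet.
REF_LOSS_DB_PER_100FT = 2.0   # dB per 100 ft
REF_FREQ_MHZ = 100.0

def run_loss_db(freq_mhz, length_ft):
    # Matched-line loss estimate: conductor loss scales roughly with sqrt(f).
    loss_per_100ft = REF_LOSS_DB_PER_100FT * math.sqrt(freq_mhz / REF_FREQ_MHZ)
    return loss_per_100ft * length_ft / 100.0

for f in (10, 30, 150, 450, 800):  # HF through typical scanner bands, in MHz
    print(f"{f:>3} MHz: {run_loss_db(f, 50):.2f} dB over 50 ft")

On those assumptions the 50 foot run loses only a few tenths of a dB at
shortwave frequencies but a couple of dB in the upper scanner bands, which
is why the loss-per-foot spec matters more up there than the 50 versus 75
ohm label.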

--
Telamon
Ventura, California