March 2nd 08, 07:36 PM, posted to rec.radio.amateur.antenna
Robert11
Matching Coax Impedance: To Receiver or To Antenna?

Hello,

I have a Scantenna antenna (receive only) in the attic.
It came with 75-ohm RG-6 coax, so presumably its output impedance is
75 ohms. How that varies with frequency, I have no idea.

I have a new scanner whose documentation says to use 50-ohm coax.

From old posts, the consensus seems to be that for a run of about 50 feet
(which is what I have), it hardly matters whether you use 50-ohm or
75-ohm coax.

Do you folks agree with this?
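
For what it's worth, here is a quick back-of-the-envelope check of that
claim, a minimal Python sketch that assumes the scanner input really is a
resistive 50 ohms and the line a resistive 75 ohms (a real receiver front
end is rarely exactly 50 ohms, so treat the numbers as ballpark):

import math

Z_line = 75.0   # ohms: RG-6 characteristic impedance
Z_load = 50.0   # ohms: nominal scanner input (assumed purely resistive)

gamma = abs((Z_load - Z_line) / (Z_load + Z_line))   # reflection coefficient
vswr = (1 + gamma) / (1 - gamma)
mismatch_loss_db = -10 * math.log10(1 - gamma ** 2)  # power lost to the mismatch

print(f"|Gamma| = {gamma:.3f}, VSWR = {vswr:.2f}")
print(f"Mismatch loss = {mismatch_loss_db:.2f} dB")

That works out to |Gamma| = 0.2, a 1.5:1 VSWR, and only about 0.18 dB of
mismatch loss, which is small next to the ordinary attenuation of 50 feet
of RG-6 at scanner frequencies.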

From a somewhat more rigorous and theoretical view, should the coax, even
if it hardly matters, be matched to the antenna's output impedance or to
the input impedance of the scanner? And why? (Again, this is receive only.)

Any thoughts on this would be most appreciated.

Thanks,
Bob