Posted to rec.radio.shortwave, March 3rd 08, 02:58 AM
From: Telamon
Subject: Matching Coax Impedance: To Receiver or To Antenna ?

In article ,
dave wrote:

Robert11 wrote:
Hello,

Have a Scantenna antenna (receive only) in attic.
It came with 75 ohm RG-6 coax, so presumably the antenna output has a 75 ohm
impedance. How this varies with frequency, I have no idea.

Have a new scanner that says to use 50 ohm coax.

From old posts, the consensus seems to be that it doesn't matter whether you
use 50 or 75 ohm coax for a run of the roughly 50 feet that I have.

Do you folks agree with this ?

From a somewhat more rigorous and theoretical view, should the coax, even if
it hardly matters, be matched to the antenna's output impedance or to the
scanner's input impedance? Why? (Again, receive only.)

Any thoughts on this would be most appreciated.

Thanks,
Bob


The mismatch is insignificant. The 75-to-50 ohm step itself costs only about
0.2 dB; even with the attenuation of 50 feet of RG-6 added in, the total loss
stays around a dB at scanner frequencies.
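
[A minimal sketch of the mismatch arithmetic, in Python. It assumes purely
resistive impedances at a single impedance step; the function name is
illustrative, not from any library.]

import math

def mismatch_loss_db(z_load, z_line):
    # Magnitude of the reflection coefficient at the impedance step.
    gamma = abs(z_load - z_line) / (z_load + z_line)
    # Power lost relative to a perfect match, in dB.
    return -10.0 * math.log10(1.0 - gamma ** 2)

# 75 ohm line feeding a 50 ohm receiver input:
print(f"{mismatch_loss_db(50.0, 75.0):.2f} dB")  # about 0.18 dB

[With gamma = (75 - 50) / (75 + 50) = 0.2, only 4% of the power is
reflected, which is where the roughly 0.2 dB figure comes from.]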


That is a simplistic view. The antenna's feed-point impedance is not a flat
75 ohms; it varies with frequency, so the actual mismatch, and the loss that
goes with it, varies too.
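
[A minimal sketch of that point, again in Python. The feed-point impedances
below are hypothetical placeholders, since the thread gives no measured
values for the Scantenna; they only illustrate how the reflection
coefficient, SWR, and mismatch loss on a 75 ohm line change as the antenna
impedance changes with frequency.]

import math

Z_LINE = 75.0  # characteristic impedance of the RG-6 run

# Hypothetical feed-point impedances (ohms) at a few frequencies.
antenna_z = {
    "30 MHz":  complex(35, -60),
    "150 MHz": complex(72, 5),
    "450 MHz": complex(120, 40),
}

for freq, z in antenna_z.items():
    gamma = abs((z - Z_LINE) / (z + Z_LINE))     # |reflection coefficient|
    swr = (1.0 + gamma) / (1.0 - gamma)          # standing wave ratio
    loss = -10.0 * math.log10(1.0 - gamma ** 2)  # mismatch loss, dB
    print(f"{freq}: SWR {swr:.1f}:1, mismatch loss {loss:.2f} dB")

[On receive, even a few dB of mismatch loss usually matters less than the
ambient noise floor, which is the usual justification for the "it doesn't
matter" consensus.]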

--
Telamon
Ventura, California