In article ,
dave wrote:

> Robert11 wrote:
>> Hello,
>>
>> Have a Scantenna antenna (receive only) in the attic. It came with
>> 75 ohm RG-6 coax, so presumably the output has a 75 ohm impedance;
>> how this varies with frequency I have no idea. I have a new scanner
>> whose manual says to use 50 ohm coax.
>>
>> From old posts, the consensus seems to be that it doesn't matter
>> whether you use 50 or 75 ohm coax for a run like mine of about 50
>> feet. Do you folks agree with this?
>>
>> From a somewhat more rigorous and theoretical view, should the coax,
>> even if it hardly matters, be matched to the antenna output impedance
>> or to the input impedance of the scanner? Why? (Again, receive only.)
>>
>> Any thoughts on this would be most appreciated.
>>
>> Thanks,
>> Bob
>
> The mismatch is insignificant. About 1.3 dB of loss or something.

That is a simplistic view.

--
Telamon
Ventura, California
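For what it's worth, the size of the mismatch can be estimated with a back-of-the-envelope calculation. Treating the junction as a purely resistive 75-to-50 ohm step (ignoring cable attenuation and the antenna's actual, frequency-dependent impedance, which is an idealizing assumption), the reflection coefficient and resulting mismatch loss work out to:

```python
import math

def mismatch_loss_db(z1: float, z2: float) -> float:
    """Mismatch loss in dB across a purely resistive impedance step.

    gamma is the magnitude of the reflection coefficient; the loss is
    the fraction of power reflected rather than delivered to the load.
    """
    gamma = abs(z1 - z2) / (z1 + z2)
    return -10.0 * math.log10(1.0 - gamma ** 2)

# 75 ohm antenna/coax into a 50 ohm scanner input
gamma = abs(75.0 - 50.0) / (75.0 + 50.0)
print(f"reflection coefficient: {gamma:.2f}")            # 0.20
print(f"mismatch loss: {mismatch_loss_db(75.0, 50.0):.2f} dB")
```

By this simple model the single-step loss is roughly 0.18 dB, i.e. well under the oft-quoted 1.3 dB figure and negligible for receive-only use, which is consistent with the "insignificant" conclusion above.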