Hello,
Have a Scantenna antenna (receive only) in attic. It came with 75 ohm RG 6 coax, so presumably the output has a 75 ohm impedance. How this varies with freq. I have no idea. Have a new scanner that says to use 50 ohm coax. From old posts, the consensus seems to be that it doesn't matter if you use 50 or 75 ohm coax for a run that I have of about 50 feet. Do you folks agree with this ? From a somewhat more rigorous and theoretical view, should the coax, even it hardly matters, be matched to the antenna output impedance, or the input impedance of the scanner ? Why ? (again, receive only) Any thoughts on this would be most appreciated. Thanks, Bob |
Similar Threads:

Thread | Forum
How much can the impedance of coax vary from its characteristic impedance? | Antenna
impedance matching | General
Antenna impedance matching - old Grundig tube receiver | Shortwave
For the Longwire {Random Wire} Antenna to Coax Cable "Connection" - - - Think 'Matching Transformer'! | Shortwave
SW Receiver - ext. antenna impedance? | Antenna