From: Wimpie (PA3DJS)
Date: March 2nd 08, 08:16 PM, posted to rec.radio.amateur.antenna
Subject: Matching Coax Impedance: To Receiver or To Antenna?

On 2 Mar, 20:36, "Robert11" wrote:
Hello,

Have a Scantenna antenna (receive only) in the attic.
It came with 75 ohm RG-6 coax, so presumably the output has a 75 ohm
impedance. How this varies with frequency, I have no idea.

Have a new scanner that says to use 50 ohm coax.

From old posts, the consensus seems to be that it doesn't matter if you
use 50 or 75 ohm coax for a run of about 50 feet, which is what I have.

Do you folks agree with this ?

From a somewhat more rigorous and theoretical view, should the coax, even if
it hardly matters, be matched to the antenna output impedance or to the input
impedance of the scanner? Why? (Again, receive only.)

Any thoughts on this would be most appreciated.

Thanks,
Bob


Hello Bob,

For reception with a scanner, using 75 Ohm instead of 50 Ohm cable
will not make a big difference.
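
To put a rough number on "not a big difference": the worst-case mismatch
loss between 75 Ohm cable and a 50 Ohm input is only a fraction of a dB.
A minimal Python sketch (the function name and the purely resistive
assumption are mine, just for illustration):

import math

def mismatch_loss_db(z_load, z0):
    """Power lost to reflection when a line of impedance z0 feeds z_load.
    Assumes purely resistive impedances for simplicity."""
    gamma = abs(z_load - z0) / (z_load + z0)  # reflection coefficient magnitude
    return -10 * math.log10(1 - gamma ** 2)   # mismatch loss in dB

# 75 Ohm RG6 feeding a 50 Ohm scanner input:
print(round(mismatch_loss_db(50, 75), 2))  # ~0.18 dB -- negligible for reception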

Your Scantenna will also not have a 50 Ohm output impedance all over
the band. It will be higher, lower and/or reactive, depending on the
frequency.
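
With a complex (reactive) antenna impedance, the same reflection-coefficient
idea still applies. A small sketch; the 30 - 60j Ohm value below is an
invented illustration, not a measured Scantenna figure:

def vswr(z_load, z0=75.0):
    """VSWR of a (possibly complex) load impedance on a line with
    characteristic impedance z0."""
    gamma = abs((z_load - z0) / (z_load + z0))
    return (1 + gamma) / (1 - gamma)

# Invented example: an antenna that looks like 30 - 60j Ohm at some frequency
print(round(vswr(complex(30, -60)), 1))  # ~4.3:1 on a 75 Ohm line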

Your scanner will also not present 50 Ohms to the antenna over the whole
tuning range. It is even unlikely that your scanner reaches maximum
sensitivity when driven from a 50 Ohm source everywhere in that range.
There will be frequency ranges where maximum sensitivity (in terms of
input power, not voltage) is reached with a source impedance other than
50 Ohm.
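
That is because the source impedance that minimises a receiver's noise
figure (the "noise match") is generally not the impedance that maximises
power transfer. A sketch using the standard two-port noise-parameter
model; the parameter values (Fmin = 3 dB, Rn = 20 Ohm, Gamma_opt = 0.3)
are invented for illustration:

import math

def noise_figure_db(gamma_s, gamma_opt, f_min_db, r_n, z0=50.0):
    """Two-port noise model: noise figure versus the source reflection
    coefficient gamma_s; the minimum is reached at gamma_s == gamma_opt."""
    f_min = 10 ** (f_min_db / 10)
    f = f_min + (4 * r_n / z0) * abs(gamma_s - gamma_opt) ** 2 / (
        (1 - abs(gamma_s) ** 2) * abs(1 + gamma_opt) ** 2)
    return 10 * math.log10(f)

# A 50 Ohm source (gamma_s = 0) versus the optimum source for this front end:
print(round(noise_figure_db(0.0, 0.3, 3.0, 20.0), 2))  # ~3.18 dB from 50 Ohm
print(round(noise_figure_db(0.3, 0.3, 3.0, 20.0), 2))  # 3.00 dB at the noise match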

If you were to do rigorous measurements comparing your setup (that is,
with RG6) against a 50 Ohm setup, you would find frequency ranges where
your setup gives the better sensitivity and others where the 50 Ohm
setup does.

So Bob, don't worry about using 75 Ohm RG6 coaxial cable.

Best regards,

Wim
PA3DJS
www.tetech.nl (in Dutch).