50 ohm or 75 ohm cable for dipole?
I was thinking (finally) about simple dipoles and such and have this
question:
I plan on constructing some simple dipoles. For fun, I have been modeling
them using EZNEC. It seems the "common" practice is to feed the antenna with
50 ohm cable. The antenna feed point impedance is usually ~70 ohms. Would it
make more sense to feed a dipole with 75 ohm TV coax to keep the SWR low on
the cable and have the mismatch at the radio? If I plan on using an antenna
tuner, would this setup be preferred over 50 ohm feed cable? Does it really
make any difference?
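For reference, here is the back-of-the-envelope SWR math I have been
using (a quick sketch assuming a purely resistive feed point; a real
dipole has some reactance off resonance, and the swr() helper below is
just illustrative):

    def swr(z_load, z0):
        # Magnitude of the reflection coefficient, then
        # SWR = (1 + |gamma|) / (1 - |gamma|).
        # Also accepts a complex z_load, e.g. 70+10j.
        gamma = abs((z_load - z0) / (z_load + z0))
        return (1 + gamma) / (1 - gamma)

    print(swr(70, 50))  # ~1.40:1 on 50 ohm coax
    print(swr(70, 75))  # ~1.07:1 on 75 ohm TV coax

So the modeled ~70 ohm feed point works out to roughly 1.4:1 on 50 ohm
line versus about 1.07:1 on 75 ohm line.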
Just Wondering,
Tom