Old January 15th 06, 09:44 PM posted to rec.radio.amateur.antenna
Amos Keag
Default Homemade Antenna Tower

Tekmanx wrote:
Also, I heard 802.11g sucks outdoors. This true? And would you guys say
my 400mw radio is overkill for 4-10mile shot?


You seem to be getting too much information and not answers to your
questions.

400 milliwatts is plenty of power. The performance of your system will
be governed by two simple factors: how high the antennas are, and how
sensitive the receiver is.

For ten miles your antenna height above average ground should be close
to 100 feet. For seventeen miles you need an antenna height closer to
300 feet. I suspect both heights are excessive for your application.
Also, the cables connecting your transmitter to the antenna have losses,
so mounting the transmitter at the top of the tower would be
preferred. An alternative to the single 100-foot antenna would be two
70-foot antennas, one at the transmitter and one at the receiver.
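For anyone who wants to run the numbers themselves, the usual rule of thumb for the radio horizon over a 4/3-earth model is d ≈ 1.415 × √h (d in miles, h in feet), and a line-of-sight path can be as long as the sum of the two stations' horizons. A small sketch (the 6-foot handheld height is just an assumed example, not from the post; the post's figures are more conservative, which is sensible once you allow for terrain and Fresnel-zone clearance):

```python
import math

def radio_horizon_miles(height_ft):
    """Approximate radio horizon over smooth earth (4/3-earth model):
    d (miles) ~= 1.415 * sqrt(h in feet)."""
    return 1.415 * math.sqrt(height_ft)

def max_path_miles(tx_height_ft, rx_height_ft):
    """Longest line-of-sight path: the two horizons added together."""
    return radio_horizon_miles(tx_height_ft) + radio_horizon_miles(rx_height_ft)

# One 100 ft tower talking to a handheld at 6 ft (assumed height):
print(round(max_path_miles(100, 6), 1))   # about 17.6 miles
# 70 ft antennas at both ends:
print(round(max_path_miles(70, 70), 1))   # about 23.7 miles
```

Real paths come up shorter than these smooth-earth figures, which is why planning for extra height, as above, is wise.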

Receiver sensitivity is unknown. I regularly communicate 20+ miles with
500 milliwatts from a ham radio walkie-talkie. [The receiver is located
on top of a mountain.] Most communication-grade radios can receive a
signal as small as 0.00000000000002 watts [about one microvolt into 50
ohms]. So you can see why I say 400 milliwatts is plenty. The receiver
you use should have a moderately good sensitivity, 5 microvolts or better.
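To put those microvolt figures into the dBm units most radio spec sheets use, the conversion is just P = V²/R into the receiver's input impedance (50 ohms assumed here), then dBm = 10·log10(P/1 mW). A quick sketch:

```python
import math

def uv_to_watts(microvolts, impedance_ohms=50.0):
    """Power delivered by a given voltage across the receiver input.
    P = V^2 / R, with V in volts."""
    v = microvolts * 1e-6
    return v * v / impedance_ohms

def watts_to_dbm(watts):
    """Express power in dB relative to one milliwatt."""
    return 10.0 * math.log10(watts / 1e-3)

for uv in (1.0, 5.0):
    w = uv_to_watts(uv)
    print(f"{uv} uV -> {w:.1e} W ({watts_to_dbm(w):.0f} dBm)")
```

One microvolt works out to 2×10⁻¹⁴ W (about -107 dBm), and the 5-microvolt sensitivity mentioned above is about -93 dBm.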

Over relatively flat terrain, with very modest antennas [20 feet high], a
four-mile circuit should be possible. Longer paths will require spending
$$$.
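As a rough sanity check on why 400 milliwatts is plenty, here is a simple free-space link budget for the four-mile path. The 2.4 GHz frequency (802.11g) and the 6 dBi antenna gains are my assumptions, not from the post, and free space is a best case; real terrain will eat into the margin:

```python
import math

def fspl_db(distance_km, freq_ghz):
    """Free-space path loss:
    FSPL = 20*log10(d_km) + 20*log10(f_GHz) + 92.45 dB."""
    return 20.0 * math.log10(distance_km) + 20.0 * math.log10(freq_ghz) + 92.45

tx_dbm = 10.0 * math.log10(400)      # 400 mW is about +26 dBm
path_km = 4 * 1.609                  # four miles
loss = fspl_db(path_km, 2.4)         # assumed 2.4 GHz (802.11g)
ant_gain_db = 2 * 6                  # assumed 6 dBi antenna at each end
rx_dbm = tx_dbm - loss + ant_gain_db

print(f"Path loss {loss:.1f} dB, received signal {rx_dbm:.1f} dBm")
```

That lands around -78 dBm at the receiver, comfortably above the roughly -90 dBm sensitivity typical of 802.11g gear at its lower rates, so the power really isn't the limiting factor; clearing the terrain is.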