November 25th 03, 12:58 AM
Dave Shrader
 

Ralph Mowery wrote:
I seem to recall reading about a 'standard'; it went something
like this: a two meter antenna at 100 feet can "see" or be useable
for 17 miles. I don't recall where I read this, but would really
appreciate any and all input on the question: How far can a
base two meter radio antenna transmit and receive so as to be
'useable' when the antenna is 100 feet above the earth, and the
surrounding area is fairly level (no hills or mountains)? I am
talking about a 50 watt base and a 50 watt mobile. If there is a formula
somewhere, I would appreciate the input.
The reason I ask is that on the way by some very tall TV antennas,
1,000 and 1,200 feet, I got to wondering: they don't work well with
"my formula" (17 miles = 100 feet); they (TV channels #2, #4, #5) are
out of 'gas' at about 70 miles? Hope you can 'plumb up
my brain'. Thanks in advance. cl.73



A rough rule of thumb is to take the square root of the antenna height in feet;
that gives you the distance in miles from that antenna to the horizon. You do this
again for the other antenna and add the two distances together. This can be
multiplied by about 1.2 to 1.3 for radio waves. For example, if the
transmitter antenna is 625 feet high and the receiving antenna is 16 feet
high, you get sqrt(625) = 25 miles and sqrt(16) = 4 miles. You add 25 + 4 =
29 miles for the visual distance, then multiply this by 1.3 to get 37.7
miles of radio range.
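A minimal sketch of that rule of thumb in Python (the function name, the 1.3 radio
factor, and the 6 foot mobile antenna height in the second example are illustrative
assumptions, not figures from the posts above):

import math

def radio_range_miles(tx_height_ft, rx_height_ft, radio_factor=1.3):
    # Rule of thumb from the post: the square root of each antenna
    # height in feet approximates the miles to the horizon from that
    # antenna; sum the two horizons for the visual path, then scale
    # by roughly 1.2 to 1.3 to account for radio-wave bending.
    visual_miles = math.sqrt(tx_height_ft) + math.sqrt(rx_height_ft)
    return visual_miles * radio_factor

# Worked example from the post: 625 ft transmitter, 16 ft receiver
print(radio_range_miles(625, 16))  # (25 + 4) * 1.3 = 37.7 miles

# The original question: 100 ft base antenna to an assumed 6 ft mobile whip
print(radio_range_miles(100, 6))   # about 16 miles

The second figure lands close to the ~17 mile 'standard' mentioned in the original
question.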


Yep!! That's the way to do it!!