#1
November 8th 05, 11:19 PM
Doug McLaren

Question about free space loss ...

I've got what's probably a pretty basic question about free space
loss. I suspect I already know the answer, but I'd like to verify it.

The general formula for free space loss is given as --

36.6 + 20 log10(D) + 20 log10(F), where D is the distance in
miles and F is the frequency in MHz.

Obviously the units don't matter so much -- if you were to change the
units, you'd just change the 36.6 value to something else. No problem.
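To make that concrete, here's a quick sketch of the formula in Python, plus a second version in km/GHz units to show that only the constant changes (92.45 is the usual constant for km and GHz):

```python
import math

def fsl_db(d_miles, f_mhz):
    """Free-space loss in dB: 36.6 + 20*log10(D) + 20*log10(F)."""
    return 36.6 + 20 * math.log10(d_miles) + 20 * math.log10(f_mhz)

def fsl_db_metric(d_km, f_ghz):
    """Same formula in km/GHz units -- only the constant changes."""
    return 92.45 + 20 * math.log10(d_km) + 20 * math.log10(f_ghz)

print(round(fsl_db(1, 50), 1))                  # ~70.6 dB
print(round(fsl_db_metric(1.609344, 0.05), 1))  # ~70.6 dB as well
```

(1 mile = 1.609344 km and 50 MHz = 0.05 GHz, so the two answers agree to within the rounding in the constants.)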

The distance component is obvious -- the power is spreading out.

But what's the frequency component about? Why would increasing the
frequency cause less power to be radiated?

My guess is that the frequency factor is _strictly_ there to
account for the effective area/aperture of the receiving antenna,
which shrinks as the frequency increases?

As a more real-world example: if you had a 10 foot diameter
satellite dish aimed directly at a satellite broadcasting 10 watts
in all directions at 2 GHz (let's assume an isotropic antenna
there), it would receive (ignoring any losses due to the
atmosphere) the same power as if the satellite were broadcasting
10 watts at 200 GHz?

(Our 10 foot dish is most definitely not an isotropic antenna.)
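That intuition is easy to check numerically: for a dish of fixed physical area aimed at an isotropic source, the captured power is flux times area, and frequency never enters. A sketch (the GEO distance and a perfect aperture are assumptions for illustration, not a real link budget):

```python
import math

def received_w(p_tx_w, d_m, dish_diam_m):
    """Power captured by an ideal fixed-area dish from an isotropic source."""
    area = math.pi * (dish_diam_m / 2) ** 2   # physical aperture, m^2
    flux = p_tx_w / (4 * math.pi * d_m ** 2)  # W/m^2 at the dish
    return flux * area                        # no frequency term at all

d = 35_786e3        # GEO altitude in metres (assumed for illustration)
dish = 10 * 0.3048  # 10 ft in metres
print(received_w(10, d, dish))  # same answer at 2 GHz or 200 GHz
```

Since `received_w` never sees a frequency, the 2 GHz and 200 GHz cases give identical captured power, which is exactly the claim above.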

As a reference, I'm looking at
http://people.deas.harvard.edu/~jone...ropagation.htm
and I'm just verifying my conclusions ...

To apply what I think I've learned to the problem I'm actually
trying to solve: if you've got something that transmits a certain
amount of data per second at 50 MHz to a receiver one mile away,
and you wanted to switch to a system that works at 5 GHz with the
same bandwidth, blindly applying the FSL formula says you'd need
40 dB more power -- 10,000 times as much.

(Let's assume both antennas are half-wavelength dipoles of the
appropriate length for 50 MHz and 5 GHz respectively -- not
isotropic, but relatively close. My frequencies are arbitrary
here, chosen just to have a nice ratio of 100.)

But really, since the antenna receives less energy (because of its
smaller size, it intercepts less of the signal), the background
noise received through it will be similarly reduced (let's also
assume that the background noise intensity is the same at the two
frequencies). So it's just a matter of using a more sensitive
receiver, at least until you reach the point where the limiting
factor is no longer the background noise but the noise generated
by the receiver itself? Assuming all my assumptions are correct
(and there are a lot of them, I know), there's little need to
increase the transmitted power at all -- just increase the gain in
the receiver's amplifier?
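The floor that eventually stops the "just add receiver gain" trick is the receiver's own thermal noise, kTB (plus its noise figure), which does not shrink with the antenna. A quick sketch of that floor, with the 1 MHz bandwidth being an assumed value just for illustration:

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K
T = 290           # standard reference temperature, K
B = 1e6           # assumed 1 MHz bandwidth, same at both frequencies

# Thermal noise power in the receiver, converted to dBm:
noise_floor_dbm = 10 * math.log10(k * T * B * 1000)
print(round(noise_floor_dbm, 1))  # -114.0
```

That is the familiar -174 dBm/Hz floor plus 10*log10(B); once the signal arriving from the smaller antenna approaches this level (plus the receiver's noise figure), extra amplifier gain amplifies signal and receiver noise alike and buys nothing.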

Am I missing something?

AD5RH

--
Doug McLaren, One dollar, one vote.
#2
November 9th 05, 02:09 AM
Tom

Question about free space loss ...

Yes - the common path loss formula includes the antenna gain.

The gain of an antenna is defined as 4*PI*Ae/lambda^2.

The effective aperture (Ae) of an antenna has units of square
meters, and the units of lambda^2 are square meters, so the
wavelength dependence just normalizes the area in terms of the
wavelength. Otherwise the definition of antenna gain would change
with frequency: a dipole's effective aperture shrinks by a factor
of four every time the frequency is doubled, but that's not how we
define antenna gain. We define the gain of a dipole to be
independent of frequency.

Since path loss depends on antenna gain, the 1/lambda^2 term
carries through; with lambda = c/f, 1/lambda^2 becomes f^2/c^2,
and that is where the frequency dependence of path loss comes from.
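Put another way, the path loss between two isotropic antennas is (4*pi*d/lambda)^2, and in dB that reproduces the rule-of-thumb formula from the first post. A quick numerical check:

```python
import math

c = 299_792_458.0  # speed of light, m/s

def path_loss_db(d_m, f_hz):
    """(4*pi*d/lambda)^2 between isotropic antennas, in dB."""
    lam = c / f_hz
    return 20 * math.log10(4 * math.pi * d_m / lam)

# 1 mile at 50 MHz, versus 36.6 + 20*log10(D) + 20*log10(F):
print(round(path_loss_db(1609.344, 50e6), 2))        # ~70.56 dB
print(round(36.6 + 20 * math.log10(50), 2))          # ~70.58 dB
```

The two agree to within the rounding baked into the 36.6 constant, which is just 20*log10(4*pi/c) re-expressed for miles and MHz.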

-- Tom








