Old August 10th 03, 11:39 PM
Joel Kolstad
 

Roy Lewallen wrote:
> Joel Kolstad wrote:
> > Is this really telling me that if I cut a receiving antenna in the form
> > of a half-wave dipole at 200 MHz, it'll only intercept a quarter as much
> > power as a half-wave dipole at 100 MHz?
>
> Yes. It's half as long, so it won't intercept as much.


Well that certainly seems like something of a raw deal!

I like that way of thinking about it intuitively (it's shorter), although I
believe that -- if you match it properly -- if I take that 200 MHz half-wave
dipole and operate it at 100 MHz (where it's effectively a quarter-wave-long
dipole) it'll actually still collect about the same power as the 100 MHz
half-wave dipole, since its directivity is still about the same. Looking at
it the other way, for short dipoles (up to ~0.2*lambda), the effective area
is ~0.119*lambda^2, and it doesn't really matter much whether the antenna is
0.1*lambda or 0.05*lambda -- although the latter antenna will be harder to
match efficiently, since it'll have a noticeably smaller radiation
resistance.
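The factor of four in Roy's answer falls straight out of the effective-aperture relation A_e = G*lambda^2/(4*pi). A quick sketch (the gain figure of 1.64 for a lossless half-wave dipole is an idealization I'm assuming; real antennas will differ a bit):

```python
import math

def effective_aperture(freq_hz, gain=1.64):
    """Effective aperture A_e = G * lambda^2 / (4*pi), in m^2.
    gain=1.64 is the directivity of an ideal lossless half-wave dipole."""
    c = 299_792_458.0          # speed of light, m/s
    lam = c / freq_hz          # wavelength, m
    return gain * lam**2 / (4 * math.pi)

a100 = effective_aperture(100e6)   # ~1.17 m^2
a200 = effective_aperture(200e6)   # ~0.29 m^2
# Halving the wavelength quarters the capture area:
print(a100 / a200)  # -> 4.0
```

Same gain, half the wavelength, a quarter of the capture area -- which is exactly the "quarter as much power" above.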

I suppose those remote-controlled aircraft they build with arrays of dipoles
and diodes on their bellies -- intercepting microwave radiation to power the
motor -- use microwaves only because they're so much easier to focus than,
e.g., VHF or lower frequencies?

> Thermal noise has a uniform frequency distribution, so over a given
> bandwidth, the noise power is the same at any frequency. You'll have
> a better S/N ratio with the 100 MHz dipole than the 200 MHz dipole.


OK, gotcha.
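Roy's point that the noise floor depends only on bandwidth is just kTB -- no frequency term appears. A quick sketch (T = 290 K is the usual reference temperature, assumed here):

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
T = 290.0          # assumed reference temperature, K

def noise_power_dbm(bandwidth_hz):
    """Thermal noise power kTB, expressed in dBm.
    No center-frequency term: the floor is the same at 100 MHz or 200 MHz."""
    return 10 * math.log10(k * T * bandwidth_hz / 1e-3)

print(round(noise_power_dbm(1.0), 1))    # per-Hz floor: -174.0 dBm
print(round(noise_power_dbm(10e3), 1))   # 10 kHz bandwidth: -134.0 dBm
```

So with equal noise at either frequency, whichever dipole captures more signal power wins on S/N.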

So... in summary... if the cell phone guys had just stuck with AMPS (FM
signaling) when they went from 900 MHz to 1.8 GHz, the SNR of the incoming
signal would have dropped by 6 dB, correct (assuming the antenna was still
the same electrical length)? Bummer...
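As a sanity check on the 6 dB figure (assuming, as above, that aperture scales as lambda^2 with gain held fixed, and that nothing else in the link changes):

```python
import math

# SNR penalty from aperture alone when moving 900 MHz -> 1.8 GHz,
# assuming equal antenna gain and equal noise bandwidth at both frequencies.
f1, f2 = 900e6, 1.8e9
aperture_ratio = (f2 / f1) ** 2           # A_e ~ lambda^2 ~ 1/f^2
delta_db = 10 * math.log10(aperture_ratio)
print(round(delta_db, 2))  # -> 6.02
```

Doubling the frequency quarters the aperture, and 10*log10(4) is that 6 dB.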

---Joel