March 30th 10, 11:43 PM, posted to rec.radio.amateur.antenna
From: Joel Koltner
Subject: Radiation patterns and loss of antennas operated well below resonance

If I take an antenna that's resonant at, say, a couple of GHz and operate it
well below that frequency (say, some hundreds of MHz), it's clear that for
something simple like a dipole, the radiation pattern is the usual "bagel"
shape that an "elemental" (infinitesimally short) dipole would give you.
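
To put rough numbers on that, here's a back-of-the-envelope sketch in Python.
It assumes a half-wave dipole cut for 2.4 GHz and the standard short-dipole
approximation Rr ~ 20*pi^2*(l/lambda)^2 (triangular current distribution),
which is reasonable once the element is electrically short:

import math

C = 3.0e8                                 # speed of light, m/s
f_design = 2.4e9                          # half-wave resonance
f_use = 435e6                             # 70 cm operating frequency

length = 0.5 * C / f_design               # physical length, ~6.2 cm
lam_use = C / f_use                       # ~69 cm at 435 MHz
l_over_lam = length / lam_use             # ~0.09 -> electrically short

r_rad = 20 * math.pi**2 * l_over_lam**2   # ~1.6 ohms

print(f"l/lambda at 435 MHz: {l_over_lam:.3f}")
print(f"approx. radiation resistance: {r_rad:.2f} ohms")

So the element is only about a tenth of a wavelength long at 435 MHz, the
pattern is essentially the elemental-dipole bagel, and the radiation
resistance is down around 1.6 ohms.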

But say I use something like a patch antenna that's designed for 2.4 GHz and
build enough of a matching network that it presents a 50-ohm impedance to, say,
a 70 cm transmitter. Does the radiation pattern change much? Will the antenna
become so lossy (radiation resistance rapidly heading towards zero) that this
isn't really a good idea in the first place? (I wouldn't be surprised if a
patch antenna doesn't actually radiate much at all outside of its own
resonances...) Or is it perhaps not possible to say, in general, what happens,
so that one needs to perform simulations on a case-by-case basis?
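
On the loss question, the part that worries me is the efficiency
Rr/(Rr + Rloss): the matching network can make the feedpoint look like
50 ohms regardless, but it can't recover whatever power is burned in the
loss resistance. Purely illustrative numbers (an assumed 1 ohm of combined
conductor and matching-network loss referred to the feedpoint, with a few
assumed radiation resistances):

import math

r_loss = 1.0   # ohms, assumed total loss resistance at the feedpoint

for r_rad in (50.0, 10.0, 1.6, 0.2):
    eta = r_rad / (r_rad + r_loss)   # radiation efficiency
    print(f"Rr = {r_rad:5.1f} ohm -> efficiency {eta*100:5.1f} % "
          f"({10 * math.log10(eta):.1f} dB)")

Once the radiation resistance falls to a fraction of an ohm, even a modest
loss resistance eats most of the power, never mind what the pattern does.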

I'm asking because there are a lot of pretty nice,
off-the-shelf antennas out there that were designed to be resonant (using,
e.g., quarter-wave dimensions) at some fairly high frequency (2.4 GHz being a
common one, of course), and I'm interested in how viable it is to use these
antennas for 2m/70cm amateur radio use.

Thanks for the input,
---Joel