If I take an antenna that's resonant at, say, a couple of GHz, and operate it well below that frequency (say, some hundreds of MHz), it's clear that for something simple like a dipole, the radiation pattern stays the usual "bagel" shape that an "elemental" (infinitesimally short) dipole would give you.

But say I use something like a patch antenna designed for 2.4 GHz and build enough of a matching network that it presents a 50 ohm impedance to, say, a 70 cm transmitter. Does the radiation pattern change much? Will it become so lossy (radiation resistance rapidly heading toward zero) that this isn't really a good idea in the first place? (I wouldn't be surprised if a patch antenna actually doesn't radiate much at all outside of its own resonances...) Or is it perhaps not possible to say, in general, what happens, so one needs to run simulations on a case-by-case basis?

I'm asking because there are a lot of pretty nice off-the-shelf antennas out there that were designed to be resonant (using, e.g., quarter-wave dimensions) at some fairly high frequency (2.4 GHz being a common one, of course), and I'm interested in how viable it is to use these antennas for 2 m/70 cm amateur radio use.

Thanks for the input,
---Joel
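As a rough illustration of the radiation-resistance concern: for an electrically short dipole, the textbook approximation (triangular current distribution) is R_rad ≈ 20·π²·(l/λ)². This is only a sketch for the dipole case, not the patch, and the 3.1 cm element length (roughly a quarter-wave at 2.4 GHz) is my own assumed example:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def short_dipole_rad_resistance(length_m: float, freq_hz: float) -> float:
    """Approximate radiation resistance (ohms) of an electrically short
    dipole with triangular current distribution: R = 20 * pi^2 * (l/lambda)^2.
    Valid only for l << lambda."""
    lam = C / freq_hz
    return 20.0 * math.pi ** 2 * (length_m / lam) ** 2

# Hypothetical example: a ~3.1 cm element (about a quarter-wave at 2.4 GHz)
# driven on 70 cm (435 MHz). l/lambda is ~0.045, i.e. electrically short.
r = short_dipole_rad_resistance(0.031, 435e6)
print(f"R_rad ~ {r:.2f} ohm")  # fraction of an ohm
```

Even before any matching-network losses, a fraction-of-an-ohm radiation resistance means ohmic losses in the conductors and the matching components will dominate, which is the efficiency problem the question is getting at.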