Short antenna = reduced power
"gareth" wrote in news:m1g1n8$39o$1@dont-
email.me:
(2*PI*L/LAMBDA)**2
where L is the antenna length and LAMBDA the wavelength,
thereby showing that the radiated power decreases when the
antenna length decreases.
Ok, but again, doesn't this just mean the system as a whole, as in taking into
account how it's fed? I'm not up to the maths of it, I'm just imagining a kind
of logical extreme where you have a tiddly stub of wire sticking out of the end
of a coax instead of a 9m tall vertical whip. It seems obvious to me that to
get the same efficiency and the same power you need a vastly increased energy
density, so even without the maths I have no problem seeing the relevance of
comments like Jim's (Jeff's?) allusion to room-temperature superconductors and
such. In other words, any actual reduction comes from practical limits, not
from the theory itself.
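To put rough numbers on that extreme (a back-of-envelope sketch only: I'm
assuming a short monopole over decent ground, the usual factor of 10 in front
of the quoted (2*PI*L/LAMBDA)**2 proportionality for its radiation resistance,
80m for the wavelength and a 5 ohm loss resistance for coil plus ground, all
figures mine rather than anything from the thread):

from math import pi, sqrt

def short_monopole(h, wavelength, r_loss=5.0):
    # Radiation resistance of a short monopole, roughly 10*(2*pi*h/lambda)**2
    # ohms, and the fraction of the feed power that actually gets radiated.
    r_rad = 10.0 * (2.0 * pi * h / wavelength) ** 2
    efficiency = r_rad / (r_rad + r_loss)
    return r_rad, efficiency

wavelength = 80.0                    # metres, roughly the 80m band
for h in (9.0, 0.5):                 # tall whip vs. tiddly stub
    r_rad, eff = short_monopole(h, wavelength)
    print(f"h = {h:3.1f} m  R_rad = {r_rad:7.4f} ohm  efficiency = {100 * eff:4.1f} %")

# Same radiated power from the stub needs sqrt(R_rad ratio) times the current:
r_whip, _ = short_monopole(9.0, wavelength)
r_stub, _ = short_monopole(0.5, wavelength)
print(f"current ratio for equal radiated power: about {sqrt(r_whip / r_stub):.0f}x")

That comes out at about 5 ohms and 50% for the 9m whip against about 0.015 ohms
and 0.3% for the half-metre stub, with something like 18 times the current
needed in the stub for the same radiated power -- which is exactly where the
superconductor comments come in.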
It's not so different with laser diodes, in that a diffraction-limited spot can
easily be obtained with a simple aspheric lens from any size of aperture so
long as it's a single-lateral-mode emitter, but try actually MAKING an emitter
that size. Theory says sure, no problem; energy density and the nature of the
materials say otherwise.
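The same arithmetic applied to the diode facet shows why (another sketch, with
the stripe sizes and the damage figure being my own round numbers, not anything
quoted here):

def facet_density_mw_per_cm2(power_w, width_um, height_um):
    # Power density at the output facet, in MW/cm^2.
    area_cm2 = (width_um * 1e-4) * (height_um * 1e-4)
    return power_w / area_cm2 / 1e6

for label, width_um in (("single-mode 3 um stripe", 3.0), ("broad-area 100 um stripe", 100.0)):
    density = facet_density_mw_per_cm2(1.0, width_um, 1.0)   # 1 W out, ~1 um mode height
    print(f"{label}: about {density:.0f} MW/cm^2 at the facet")

With catastrophic facet damage setting in somewhere around the 10 MW/cm^2 mark
(again a rough figure of mine), the single-mode stripe is already past it at a
watt while the broad-area one is nowhere near -- theory happy, materials not.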