March 17th 09, 05:16 PM, posted to rec.radio.amateur.antenna
Subject: Receiver antenna/amplifier matching question

I'm trying to figure something out here...

Assume that I have a 2m antenna that (through whatever means) happens to have
an impedance in the ballpark of 50 ohms. Also assume that this antenna will
generate 1 uV open-circuit.

Assume that I have a MMIC amplifier that has FET inputs and therefore has a
high input impedance... let's say it's 500 ohms.

If I directly connect the antenna to the amplifier, the amplifier input
effectively sees ~0.909 uV (the 500/(500+50) voltage divider acting on the
1 uV EMF) and then amplifies it.
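
(Here's a quick Python sketch to check that divider arithmetic, using the
assumed 50 ohm / 500 ohm figures from above:)

    Rs  = 50.0     # antenna source impedance, ohms (assumed above)
    Rin = 500.0    # assumed MMIC input impedance, ohms
    Voc = 1e-6     # 1 uV open-circuit EMF

    Vdirect = Voc * Rin / (Rs + Rin)
    print("direct: %.3f uV" % (Vdirect * 1e6))   # -> 0.909 uV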

If I build a matching network to make the amplifier "appear" as 50 ohms, at the
input of the matching network I'll have 0.5 uV, but since power has to be
conserved (my matching network is assumed to be lossless),
(0.5 uV)^2/50 = Vamp^2/500, which gives Vamp = 1.58 uV. This is a gain of
20*log10(1.58/0.909) = 4.8 dB over a direct connection.
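
Carrying the same assumed numbers through the lossless-network step, as a
Python sketch:

    import math

    Rs, Rin, Voc = 50.0, 500.0, 1e-6     # same assumed values as above
    Vdirect = Voc * Rin / (Rs + Rin)     # 0.909 uV, direct connection
    Vin = Voc / 2                        # 0.5 uV into the matched 50-ohm input
    Vamp = Vin * math.sqrt(Rin / Rs)     # Vin^2/Rs = Vamp^2/Rin (lossless)
    print("matched: %.2f uV, %.1f dB over direct"
          % (Vamp * 1e6, 20 * math.log10(Vamp / Vdirect)))   # 1.58 uV, 4.8 dB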

Is that all correct?

Now let's go down to HF. My antenna is still 50 ohms, but at such low
frequencies parasitics aren't that bad, so I just use a JFET or MOSFET and
have a 5 kohm input impedance. In this case, without the matching network, the
amplifier sees 0.99 uV. With the matching network, it sees 5 uV (!), a gain of
14.1 dB over a direct connection.
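
Same sketch again with the assumed 5 kohm FET input:

    import math

    Rs, Rin, Voc = 50.0, 5000.0, 1e-6       # 5 kohm input at HF (assumed)
    Vdirect = Voc * Rin / (Rs + Rin)        # ~0.990 uV direct
    Vamp = (Voc / 2) * math.sqrt(Rin / Rs)  # 5.0 uV with a lossless match
    print("HF: direct %.3f uV, matched %.1f uV, +%.1f dB"
          % (Vdirect * 1e6, Vamp * 1e6, 20 * math.log10(Vamp / Vdirect)))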

Yet many designs for HF/shortwave antenna amplifiers don't bother with a
matching network, just feeding the antenna directly into a FET. So why is
this? Is the extra gain just not needed on HF? Or is a matching network too
unwieldy to build when you're trying to cover everything from 1 MHz to 30 MHz?

Thanks for the help,
---Joel Koltner