#11
In message , Jerry Stuckle writes:

> So why don't manufacturers design transmitters with 1 ohm output impedance, Rick?

They probably would - if they could (at least for some applications). That would then let you step up the TX output voltage with a transformer, so that you could drive more power into a higher (e.g. 50 ohm) load.

But of course, the overall output impedance would then become correspondingly higher, and you would be drawing correspondingly more current from the original 1 ohm source. If you used too high a step-up ratio, you would risk exceeding the permitted internal power dissipation (and other performance limits).

So yes, you do get more power output when you match* the source impedance to the load - but that doesn't mean you always can (or should) go the whole hog.

*Or, at least, partially match.

--
Ian
| Thread | Forum |
|---|---|
| Vertical Antenna Performance Question | Antenna |
| Antenna Question: Vertical Whip Vs. Type X | Scanner |
| Question about 20-meter monoband vertical (kinda long - antenna gurus welcome) | Antenna |
| Technical Vertical Antenna Question | Shortwave |
| Short STACKED Vertical {Tri-Band} BroomStick Antenna [Was: Wire ant question] | Shortwave |