#1
In message , Jerry Stuckle writes:
> So why don't manufacturers design transmitters with 1 ohm output
> impedance, Rick?

They probably would - if they could (at least for some applications). That would then enable you to step up the TX output voltage (using a transformer), so that you could drive more power into a higher (eg 50 ohm) load.

But of course, the overall output impedance would then become correspondingly higher. You would also be drawing correspondingly more current from the original 1 ohm source, and if you used too high a step-up, you would risk exceeding the permitted internal power dissipation (and other performance parameters).

So yes, you are getting more power output when you match* the source impedance to the load - but it doesn't necessarily mean you always can (or should) go the whole hog.

*Or, at least, partially match.
--
Ian
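[Editor's note: Ian's point, that matching the source impedance maximises the power drawn from a *given* source, can be sketched numerically. The values below (a 10 V source EMF, the thread's 1 ohm source impedance) are assumptions for illustration only, not figures from any post.]

```python
# Sketch (assumed values): power delivered to a load R_L from a fixed
# voltage source V with fixed internal impedance R_S.
# P_load = V^2 * R_L / (R_S + R_L)^2, which peaks when R_L == R_S.

V = 10.0    # source EMF in volts (assumed for illustration)
R_S = 1.0   # source impedance in ohms, as in the thread's example

def p_load(R_L):
    """Power in watts dissipated in a load resistance R_L."""
    return V**2 * R_L / (R_S + R_L)**2

# Sweep load values around the matched point.
loads = [0.25, 0.5, 1.0, 2.0, 4.0, 50.0]
powers = {R_L: p_load(R_L) for R_L in loads}

best = max(powers, key=powers.get)
print(f"Maximum power at R_L = {best} ohm: {powers[best]:.2f} W")
# A 50 ohm load hung straight on the 1 ohm source gets far less,
# which is why Ian's hypothetical transformer step-up helps.
print(f"P into 50 ohm without matching: {powers[50.0]:.2f} W")
```

With these assumed values the matched 1 ohm load takes 25 W, while an unmatched 50 ohm load takes under 2 W; a 1:50 impedance step-up transformer would present the 50 ohm load to the source as 1 ohm, recovering the matched figure, at the cost Ian describes.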
#2
Ian Jackson wrote:
> In message , Jerry Stuckle writes:
>> So why don't manufacturers design transmitters with 1 ohm output
>> impedance, Rick?
>
> They probably would - if they could (at least for some applications).
> That would then enable you to step up the TX output voltage (using a
> transformer), so that you could drive more power into a higher (eg 50
> ohm) load.
>
> But of course, the overall output impedance would then become
> correspondingly higher. You would also be drawing correspondingly more
> current from the original 1 ohm source, and if you used too high a
> step-up, you would risk exceeding the permitted internal power
> dissipation (and other performance parameters).
>
> So yes, you are getting more power output when you match* the source
> impedance to the load - but it doesn't necessarily mean you always can
> (or should) go the whole hog.
>
> *Or, at least, partially match.

No, honestly, you're not getting more power output when you match the load to the source. *If* you have a *given* voltage generator with a *given* source impedance, then yes: that situation arises, for instance, if you have a very low noise amplifier with given output characteristics and you want to extract the maximum signal power in order to maintain the best noise factor through stages of amplification.

But when you are designing a PA, you start with a pile of components (or a catalogue of same), you choose your voltage swing and current capacity to put as much power into the load as you want (limited largely by the heat dissipation of the output devices in a practical circuit), and you design the circuit to dissipate as little power in the amplifier as you can. You are *not* interested in transferring as much power as you can from a given circuit. It may be only a tenth of the power output that you could get (ignoring practical dissipation limits) from a certain voltage with a different load, or a higher voltage with the same load (which you could achieve with a transformer), but that is irrelevant.

Among other things, you are likely to end up with a low source impedance compared with the load, and that makes no difference to the operation of the transmission line or aerial.
--
Roger Hayter
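[Editor's note: Roger's distinction between maximum power *transfer* and PA *design* shows up as efficiency in the simple Thevenin model. The sketch below uses assumed resistor values and the textbook source-resistance model, not figures from the thread.]

```python
# Sketch: in the simple Thevenin model, the fraction of total dissipated
# power that reaches the load is R_L / (R_S + R_L).  A matched source
# (R_S == R_L) wastes half the power inside the amplifier; a PA designer
# instead wants R_S << R_L so nearly all the power ends up in the load.

def load_fraction(R_S, R_L):
    """Fraction of the total dissipated power that reaches the load."""
    return R_L / (R_S + R_L)

# Matched case: only half the power leaves the amplifier.
print(load_fraction(50.0, 50.0))
# Low source impedance into the same 50 ohm load: nearly all of it does.
print(load_fraction(1.0, 50.0))
```

This is why a matched PA can never exceed 50% efficiency in this model, and why, as Roger says, the designer deliberately leaves the source unmatched to the load.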
#3
On 7/7/2015 6:17 AM, Ian Jackson wrote:
> In message , Jerry Stuckle writes:
>> So why don't manufacturers design transmitters with 1 ohm output
>> impedance, Rick?
>
> They probably would - if they could (at least for some applications).
> That would then enable you to step up the TX output voltage (using a
> transformer), so that you could drive more power into a higher (eg 50
> ohm) load.

Oh, it's completely possible. It's just a matching network, anyway - one which has to be in place anyway, because the output of a tube amp is relatively high impedance, and the output of a transistor amp is relatively low impedance. In fact, a 144 W transistor amp running on 12 V wouldn't even need a matching network. Its output would have a 1 ohm impedance.

> But of course, the overall output impedance would then become
> correspondingly higher. You would also be drawing correspondingly more
> current from the original 1 ohm source, and if you used too high a
> step-up, you would risk exceeding the permitted internal power
> dissipation (and other performance parameters).

And why would the output impedance change just because the load impedance changes? They are two separate things. But according to you, no step-up is required - you can drive any (comparatively) high impedance load most efficiently from a low impedance.

> So yes, you are getting more power output when you match* the source
> impedance to the load - but it doesn't necessarily mean you always can
> (or should) go the whole hog.
>
> *Or, at least, partially match.

So, which is it? First you say the output impedance of the transmitter should be very low for maximum power to the antenna. Now you say it should be matched. Then you say it shouldn't be matched. Which is it?
--
==================
Remove the "x" from my email address
Jerry, AI0K
==================
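[Editor's note: Jerry's "144 W at 12 V implies 1 ohm" figure is the back-of-the-envelope arithmetic P = V²/R with the full supply voltage taken as the swing across the load. A real transistor PA's load-line impedance also depends on the class of operation and the RMS swing, which this sketch ignores.]

```python
# Sketch of the arithmetic behind the 144 W / 12 V example:
# if the full 12 V appears across the load, P = V^2 / R,
# so the implied load impedance is R = V^2 / P.

V = 12.0    # supply voltage from the post, in volts
P = 144.0   # output power from the post, in watts

R = V**2 / P
print(f"Implied load impedance: {R} ohm")  # 1.0 ohm
```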