January 12th 06, 03:30 AM, posted to rec.radio.amateur.antenna
Bill Turner
 
Dipoles and the rig's RF ground...


"Roy Lewallen" wrote in message
...
I disagree.

I'd venture to guess that the average output power of a 100 watt PEP
sideband rig is no more than about 10 watts unless heavy compression is
being used. And I think you'll find that a 10 watt resistor inside a
typical tuner cabinet -- representing loss of all the transmitted power --
will make a barely noticeable difference in the cabinet temperature.


Well, for heaven's sake, of course! Who would use SSB to make a test like
this? Put it on CW and put a brick on the key. Come back after dinner. Then
you'll know.
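
For a rough sense of the numbers, here's a minimal sketch (Python). The 10%
speech duty factor is the estimate from the quoted text above; the 20% tuner
loss is purely an illustrative assumption, not a measured figure.

# Rough comparison of how much heat ends up in the tuner on SSB voice
# versus key-down CW. The 10% speech duty factor is the estimate from the
# quoted text; the 20% tuner loss is an assumed illustrative value.

PEP_WATTS = 100.0      # transmitter peak envelope power
SPEECH_DUTY = 0.10     # assumed average-to-PEP ratio for uncompressed SSB
TUNER_LOSS = 0.20      # assumed fraction of the power dissipated in the tuner

ssb_heat = PEP_WATTS * SPEECH_DUTY * TUNER_LOSS   # ~2 W of heat on SSB
cw_heat = PEP_WATTS * TUNER_LOSS                  # ~20 W of heat, key down

print(f"SSB voice:   about {ssb_heat:.0f} W warming the tuner")
print(f"CW key-down: about {cw_heat:.0f} W warming the tuner")

Roughly ten times the steady heat input, key down, is what makes the
"come back after dinner" check workable where casual SSB operating is not.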

> Add to that the fact that the thermal time constant of the tuner is
> probably longer than the typical transmitting session, so the power needs
> to be averaged over the receiving periods, too.


Average, shmaverage. 100% duty cycle during the test.


> If you're running a kW, you're up to 100 watts or so during transmit only.
> But can you tell the difference with your "calibrated" hand between 1 dB
> loss in the tuner (25 watts), 3 dB loss in the tuner (50 watts), or 100%
> of the power lost in the tuner (100 watts)?


Between 25, 50 and 100? You betcha, although the calibration of the hand might
have to be checked.
Remember, we're not talking NIST here, just an idea of what's happening.
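
For reference, the wattages in the quoted paragraph follow directly from the
insertion loss in decibels. A minimal sketch of that arithmetic, using the
100 W average-drive figure quoted above:

# Heat dissipated in a tuner for a given insertion loss in dB.
# The 100 W input is the average drive quoted above for a kW SSB station.

def watts_dissipated(p_in: float, loss_db: float) -> float:
    """Power converted to heat for a given insertion loss."""
    return p_in * (1.0 - 10.0 ** (-loss_db / 10.0))

for loss_db in (1.0, 3.0):
    heat = watts_dissipated(100.0, loss_db)
    print(f"{loss_db:.0f} dB loss of 100 W -> about {heat:.0f} W of heat")

# 1 dB works out to roughly 21 W (rounded to 25 in the post), 3 dB to 50 W,
# and losing all of it is of course the full 100 W.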

> I'd love to see the results of a double-blind study where a measured tuner
> loss is compared with an estimate made using Bill's method. But I think
> the chances of that are about the same as finding a double-blind study of
> the audio quality enhancement of $1000 speaker cables.


Bill's method will be less accurate, more cost-effective, and good enough.
:-)

Bill, W6WRT