Tio Pedro wrote:
I've noticed a lot of the early designs from the late 20s
and early 30s used cathode bias (resistors to B- off the
directly heated filaments) on triode RF power
amplifiers. Were they adding a small amount of
bias to make them easier to drive? Or, for what reason?
One other thing, I don't remember seeing parasitic
suppressors on early rigs; did the need become
evident when TV became popular in the late
40s? I know those early TXs could take off in
the nether regions.
Pete
Cathode bias resistors on RF power amps were a safety measure.
With grid-leak bias alone, a tube that lost drive had no bias at all
and could draw enough plate current to MELT the plate, especially if
run with a plate voltage near (or OVER!) the maximum ratings. Of
course, using a C- supply would serve the same purpose, and many rigs
actually used batteries. Since the grid current flowed in the reverse
direction through the battery, a C battery would actually be RECHARGED
in normal use, so they tended to last a long time.
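As a rough illustration of the idea (not from the original post), sizing a
cathode bias resistor is just Ohm's law: all the cathode current (plate plus
grid) flows through the resistor, and the voltage it develops biases the grid
negative with respect to the filament. The operating point below is an assumed,
purely illustrative example, not data for any particular tube:

```python
# Hypothetical worked example: sizing a cathode bias resistor for a
# directly heated triode RF amplifier. Numbers are assumptions chosen
# for illustration, not ratings of any real tube.
plate_current_a = 0.085   # normal key-down plate current, amps (assumed)
grid_current_a = 0.015    # DC grid current, amps (assumed)
target_bias_v = 40.0      # desired grid-to-filament bias, volts (assumed)

# The full cathode current flows through the resistor, so the bias
# voltage across it fixes the resistance and its dissipation.
cathode_current_a = plate_current_a + grid_current_a
r_k = target_bias_v / cathode_current_a          # ohms
p_k = target_bias_v * cathode_current_a          # watts dissipated

print(f"R_k = {r_k:.0f} ohms, dissipating {p_k:.1f} W")
# -> R_k = 400 ohms, dissipating 4.0 W
```

The safety property falls out of the same arithmetic: if drive disappears and
plate current tries to rise, the voltage across the resistor rises with it,
pushing the grid more negative and limiting the current.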
Parasitic suppressors were not used in the early days: with no one
on the VHF frequencies, there wasn't anybody to interfere with!
Actually, parasitic oscillation might show up in other ways, such as
making the amplifier hard to load, and if detected this way the
builder would take steps to stabilize the circuit.
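For readers who haven't met one: the classic suppressor is a low-value
resistor in parallel with a few turns of wire wound right around it. A quick
sketch of the frequency argument (my own illustration, with assumed values,
not a design recipe from the post):

```python
import math

# Hypothetical sketch: why a parallel resistor/inductor suppressor
# damps a VHF parasitic while barely touching the HF operating
# frequency. Inductance value is an assumption for illustration.
l_henries = 0.1e-6  # ~0.1 uH: a few turns of wire around the resistor

for f_hz, label in [(7e6, "7 MHz operating"), (150e6, "150 MHz parasitic")]:
    x_l = 2 * math.pi * f_hz * l_henries  # inductive reactance, ohms
    print(f"{label}: coil reactance = {x_l:.1f} ohms")
# -> 7 MHz operating: coil reactance = 4.4 ohms
# -> 150 MHz parasitic: coil reactance = 94.2 ohms
```

At the operating frequency the coil's few ohms of reactance effectively
shorts out the resistor, so the suppressor is nearly lossless; at VHF the
coil's reactance is high enough that parasitic current is forced through the
resistor, which dissipates it and spoils the Q of the parasitic circuit.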