No antennae radiate all the power fed to them!
rickman wrote in :
Hmmm... All things emit energy according to their temperature and their
surface emissivity. All things also absorb energy according to their
surface emissivity. Both processes are going on at all times. So an
object loses or gains heat depending on its temperature and the
temperature of the environment. That delta temperature sets the rate
along with the surface emissivity.
Ok, that works for me. I guess the rate of change is exponential (at least
approximately, once the temperature difference is small), just like the
energy loss in a fading note from a stretched string, asymptotically
reaching equilibrium when it can no longer lose net energy to ambient
conditions.
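For what it's worth, here's the back-of-envelope picture I have in mind,
written out as a little Python sketch (my own made-up numbers, assuming a
small grey body facing large surroundings, so the net exchange follows the
Stefan-Boltzmann form eps*sigma*A*(T_obj^4 - T_env^4)):

    # Back-of-envelope sketch: net radiative exchange for a small grey
    # body in large surroundings, stepped forward in time.  All the
    # numbers (area, mass, emissivity, heat capacity) are made up.
    SIGMA = 5.670e-8        # Stefan-Boltzmann constant, W/(m^2 K^4)
    eps   = 0.9             # emissivity (and absorptivity) of the surface
    area  = 0.01            # m^2
    mass  = 0.1             # kg
    c_p   = 900.0           # J/(kg K), roughly aluminium

    def net_radiated_power(T_obj, T_env):
        # Emission and absorption both go on at all times; only the
        # difference of the T^4 terms matters for the net heat flow.
        return eps * SIGMA * area * (T_obj**4 - T_env**4)

    T, T_env, dt = 350.0, 300.0, 1.0    # start 50 K above surroundings
    for _ in range(20000):
        T -= net_radiated_power(T, T_env) * dt / (mass * c_p)
    print(round(T, 2))                  # has crept most of the way to 300 K

Once the difference is small, the difference of the T^4 terms is nearly
proportional to the temperature difference itself, so the tail of the
approach does look roughly exponential, like the fading note.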
About warming of superconductors out there, I may be wildly underestimating
the effect of a difference of 77 K. What I'd thought of was that if a
superconductor can only operate at a very low temperature, its thermal
emission will be low; perhaps so low that it might take very little input
(from whatever source, I know not what, and especially so if its emissivity
is high, making absorption easy) to overwhelm that balance and stop it
staying cold enough. My difficulty comes from not being sure whether a
difference of 77 K means the same thing at cryogenic temperatures as it
does around room temperature, because temperature isn't a featureless
continuum where a degree counts the same everywhere. I was thinking that,
because it is so cold, small amounts of heat lost from other equipment
might find their way to a superconductor and cause bother in the absence
of forced cooling. I can't really imagine any use of superconductors in
space that would not include the risk of local heat sources.
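To put a rough number on that worry (just the T^4 scaling, nothing specific
to any real superconductor or spacecraft):

    # Rough T^4 comparison of how much a surface radiates at 77 K versus
    # at room temperature.  Purely illustrative; emissivity and area
    # cancel out of the ratio anyway.
    SIGMA = 5.670e-8                     # W/(m^2 K^4)

    def radiated_per_m2(T):
        return SIGMA * T**4              # ideal black surface, W/m^2

    print(radiated_per_m2(77.0))         # ~2 W per square metre
    print(radiated_per_m2(300.0))        # ~460 W per square metre
    print(radiated_per_m2(77.0) / radiated_per_m2(300.0))   # ~0.004
    # A 77 K surface sheds well under 1% of what a room-temperature one
    # does, while it still absorbs whatever warmer equipment nearby
    # radiates at it, so even a small local heat leak looms large
    # without forced cooling or shielding.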