Measuring antenna loss: Heat balance?
Hi Jim,
Thanks for the thoughts; I hadn't thought of many of the additional loss
mechanisms you mention.
"Jim Lux" wrote in message
...
In the subject case here, think of this: say you had a 2cm diameter copper
bar and you run 100 Amps of DC through it. The current is distributed
evenly, as is the power dissipation. Now run 1 MHz RF through that same bar.
The skin depth is about 0.065 mm, so virtually ALL the RF current is
contained within a layer less than 1/3 mm thick. That's a very different
heating and temperature distribution (sort of like the difference between
putting that thick steak in the 200 F oven and throwing it on the blazing hot grill).
If you're just looking at surface temperature (i.e., with a thermal camera),
will it take more or less power at 1 MHz to obtain a given surface temperature
increase than at DC?
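
Before I take a stab at that, a quick back-of-the-envelope check of the
skin-depth figure, just to convince myself (the resistivity and permeability
are the usual textbook values for annealed copper; my numbers, not from your
message):

import math

rho = 1.68e-8              # ohm-m, resistivity of annealed copper (textbook value)
mu0 = 4 * math.pi * 1e-7   # H/m; copper is essentially non-magnetic
f = 1e6                    # Hz

# standard skin-depth formula: delta = sqrt(rho / (pi * f * mu))
delta = math.sqrt(rho / (math.pi * f * mu0))
print(f"skin depth at 1 MHz: {delta * 1e3:.3f} mm")        # ~0.065 mm

# current density falls off roughly as exp(-x/delta), so a layer about
# five skin depths deep (~0.33 mm) carries all but ~1% of the RF current
layer = 5 * delta
fraction = 1 - math.exp(-layer / delta)
print(f"fraction of current within {layer * 1e3:.2f} mm: {fraction:.3f}")

So the 0.065 mm and "less than 1/3 mm" figures check out.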
At DC, since you're heating the entire bar, and the only way for the heat
to go is "out" to the surface... I'm thinking less power is needed for a
given rise?
If so, that would overestimate the antenna efficiency.
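
To put some rough numbers on my hand-waving, here's a crude lumped-capacitance
comparison of how fast the heated region warms at first for the same dissipated
power. The 50 W, the 1 m length, and the neglect of conduction into the core and
of surface losses are all my own simplifications, so this only speaks to the
initial transient, not the steady state the camera would eventually settle at:

import math

rho_cu = 8960.0   # kg/m^3, density of copper
c_cu = 385.0      # J/(kg*K), specific heat of copper
r = 0.01          # m, bar radius (2 cm diameter)
length = 1.0      # m, assumed bar length
delta = 6.5e-5    # m, skin depth at 1 MHz (~0.065 mm)
P = 50.0          # W, assumed dissipation, same in both cases

vol_bar = math.pi * r**2 * length                          # DC heats the whole bar
vol_skin = math.pi * (r**2 - (r - 5 * delta)**2) * length  # RF heats ~5 skin depths

for name, vol in [("DC (whole bar)", vol_bar), ("1 MHz (skin layer)", vol_skin)]:
    heat_cap = rho_cu * c_cu * vol    # J/K of the region where power is deposited
    rate = P / heat_cap               # K/s initial rise, conduction and losses ignored
    print(f"{name}: initial rise about {rate:.2f} K/s")

That says the skin layer initially warms an order of magnitude faster than the
whole bar does at DC; whether any of that difference survives once conduction
into the core and the surface losses reach equilibrium is exactly the part I'm
unsure about.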
---Joel