I note some variation in the use of the term 'Radiation Resistance' (Rr)
that suggests that it has different meanings to different folk. One suggestion is that it is the resistance seen by a transmission line connected to an antenna that expresses its coupling to distant regions of space. If that is the case, Rr would not capture energy that is lost in reflection from real ground. So, Rr would be the sum of power in the far field divided by RMS current squared. If indeed it is the "resistance seen by a transmission line", then the current above would be the current at the end of the transmission line. Does the term have an accepted single clear meaning? Is the above correct? Some implications of the above are that: - Rr of a horizontal half wave dipole with zero conductor loss, above real ground, would have Rr less than R at the feedpoint by virtue of some loss in waves reflected from real ground; - Rr of a half wave folded dipole of equal conductor diameters would be around 300 ohms. Thanks Owen |