September 11th 09, 07:07 PM, posted to rec.radio.amateur.antenna
Dave
Question on re-radiated field


"Antonio Vernucci" wrote in message
.. .
Can someone please confirm or refute the following reasoning?

Let us have:

- a transmitting system operating at any given frequency
- and a metal bar, located far away from the transmitter, whose electrical
length is exactly one half wavelength at the operating frequency.

An RF current will be induced in the bar. That current produces a
re-radiated field which adds to the field generated by the transmitter.

Two questions:

- What are the amplitude and phase of the re-radiated field with respect
to those of the field generated by the transmitter? My instinctive answer
would be the same amplitude (in the absence of ohmic losses) and a
180-degree shift, so that the total field (transmitted + re-radiated) at
the metal bar is zero.
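
To put rough numbers on the amplitude, here is a quick Python sketch using
the textbook figures for a thin resonant half-wave dipole (effective length
lambda/pi, radiation resistance about 73 ohms). The frequency and incident
field strength are arbitrary assumptions, and ohmic losses are ignored:

    import math

    f = 14e6               # operating frequency, Hz (arbitrary assumption)
    lam = 3.0e8 / f        # wavelength, m
    E_inc = 1e-3           # incident field at the bar, V/m (arbitrary assumption)

    l_eff = lam / math.pi  # effective length of a thin half-wave dipole, m
    R_rad = 73.0           # its radiation resistance, ohms (losses ignored)

    V_oc = E_inc * l_eff   # open-circuit voltage induced in the bar
    I0 = V_oc / R_rad      # current at resonance (the reactance cancels out)

    # Broadside far field of a half-wave dipole: E = 60 * I / r  (V/m, r in m)
    for n in (1, 2, 5, 10):
        r = n * lam
        E_scat = 60.0 * I0 / r
        print(f"r = {n:2d} wavelengths: E_scat/E_inc = {E_scat / E_inc:.3f}")

The ratio comes out to roughly 0.26/n whatever frequency and field strength
one assumes, i.e. the re-radiated field rivals the incident one only within
a fraction of a wavelength of the bar, which squares with the tangential
field being forced to zero right at the conductor surface.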

- How does the total field change as one moves away from the bar? I would
say that while the field generated by the transmitter varies very slowly
with distance from the bar (the transmitter is assumed to be very far
away), the re-radiated field varies quickly (also because one starts out
in the bar's near field). In conclusion, the farther one moves from the
bar, the smaller the contribution of the re-radiated field to the total
field, as the sketch below suggests. That should be the reason why, in a
Yagi antenna, a parasitic element cannot be placed too far from the driven
element.
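
As a crude check of that trend, one can superpose the two fields in Python,
reusing the roughly 0.26-at-one-wavelength amplitude from the sketch above
and assuming a fixed 180-degree phase for the re-radiated term. Both
numbers are assumptions; in reality the phase also advances by 360 degrees
per wavelength of path, and true near-field terms fall off faster than 1/r,
but the magnitude trend is the point here:

    import cmath

    K = 0.26  # assumed |E_scat|/|E_inc| at a distance of one wavelength

    for r_wl in (0.3, 0.5, 1.0, 2.0, 5.0, 10.0):
        # Fixed 180-degree phase is an assumption; see the caveats above.
        E_scat = (K / r_wl) * cmath.exp(1j * cmath.pi)
        E_tot = 1.0 + E_scat  # incident field normalized to 1
        print(f"r = {r_wl:4.1f} wavelengths: |E_total|/|E_inc| = {abs(E_tot):.2f}")

Beyond a wavelength or two the bar's contribution is down to a few percent
of the total, which is consistent with parasitic elements sitting within a
fraction of a wavelength of the driven element.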

Thanks and 73

Tony I0JX


Sounds like you have the right instincts to me.