OK, I apparently drifted off the beaten path, plus I seem to be experiencing some serious brain fade. What I was doing was preparing a short presentation for new hams on the subject of vertical antennas, and I was using EZNEC to produce some antenna pattern graphics. That was when I noticed that when I overlaid the pattern of a vertical half-wavelength dipole with that of a horizontal half-wave dipole at the same center height over real ground, the vertical's pattern was completely enclosed by the horizontal dipole's pattern, at least broadside to the horizontal dipole. The vertical dipole pattern definitely showed a lower angle of peak radiation, but no greater gain at low angles than the horizontal dipole. At first, seeing the vertical dipole's gain equal to the horizontal dipole's, even at low elevation angles, was a little confusing, but I had just scanned some text on vertical antenna operation, including calculations for reflected waves and things like the pseudo-Brewster angle.

But then I remembered talking to a couple of guys in Germany on 75 meters the previous evening. From my location here in Missouri, I was hearing their signals on my 75 meter inverted L much stronger, at 10 to 20 over S9, than on my dipole at S4 to S5. They noted the same difference in performance between the two antennas. Since both the dipole and the top of the inverted L were at 50 feet, I thought this was a reasonable comparison. Also, the dipole is in the clear, resonant, and has been performing as well as or better than other horizontal dipoles used by hams in this area. Furthermore, my experience switching between horizontal and vertical antennas on 75 meters matches that of other guys who have both.

I was starting to wonder about the mismatch between the theory I was familiar with and my experience. Jumping back into the textbooks and spending some time googling for more info, I found nothing that conflicted with the material I had previously covered. That was disconcerting.
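For anyone who wants to poke at the reflection side of this, here's a minimal sketch of the vertically polarized Fresnel reflection coefficient over lossy ground and the resulting pseudo-Brewster angle on 75 meters. The ground constants (eps_r = 13, sigma = 0.005 S/m, "average ground") and the 4 MHz frequency are my assumptions, not anything from EZNEC or the post:

```python
import numpy as np

# Assumed values for illustration: "average ground" constants at 4 MHz (75 m band)
f = 4.0e6                       # frequency, Hz (assumption)
lam = 3.0e8 / f                 # wavelength, m
eps_r, sigma = 13.0, 0.005      # typical "average ground" constants (assumption)

# Complex relative permittivity of the ground: eps_r - j*60*sigma*lambda
n2 = eps_r - 1j * 60.0 * sigma * lam

# Elevation (grazing) angles from 1 to 89 degrees
psi = np.radians(np.arange(1, 90))

# Fresnel reflection coefficient for vertical polarization
root = np.sqrt(n2 - np.cos(psi) ** 2)
gamma_v = (n2 * np.sin(psi) - root) / (n2 * np.sin(psi) + root)

# Pseudo-Brewster angle: the elevation angle where |Gamma_v| is minimum.
# Below this angle the reflected wave largely cancels the direct wave,
# which is why a vertical's low-angle gain over real ground is limited.
pba = np.degrees(psi[np.argmin(np.abs(gamma_v))])
print(f"pseudo-Brewster angle ~ {pba:.0f} degrees")
```

With these assumed constants the minimum of |Gamma_v| falls somewhere in the low teens of degrees, which is the usual ballpark quoted for average ground on the low bands; better ground (or salt water) pushes it lower and reduces the cancellation.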
About the only glimmer of a solution popped up when I looked at papers on ground (surface) wave propagation. There were some vague comments about diffraction suggesting that one of the loss factors in ground wave propagation is that some of the signal does not get diffracted low enough to keep it from being 'lost' to sky wave radiation. As I chased that thought, I found that discussions of sky wave propagation ignored ground wave, and discussions of ground wave propagation treated sky wave as lost RF.

Now, after all that windup, what am I missing? I acknowledge ahead of time that I may be a dummy, so don't bother explaining that to me. Why do reasonable-size vertical antennas with proper radial systems under them outperform horizontal dipoles for DX operation at typical ham antenna support heights of 50 feet or so? The interesting question, then: Is the improved performance of vertical antennas over horizontal dipoles on 75 meters at DX distances due to a combination of direct radiation plus radiation from the ground in the area of strong ground wave strength, out to hundreds of meters? Is ground wave leakage providing additional low-angle signal strength on both transmit and receive?

Gary - N0GW