September 12th 05, 01:11 AM
Wayne P. Muckleroy

At 2 W, the RF power is about +33 dBm. The most likely places for RF leakage
are the cable's connector ends and the termination's own connector. Suppose
the leakage is 80 dB down. That would make the radiated power:

P = +33 dBm - 80 dB = -47 dBm
(This is right at the connector.)
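
To sanity-check the arithmetic, here is a quick Python sketch of the same
conversion (the 2 W transmit power and the 80 dB leakage figure are just the
assumptions stated above):

import math

p_tx_watts = 2.0
p_tx_dbm = 10 * math.log10(p_tx_watts * 1000)  # 2 W is about +33 dBm
leakage_db = 80.0                               # assumed connector leakage
p_leak_dbm = p_tx_dbm - leakage_db              # about -47 dBm at the connector
print(p_tx_dbm, p_leak_dbm)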

The field intensity falls off as the reciprocal of the distance squared.
Since -47 dBm is about 20 nW, and working in MKS units with 100 feet equal
to roughly 30 meters:

P (at 100 feet) = 20 nW divided by (30) squared

This is about 20 pW, or -77 dBm.
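
And the spreading step in Python, using the same rough 1/r^2 approximation
and the 100 ft to 30 m conversion (a rigorous free-space estimate would also
account for the 4*pi factor and the receive antenna's aperture, but this
sticks to the numbers above):

import math

p_leak_dbm = -47.0
p_leak_watts = 10 ** (p_leak_dbm / 10) / 1000    # about 20 nW
r_meters = 100 * 0.3048                          # 100 feet is about 30 m
p_far_watts = p_leak_watts / r_meters ** 2       # about 20 pW
p_far_dbm = 10 * math.log10(p_far_watts * 1000)  # about -77 dBm
print(p_far_watts, p_far_dbm)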

A good receiver can detect signals down to about -120 dBm (I think).

A signal level of -77 dBm is well within the sensitivity of such a receiver,
with more than 40 dB of margin.
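
Comparing the two figures directly (the -120 dBm sensitivity is only a rough
guess, as noted):

p_far_dbm = -77.0
sensitivity_dbm = -120.0
margin_db = p_far_dbm - sensitivity_dbm  # roughly 43 dB of margin
print(margin_db)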

The short answer is that detecting the signal 100 feet away is quite likely
with a typical termination load.

"CD" wrote in message
oups.com...
Hi all,

I'm curious about dummy loads. I read that ideally a dummy load converts
RF energy into heat, but I'm sure that in the real world there will still
be a small amount of RF energy radiated.

My friend tried using a dummy load on a 300 W transmitter. He only ran it
for a few minutes in the VHF range, but he was still picking up a
signal 100 ft away. Is that typical?

What's your experience with dummy loads? Have you been able to pick up a
signal at that kind of distance as well?

I don't know much about dummy loads, but I think 100 ft is just a
tad too far, no?