Posted August 10th 05, 10:12 PM

From: John Smith on Tues 9 Aug 2005 22:19

Len:

I must admit, I am not aware of any of that, attempts to use HF on
power lines, or even VHF... but I have not kept up at all... that doesn't
sound wacky, it sounds impossible to me...

However, I wouldn't even attempt to get a 1.750 MHz signal down power
wiring; the capacitance between windings, shielding in all those
transformers along the way, underground power lines, ground shielding
in between windings, wiring wound around in conduit boxes, etc, etc...


I can understand all of that. However, the nature of
transmission lines is that they appear (to source and
termination) as extremely long, very-gradual-cutoff
L-C lowpass filters. If the source output impedance and
the termination input impedance are the same as the
transmission line's characteristic impedance, maximum
power transfer occurs.

The MAJOR problem with ANY transmission line environment is
DISCONTINUITIES...anything along the line path that upsets
the characteristic impedance. That is usually a physical size
change, but it can be a change in conductor material, the
addition of a dielectric into the Transverse ElectroMagnetic
(TEM) field, and similar. ANY discontinuity will produce a
reflection of the wavefront and increase the VSWR (Voltage
Standing Wave Ratio).
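
To put numbers on that, here is a minimal Python sketch (my own
illustration, not from any BPL spec) of the reflection coefficient
and VSWR where a line of characteristic impedance Z0 meets a section
or load of different impedance ZL; the ohm values are assumptions:

# Reflection coefficient and VSWR at an impedance discontinuity.
def reflection_coefficient(z0, zl):
    """Gamma = (ZL - Z0) / (ZL + Z0)."""
    return (zl - z0) / (zl + z0)

def vswr(gamma):
    """VSWR = (1 + |Gamma|) / (1 - |Gamma|)."""
    mag = abs(gamma)
    return (1 + mag) / (1 - mag) if mag < 1 else float("inf")

z0 = 400.0   # assumed open-wire line impedance, ohms
zl = 150.0   # assumed impedance where the spacing changes, ohms
g = reflection_coefficient(z0, zl)
print(f"|Gamma| = {abs(g):.2f}, VSWR = {vswr(g):.2f}")
# |Gamma| = 0.45, VSWR = 2.67 -- a substantial mismatch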

Note: I use "VSWR" where most hams use "SWR." That's
industry practice and relates to the method of measurement:
by voltage, rather than by some means of measuring power
(which would yield SWR directly). All ham "SWR" meters really
sample RF voltage in/out, so they measure VSWR; the indicators
are simply labeled "SWR" out of some odd apparent tradition.
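
Since the meters actually sample voltage, the arithmetic behind the
needle is just this; a tiny sketch with made-up forward/reflected
readings (hypothetical numbers, not from any particular meter):

# VSWR from forward and reflected voltage samples, as a
# directional-coupler "SWR" meter effectively computes it.
def vswr_from_voltages(v_fwd, v_ref):
    """VSWR = (Vf + Vr) / (Vf - Vr); valid for v_ref < v_fwd."""
    return (v_fwd + v_ref) / (v_fwd - v_ref)

print(vswr_from_voltages(10.0, 2.0))   # hypothetical readings -> 1.5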

My first degree was in EE. From what I remember, it would take a damn
idiot to expect those freqs to go any distance at all--the capacitive
loading is going to start looking like a direct short to ground, I would
expect! Especially at 80 MHz! And that, even if the modem puts out a
1 kW output! There are some remote 60 Hz users out there. The inductance
of that wiring is going to look staggering to multi-MHz signals, I would
think--no one is going to be able to control the impedance of that
feedline. Really, I would have to see it to believe it; will keep my
eyes open, now you have me interested.


Well, I "cheated" in that I took two semesters (at night)
of microwave theory and techniques because that was what
I was working in at the time. It was taught by an EE with
teaching credentials, himself a day worker in microwave
engineering at Hughes Aircraft, Culver City, CA. That gave
me terrific insight into the behavior of transmission lines,
matching, etc. Afterwards I found a McGraw-Hill Schaum's
Outline Series book on transmission lines by Robert A. Chipman
(professor of EE at the University of Toledo) with many
fascinating examples and solved problems on everything from
electric power transmission at 60 Hz to waveguide power
transmission at gigahertz frequencies. [Out of print now, but
the 230+ pages of 8 1/2 x 11 inch softbound cost a mere $4.95
in the early 1970s.] I made good use of that Chipman text in
later formal EE class work...it saved my mind from having to
"re-invent the wheel" (as most beginners do) in order to get
the beloved academic *credits*. :-(

Most residential areas in the USA have above-ground electric
power distribution. With those, it is relatively easy to just
look, estimate the conductor size and spacing, and get
a rough idea of the characteristic impedance of the two-wire
open-wire transmission line. Then look along the 4 kV
distribution route and see where the spacings change, where
the connections to step-down transformers occur (and at what
intervals), the cross-overs and half-loop jumpers where the
line has to turn corners, splices, whatever. Usually there
will be places where the line spacings deliberately come
closer or spread apart for whatever mechanical reasons. All
are discontinuities.
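
For the eyeball estimate, the standard two-wire-line formula is all
you need. A small Python sketch with guessed-at pole-line dimensions
(the spacing and conductor size are assumptions, not measurements):

import math

def z0_two_wire(spacing_m, diameter_m, eps_r=1.0):
    """Z0 = (120 / sqrt(eps_r)) * arccosh(D / d) ohms, where D is
    the center-to-center spacing and d the conductor diameter."""
    return (120.0 / math.sqrt(eps_r)) * math.acosh(spacing_m / diameter_m)

# Guessed pole-line dimensions: ~1 m spacing, ~1 cm conductors.
print(f"{z0_two_wire(1.0, 0.01):.0f} ohms")   # roughly 636 ohms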

At RF, most pole transformers present a high impedance to the
4 kV distribution line. Those can be thought of as "parallel
bridgings" (as in video signal distribution) or like the
common "capacitance tap" of older TV cable coax distribution.
They don't affect the main transmission line much at all.
From what I understand of BPL practice, the broadband
service couplers go around the pole transformer and connect
directly to the subscriber drop with appropriate HV protection,
etc. How they do that is irrelevant...like the "capacitance tap"
of a TV cable subscriber drop pick-off, it won't affect the main
distribution route. ALL THE OTHER DISCONTINUITIES WILL affect
the "characteristic impedance" of this BPL transmission line.
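
To see why a high-impedance bridge hardly matters, treat the tap as
a shunt impedance across an otherwise matched line and compute the
reflection it causes. A rough sketch only; the tap impedances are
made-up numbers, and a real pole transformer is more complicated
than a simple shunt:

def shunt_tap_reflection(z0, z_tap):
    """Reflection from a shunt (bridging) impedance z_tap across a
    line of impedance z0 that is matched beyond the tap."""
    z_parallel = (z0 * z_tap) / (z0 + z_tap)   # line beyond tap || tap
    return (z_parallel - z0) / (z_parallel + z0)

z0 = 400.0                               # assumed line impedance
for z_tap in (10_000.0, 400.0):          # high-Z bridge vs. low-Z shunt
    g = abs(shunt_tap_reflection(z0, z_tap))
    print(f"tap {z_tap:>7.0f} ohms: |Gamma| = {g:.3f}")
# A 10 k-ohm bridge reflects about 2%; a 400-ohm shunt about 33%.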

Where discontinuities exist physically, there WILL BE RADIATION
of the BPL signal sidebands. That will happen on every single
RF transmission line ever built/installed/debugged/whatever.

Now, 300 Hz to VLF is great, and there would be tolerable line
attenuation due to impedance from line inductance; the resistance of the
wire would then become one of the largest losses, if not the largest. In
special cases, where line length ended up being a resonant or
near-resonant length, you might even have a signal in need of attenuation
at the ISP. I have no idea whatsoever how "long wire antennas" of that
magnitude behave... and as a transmission line! Krist, I am worried
about how much signal I am getting through 250+ feet of aging coax!


Whether coaxial cable or open-wire line, ALL transmission lines
have attenuation that increases with increasing frequency. For new
cable, the "dB per 100 feet" column can be consulted for what the
loss was when new in your 250-foot case. Attenuation will increase
somewhat with aging but - for coax - that is due mostly to the
dielectric polymer material doing some weird polymerization
depending on its original quality and composition. There is a big
difference among polyethylenes, very little difference with
tetrafluoroethylenes (Teflon). Open-wire lines have the fewest
problems with aging, due mostly to conductor surface oxidation
(plating with another, non-oxidizing metal helps that) and the
accumulation of airborne dirt and crap (which can be cleaned off).
The problem with open-wire line is that its characteristic impedance
is so much higher than coax, and thus the RF voltages are
proportionally higher.
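
For the 250-foot worry, scaling the published spec is simple
arithmetic. A small sketch; the 1.1 dB/100 ft figure is an assumed
value, so check the actual datasheet for your cable and frequency:

def cable_loss_db(length_ft, db_per_100ft):
    """Total matched-line loss from the manufacturer's
    dB-per-100-feet figure at a given frequency."""
    return (length_ft / 100.0) * db_per_100ft

loss = cable_loss_db(250.0, 1.1)   # 1.1 dB/100 ft is an assumption
pct = 100 * (1 - 10 ** (-loss / 10))
print(f"{loss:.2f} dB total, about {pct:.0f}% of the power lost")
# ~2.75 dB -> roughly 47% of the power never reaches the far end.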

For ANY transmission line ya gotta treat the line as essentially
a broadband medium (but with a slowly increasing attenuation at
higher frequencies). When the line's characteristic impedance is
matched at both ends, everyone is happy. When it ain't, ya get
delivered-power LOSS, with the difference between input and
output power being either radiated or absorbed (in something).
Mostly that is excess RADIATION...which CAN be measured.
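
The bookkeeping for that delivered-power loss is the mismatch-loss
formula; a quick sketch reusing the reflection coefficient from the
earlier example (the |Gamma| values are illustrative):

import math

def mismatch_loss_db(gamma_mag):
    """Power not delivered to the load because of reflection:
    loss = -10 * log10(1 - |Gamma|^2), in dB."""
    return -10.0 * math.log10(1.0 - gamma_mag ** 2)

for g in (0.1, 0.45, 0.7):          # assumed reflection magnitudes
    print(f"|Gamma| = {g}: mismatch loss = {mismatch_loss_db(g):.2f} dB")
# 0.1 -> 0.04 dB, 0.45 -> 0.98 dB, 0.7 -> 2.92 dB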

With an ideal transmission line (and ideal matching front/back)
there isn't any radiation. Power loss is confined to the loss
within the line itself (also measurable). The minute you put
a discontinuity in that line, there WILL BE RADIATION.

low swr