May 3rd 06, 08:58 PM posted to rec.radio.amateur.antenna
K7ITM
 
Measuring quarter wave cable length with HP 8405A

Something is not as it seems. I think I understand your setup: the
ends of the 75 ohm lines both see 50 ohms when you're measuring, in
both the zero-degree "calibration" phase and with the 50 ohm line
inserted for the measurement... assuming it really is 50 ohm line
(though that shouldn't make a difference anyway). I assume you've
tried swapping probes, and you see the 60 degree phase shift in the
opposite direction? The phase shift of 30 degrees going from 145 to
180 doesn't sound right: it sounds like the phase shift across a
resonance, not the phase shift of a length of line versus frequency.
What happens if you take your phase readings at several points across
a range of frequencies, say 100 MHz to 250 MHz? What happens if you
set up as in your first "calibration" phase, with the two probes at
the outer ends of the two 75 ohm sections, and then compare putting
the 50 ohm load directly on (say) the B channel against putting the
50 ohm line section between the BNC T and the load? Things shouldn't
change when you do that. And finally, what is the impedance looking
into the probe?
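
Here's a quick sketch in Python of what a real line section should
do versus frequency -- the phase grows linearly. The 0.5 m length
and 0.66 velocity factor are my assumptions, so plug in your own
numbers:

C0 = 299_792_458.0   # speed of light in vacuum, m/s
LENGTH_M = 0.5       # assumed physical length of the line section
VF = 0.66            # assumed velocity factor (solid-PE coax)

# Electrical length of a matched line: phi = 360 * f * L / (VF * c)
for f_mhz in (100, 145, 180, 250):
    phi = 360.0 * (f_mhz * 1e6) * LENGTH_M / (VF * C0)
    print(f"{f_mhz:3d} MHz: {phi:6.1f} degrees")

If your readings march up smoothly like that, you're seeing line
delay; if they lurch through 30 degrees over a narrow band, something
is resonating.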

The stubs represented by the BNC Ts will cause some loading, but I
don't think you could account for 30 degrees that way.
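
As a rough bound (the 1 pF here is my guess at the T's stub
capacitance, not a measured value): a small shunt C on a matched
50 ohm line gives a transmission phase error of atan(w*C*Z0/2),
which stays down around a degree or two:

import math

Z0 = 50.0
C_TEE = 1e-12   # assumed stub capacitance of the BNC T, farads

# Through a shunt admittance Y on a matched line, S21 = 2/(2 + Y*Z0);
# with Y = j*w*C the transmission phase error is atan(w*C*Z0/2).
for f_mhz in (100, 150, 250):
    w = 2 * math.pi * f_mhz * 1e6
    err = math.degrees(math.atan(w * C_TEE * Z0 / 2))
    print(f"{f_mhz:3d} MHz: {err:4.2f} degrees")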

Do you have an independent way to check the phase accuracy of your
meter? Say, a 10 ohm resistor across one probe, and a small
capacitance--4.7 pF or 10 pF or so--from the generator [monitored by
the other probe] to the 10 ohm resistor. That should give you a known
phase shift close to 90 degrees, if you know the capacitance
reasonably accurately. You should certainly be able to know it to a
lot better than 30 degrees.
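
The arithmetic for that check, if it helps (frequencies picked
arbitrarily; this ignores the probe's few pF of input capacitance,
which matters little across 10 ohms):

import math

# Voltage across the 10 ohm resistor, driven through a small series
# C: V_R/V_gen = jwRC/(1 + jwRC), so the lead is 90 - atan(wRC) deg.
R = 10.0
for c_pf in (4.7, 10.0):
    for f_mhz in (100, 150, 250):
        wrc = 2 * math.pi * f_mhz * 1e6 * R * c_pf * 1e-12
        lead = 90.0 - math.degrees(math.atan(wrc))
        print(f"C = {c_pf:4.1f} pF, {f_mhz:3d} MHz: {lead:5.1f} deg")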

Someone else mentioned taking velocity factor into account, but if you
had measured off half a meter of line, the velocity factor would give
you MORE phase shift, not less.
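
Two lines make the point (150 MHz and VF = 0.66 picked only for
illustration):

C0 = 299_792_458.0
for vf in (1.0, 0.66):   # free space vs. an assumed cable VF
    phi = 360.0 * 150e6 * 0.5 / (vf * C0)
    print(f"VF = {vf:4.2f}: {phi:5.1f} degrees for 0.5 m at 150 MHz")

The slower wave in the cable packs more degrees into the same half
meter, so velocity factor can only make the expected shift bigger.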

Cheers,
Tom