K7ITM wrote:
On Jan 31, 12:31 pm, Cecil Moore wrote:
K7ITM wrote:
To me, having a
linear power scale is a big advantage, because then you can reasonably
accurately figure SWR without having to worry about temperature
compensation of the detectors.
Can you define what you mean by linear? Straight line?
Since we can only measure voltage and current, in order
to obtain a linear power scale from a linear meter, it
is necessary to supply some pre-display computing
ability (microcomputer).
--
73, Cecil http://www.w5dxp.com
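As an aside, the "pre-display computing" Cecil mentions amounts to squaring the measured voltage. A minimal sketch in Python, assuming a purely resistive 50-ohm load (the names are just illustrative):

LOAD_OHMS = 50.0  # assumed purely resistive load

def power_watts(v_rms):
    """P = V^2 / R: the squaring a linear voltage meter can't do by itself."""
    return v_rms ** 2 / LOAD_OHMS

print(power_watts(22.36))  # about 10 W across 50 ohms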
And what makes one think that a standard meter is linear (unless your
standards are +/-10%)?
See earlier posting in this thread. See various Avago app notes, such
as AN 969. A diode detector run at low input provides an output DC
voltage that's a constant times the square of the input RF voltage.
If the input is across, or is assumed to be across, a constant
resistive load impedance, the DC output is linear with RF power
input.
Not necessarily true for a coupler..
The proportionality is temperature dependent, but if two
detectors are constructed the same and run at the same temperature,
and run in the signal level region where that relationship holds, then
the ratio of the output DC voltages is a very good approximation of
the ratio of the input RF power levels, and thus is useful for finding
the SWR if the detectors are attached to the forward and reverse ports
of a good directional coupler.
where "good" is the operative word
Top end of the useful "linear power"
range using an HSMS-2850 single diode detector is about 10mV DC
output. If you can measure the DC accurately down to 1uV (a bit
tough, given thermal emfs, but possible), that gives you about a
10000:1 power range, or 100:1 RF input voltage range -- or about
1.02:1 SWR. Chances are very good that a home-built coupler won't be
accurately enough matched to 50+j0 ohms to worry about anything that
low anyway, even if you had a reason to care about it.
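Spelled out with the numbers from that paragraph (a quick sanity check, nothing more):

import math

v_dc_max = 10e-3  # top of the square-law range, per the paragraph above
v_dc_min = 1e-6   # optimistic DC measurement floor

power_range = v_dc_max / v_dc_min        # 10000:1 (DC ratio = power ratio)
voltage_range = math.sqrt(power_range)   # 100:1 RF voltage ratio
rho = 1 / voltage_range                  # smallest resolvable |rho|
print(round((1 + rho) / (1 - rho), 4))   # SWR floor: about 1.0202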
Which is why "real instruments" have some form of calibration. With
today's technology, there's really no excuse for not putting some form
of calibration into the logic between measurement and display, unless
all you're looking for is the RF equivalent of a battery and test lamp.
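One simple form that calibration logic could take is a lookup table from detector reading to power, captured against a known reference and interpolated at run time. A sketch only; the breakpoints below are made up for illustration:

# Table of (detector DC volts, RF power in watts) pairs, filled in at
# calibration time against a known reference.  Values are invented.
CAL_TABLE = [(0.001, 0.10), (0.010, 1.0), (0.100, 12.0), (1.000, 130.0)]

def calibrated_power(v_dc):
    """Piecewise-linear interpolation through the calibration points."""
    pts = sorted(CAL_TABLE)
    if v_dc <= pts[0][0]:
        return pts[0][1]
    for (v0, p0), (v1, p1) in zip(pts, pts[1:]):
        if v_dc <= v1:
            return p0 + (p1 - p0) * (v_dc - v0) / (v1 - v0)
    return pts[-1][1]

print(calibrated_power(0.055))  # interpolates between the 0.010 and 0.100 points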
Heck, if you MUST use an all-analog design and you're at less than 3 GHz,
don't fool with diodes; use the less expensive, more sensitive, and more
accurate power-measuring chips from Analog Devices.
Example: AD8310, DC-440 MHz, 90+ dB dynamic range (-91 to +4 dBm), linear to
0.4 dB, stable over temperature (-40 to +85 C) to +/-1 dB.
Or the AD8319: 1 MHz to 10 GHz, 40 dB range, similar accuracy.
They also come in dual versions and in versions with phase comparators.
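For what it's worth, reading one of these log detectors is just slope-and-intercept arithmetic, since the output voltage is linear in dB. A rough sketch; the slope and intercept below are assumed datasheet-typical values for the AD8310, and a real instrument would calibrate them against a known source:

# Assumed typical AD8310 constants -- check the datasheet and calibrate
# the actual part; these are illustrative, not authoritative.
SLOPE_V_PER_DB = 0.024    # roughly 24 mV/dB
INTERCEPT_DBM = -108.0    # nominal log intercept

def dbm_from_logamp(v_out):
    """Convert log-amp output voltage to input power in dBm."""
    return v_out / SLOPE_V_PER_DB + INTERCEPT_DBM

def dbm_to_watts(p_dbm):
    return 10 ** (p_dbm / 10) / 1000.0

print(round(dbm_from_logamp(2.4), 1))  # 2.4 V out -> about -8 dBm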
These days, there are relatively few applications where a straight diode
detector would be better:
1) Absolute lowest cost in mass manufacturing with relaxed performance
requirements (so you can use a really cheap silicon diode).
($0.10 for a diode vs. $3 for a chip is a $30 difference in the retail
list price.)
2) Very fast rise-time requirements.
The chips have response times in the 10 ns and slower range. The right
diode in the right mount can get sub-nanosecond.
3) Frequencies above 10 GHz.
The chips don't go there yet.
Cheers,
Tom