John R. Strohm, July 9th 03, 12:13 AM

"Clifton T. Sharp Jr." wrote in message
...
I was going through my bookmarks when I came upon one about diodes
(http://uweb.superlink.net/bhtongue/7diodeCv/7diodeCv.html). In his
third paragraph, the author asserts:

For RF diode detectors to work, one needs a device that has a
non-linear V/I curve. In other words, the slope of the V/I curve
must change as a function of applied voltage. The slope must be
steeper (or shallower) at higher voltages and shallower (or steeper)
at lower voltages than at the quiescent operating point.

If there's any truth to this, someone please enlighten me. I've always
been of the belief that an envelope detector diode would be perfect
if it were a perfect switch, i.e. zero attoamps of reverse current
and perfectly linear forward current (as though the diode were a wire
during the forward conduction period). I don't see how a changing slope
during forward conduction could do anything other than distort the
demodulated waveform, especially on tiny signals.
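
For concreteness, here's what that changing slope looks like if you
plug numbers into the standard Shockley diode equation (the saturation
current and ideality factor below are assumed ballpark values, not
anything from the linked page):

import math

# Shockley equation: I = Is * (exp(V / (n*Vt)) - 1)
IS = 1e-9      # saturation current, amps -- assumed ballpark value
N = 1.8        # ideality factor -- assumed ballpark value
VT = 0.02585   # thermal voltage at about 300 K, volts

def diode_current(v):
    """Diode current at applied voltage v (Shockley equation)."""
    return IS * (math.exp(v / (N * VT)) - 1.0)

def slope(v):
    """dI/dV: local slope of the V/I curve at voltage v."""
    return (IS / (N * VT)) * math.exp(v / (N * VT))

# The slope gets steeper as the voltage rises -- the non-linearity
# the quoted paragraph is describing.
for v in (0.1, 0.3, 0.5):
    print(f"V = {v:.1f} V   I = {diode_current(v):.3e} A   dI/dV = {slope(v):.3e} S")

Running it shows dI/dV growing by orders of magnitude between 0.1 V
and 0.5 V, so the curve has no single slope you could call "the"
resistance.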


Dude, you and they are talking about the exact same thing.

Think about the V/I curve of an ideal diode. Observe that, if the diode is
reverse-biased, no conduction takes place. In principle, you can crank the
reverse voltage up as high as you like, and STILL no conduction takes place.
(In practice, you run into avalanche and Zener breakdown.) If the diode is
forward-biased, the diode conducts like crazy. There is a CORNER in the
curve, i.e. a change in the slope of the V/I curve, at NOMINALLY V=0V.
(Actually, V=0.7V for silicon, V=0.3V for germanium, V=1.7V for red LEDs,
about 2.2V for yellow, about 2.4V for green, and so on and so forth...)
That corner is exactly the change of slope the author is talking about;
your perfect-switch detector is just the limiting case in which the whole
change happens at one infinitely sharp knee.
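
A quick way to see the corner is to model the diode exactly as
described: an open circuit below the knee, a wire above it. This is a
sketch of that piecewise model; the knee voltage is the silicon figure
from the post, and the small on-resistance is an assumption so the
"wire" carries a finite current:

KNEE = 0.7   # volts -- silicon knee voltage from the post
R_ON = 1.0   # ohms -- assumed on-resistance for the conducting region

def corner_diode_current(v, knee=KNEE, r_on=R_ON):
    """Zero current below the knee, ohmic conduction above it."""
    if v <= knee:
        return 0.0               # reverse-biased or below the knee
    return (v - knee) / r_on     # forward-biased: conducts like crazy

# The slope of the V/I curve jumps from 0 to 1/r_on at the knee --
# that jump is the corner, i.e. the change in slope.
for v in (-10.0, 0.0, 0.5, 0.7, 1.0, 2.0):
    print(f"V = {v:+5.1f} V   I = {corner_diode_current(v):.3f} A")

The printout shows zero current all the way up to the knee and a
straight line after it; the only place the slope changes is the corner
itself, and that one change is all the non-linearity a detector needs.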