Old August 28th 03, 03:17 PM
xpyttl
 
Posts: n/a
Default

The advice about current is good. Be particularly watchful of the specs on
odd-colored LEDs: white and blue LEDs often require a higher forward voltage
before they begin emitting, and many of the brighter LEDs draw somewhat more
current.

If you are limiting the current with a simple series resistor, assume the
LED will drop its specified forward voltage, and select the resistor to drop
the remainder of the supply voltage at the specified current. The problem with
this approach is that you often end up dropping several volts, so you can
dissipate a lot of power in the resistor. It can be a little startling
to see a little LED fry a resistor.
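The resistor math above is just Ohm's law. Here's a quick sketch; the supply
voltages and the LED forward voltage (2.0 V at 20 mA, typical of a red LED)
are illustrative assumptions, not values from the post:

```python
# Series resistor sizing for an LED.
# R drops whatever the LED doesn't; P is what the resistor must dissipate.

def led_resistor(v_supply, v_forward, i_led):
    """Return (resistance in ohms, resistor power in watts)."""
    v_drop = v_supply - v_forward   # volts the resistor must drop
    r = v_drop / i_led              # Ohm's law: R = V / I
    p = v_drop * i_led              # power burned in the resistor
    return r, p

# 5 V supply, red LED at Vf = 2.0 V, 20 mA:
print(led_resistor(5.0, 2.0, 0.020))    # about 150 ohms, 0.06 W

# 13.8 V supply (typical rig supply), same LED:
print(led_resistor(13.8, 2.0, 0.020))   # about 590 ohms, 0.24 W
```

Note the second case: at 13.8 V the resistor dissipates nearly a quarter watt,
which is right at the limit of a common 1/4 W part; that's the "fry a
resistor" scenario.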

Unlike incandescent lamps, which fail abruptly, an LED has a half-life that
depends (in a very non-linear way) on the current. A typical LED, run
at the manufacturer's specs, will lose half its intensity after about 5
years of on time. Decreasing the current improves this somewhat, but
not a lot. On the other hand, increasing the current shortens the life
a lot. There is a sort of avalanche effect where a little too much current
will quickly fry the LED. It can be quite surprising how much current an
LED can suck up, for a short period of time(!).

When you are close to the specification current, changing the current a
little one way or the other doesn't affect the output all that much.
This is handy if you are concerned with, for example, battery life: by
sacrificing a little light output you can often save quite a bit of current.
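The battery-life trade-off is simple arithmetic. A rough sketch, using an
assumed 1000 mAh battery and assumed currents (none of these numbers are
from the post), with runtime idealized as capacity divided by draw:

```python
# Idealized battery runtime: ignores self-discharge, voltage sag, and the
# fact that the series resistor's drop changes as the battery runs down.

def hours_of_life(capacity_mah, current_ma):
    """Runtime in hours for a given average current draw."""
    return capacity_mah / current_ma

# 1000 mAh battery, LED at its 20 mA spec vs. backed off to 15 mA:
print(hours_of_life(1000, 20))   # 50.0 hours
print(hours_of_life(1000, 15))   # about 66.7 hours
```

Backing the current off 25% buys roughly a third more runtime, while the
visible drop in brightness near the spec current is modest.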

...

"W7TI" wrote in message
...
On Wed, 27 Aug 2003 20:25:19 -0700, "bobinphx"
wrote:

To All, I need to understand a few things about LED's, such as amperage,