November 27th 03, 09:03 PM
Dave Platt
> The battery measured 12.7V both with and without the charger connected.
> So the charger (putting out 13.7V and 500mA) doesn't have enough juice,
> er current, to change this. So right now 500mA is being converted to
> heat.


I think there may be another interpretation possible.

What does your charger design look like? When it's in a "no load"
situation, and when you're measuring 13.7 volts, is there actually
enough load on the regulator IC's output to ensure proper regulation?
If I recall correctly, an LM317 requires a minimum of 10-20 mA of
load on its output to regulate correctly - without this, the output
voltage creeps up above what you'd expect. It's possible that under
light-to-moderate load (say, 100 mA) your regulator's output voltage
is dropping well below 13.7 and might need to be adjusted.
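
Just to put numbers on the setpoint, here's a quick sketch assuming the
standard LM317 two-resistor configuration. The resistor values are purely
illustrative, not necessarily what's in your circuit:

# Standard LM317 output-voltage formula: Vout = Vref*(1 + R2/R1) + Iadj*R2
# R1 sits between OUT and ADJ, R2 between ADJ and ground.
Vref = 1.25      # volts, nominal internal reference
Iadj = 50e-6     # amps, typical ADJ-pin current
R1 = 240.0       # ohms, common datasheet value
R2 = 2370.0      # ohms, illustrative value chosen for ~13.7 V out

Vout = Vref * (1.0 + R2 / R1) + Iadj * R2
print("Nominal output: %.2f V" % Vout)          # about 13.7 V

# The divider itself only bleeds roughly Vref/R1 = 5 mA from the output,
# so depending on the part's minimum-load spec you may still want an
# additional bleed resistor to keep the no-load voltage from creeping up.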

If you haven't already done this, try the following: stick a
reasonable resistive load on the charger (maybe 30 ohms 5 watts) so
that you're actually drawing an appreciable fraction of the charger's
normal output, and then readjust to 13.7. Also, use an ammeter to
make sure that the regulator is actually working correctly and is
truly delivering the amount of current you expect.
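
Rough numbers for that test load, using the 30-ohm suggestion above:

# Ohm's-law check on the suggested resistive test load.
V = 13.7                 # volts, target output under load
R = 30.0                 # ohms, suggested test resistor

I = V / R                # load current, about 0.46 A
P = V * V / R            # power in the resistor, about 6.3 W

print("Load current: %.0f mA" % (I * 1000))
print("Resistor dissipation: %.1f W" % P)

# ~460 mA is a healthy fraction of the charger's 500 mA rating, which is
# the point. Note the resistor dissipates over 6 W, so a 5 W part will
# run quite hot -- use a 10 W resistor or keep the test brief.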

Oh... did you heatsink the regulator? The regulator might be limiting
the current flow (by dropping the output voltage) in order to protect
itself.
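
As a sanity check on the heat: the regulator dissipates (Vin - Vout) times
the output current. The 18-volt raw input below is only an assumed example;
plug in whatever your transformer/rectifier actually delivers:

# Linear-regulator dissipation estimate. Vin is an assumed value.
Vin = 18.0     # volts, assumed unregulated input (illustrative only)
Vout = 13.7    # volts, regulator output
Iout = 0.5     # amps, full rated charger current

P_reg = (Vin - Vout) * Iout
print("Regulator dissipation: %.1f W" % P_reg)   # about 2.2 W here

# A TO-220 LM317 with no heatsink can only shed a watt or two before its
# internal thermal protection starts folding the output back, so a couple
# of watts is plenty of reason to bolt on a heatsink.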

I don't think that 500 mA is being converted to heat. I think it's
actively charging the battery, which is probably at least somewhat
"run down". The time you'd see the power being dissipated as heat,
would be when the charger's output had risen up to 13.7 and the
battery was truly being "floated".

I suspect that you've looked at the situation shortly after connecting
the charger to the battery, while the charger was actively charging
the battery to overcome the previous amount of discharge. If you were
to leave the charger connected for a few hours or days, I believe
you'd see that the battery terminal voltage had risen to 13.7 volts,
and that the charger was delivering rather less than its maximum
amount of current. This would be the "battery is fully charged, and
is now being floated" state.

As an example: I have a 45-amp-hour glassmat battery, hooked to a
well-regulated charger (13.5 volts) which is powered from a 450 mA
solar panel. If I hook up the battery after a period of moderate use,
what I see is:

- Before hookup, the battery voltage is somewhere down around 12.3
volts.

- Upon hookup, the charger begins drawing maximum current from the
solar panel. The battery voltage jumps up to around 12.6 volts.
The charger turns on its "I am not limiting the voltage, as the load
is drawing more than my input can supply" light. [If I use a 3-amp
bench supply in place of the solar panel, the battery draws the
full 3 amps at least briefly.]

- Gradually, over a period of an hour or more, the battery voltage
rises upwards, and the current being drawn from the panel slowly
decreases.

- After a few hours, the battery voltage rises to 13.5. The charger
switches into "voltage regulation" mode.

- The current continues to drop off, flattening out to a few tens of
mA after a while and remaining there.

I believe that if you monitor your charger and battery for a period of
time, you will see a very similar pattern of behavior.
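
If it helps to picture the taper, here's a very crude numerical sketch of
that behavior. It treats the battery as a big capacitor behind a lumped
series resistance, fed by a current-limited, voltage-limited charger; the
real chemistry is messier and every number below is made up for
illustration, not measured:

# Crude charge-taper model: constant current until the terminal voltage
# reaches the float setpoint, then constant voltage with tapering current.
V_FLOAT = 13.5     # charger setpoint, volts
I_MAX   = 0.45     # charger current limit, amps
R_INT   = 0.5      # assumed lumped internal/surface-charge resistance, ohms
C_EQ    = 6000.0   # assumed "equivalent capacitance" of the battery, farads

v_cell = 12.3      # open-circuit voltage of a moderately discharged battery
dt = 60.0          # simulation step, seconds

for step in range(8 * 60):                       # simulate eight hours
    i = max(0.0, min(I_MAX, (V_FLOAT - v_cell) / R_INT))
    v_term = v_cell + i * R_INT                  # what a meter at the terminals sees
    if step % 60 == 0:                           # report once per hour
        print("t=%d h  V_term=%.2f V  I=%.0f mA" % (step // 60, v_term, i * 1000))
    v_cell += i * dt / C_EQ                      # accumulate charge

# The printout shows the terminal voltage stepping up on hookup, climbing
# for a few hours at full current, then pinning at the float setpoint while
# the current tapers away. (A real battery settles at a small nonzero float
# current from self-discharge, which this toy model leaves out.)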

> This begs the question, what then is the point in regulating the charge
> voltage to 13.3V (or 14.2V at freezing temperatures)? Wouldn't a
> charger regulated at say 12.9V do just as well at keeping a full charge?


If I recall correctly: the reason for using a slightly higher voltage
has to do with the way that electrical charge is distributed in a
battery. My recollection is that the charge consists (conceptually)
of two parts... a fairly evenly-distributed charge in the plates, and
a "surface charge" on the surfaces of the plates / crystals which is
present during charging.

The distributed charge is what gives you the 12.7 volts... it's the
"steady state" charge within the battery. When you start driving more
current into the battery, the "surface charge" appears (on the
surfaces of the lead sulphate plates and crystals) as the
electrochemical reactions begin to occur. If you stop driving current
in, the surface charge decays away over a period of a few minutes or
hours (or, quite rapidly if you start drawing current from the
battery) and the battery terminal voltage drops back to 12.7 (or
whatever its steady state voltage is).

The surface charge creates an additional voltage, which the charger
must overcome in order to force current into the battery. If you try
to use a 12.9-volt charging circuit, you won't get very much
additional power pushed into the battery before the surface charge
rises to 0.2 volts, the battery terminal voltage rises to 12.9 volts,
and the battery stops charging. If the battery has been somewhat
depleted (say, it's down to 12.3 volts), the surface charge will
still jump up fairly quickly and cut down the charging rate, and it'll
take a long time to "top up" the battery to full charge.
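
Put another way, the headroom above the battery's rest voltage is what the
surface charge eats into. A tiny worked comparison -- the half-ohm
"effective resistance" lumping ohmic drop and surface-charge back-pressure
is just an assumption to make the point:

# Compare charging headroom at two charger setpoints.
V_REST = 12.7      # volts, resting terminal voltage of a charged battery
R_EFF  = 0.5       # ohms, assumed effective resistance (illustrative)

for v_charger in (12.9, 13.7):
    headroom = v_charger - V_REST
    i_approx = headroom / R_EFF
    print("%.1f V charger: %.1f V headroom, roughly %.0f mA charge current"
          % (v_charger, headroom, i_approx * 1000))

# 12.9 V leaves only 0.2 V to push against the surface charge (and that
# shrinks further as the surface charge builds), while 13.7 V leaves a
# full volt. In practice the charger's own current limit caps the larger
# figure, but the headroom difference is what sets the recharge time.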

The 13.7-volt setting is, to some extent, a compromise. It's high
enough to allow a battery to be trickle-charged up to full in a
reasonable amount of time (it's high enough to overcome quite a bit of
surface-charge backpressure), but it's not high enough to cause a
fully-charged battery to begin electrolyzing the water out of its cells.

> This comes full circle on my original thread postulation. There is NO
> point in regulating the voltage, just connect a properly sized wall wart
> and you're done. The proof is right here.


The battery makers say you're in error - or, at least,
oversimplifying, and taking risks with your battery. Lots of people's
experience says likewise. Go ahead if you wish.

In certain very specific special cases, what you propose _may_ be
safe. These would be the cases where the wall wart's maximum output
current does not exceed the sum of [1] the static load on the battery,
and [2] the amount of self-discharge current and loss-by-heating which
would limit the battery's terminal voltage to no higher than about
13.7 volts. Because the self-discharge rate and the battery's cell
voltages are both somewhat temperature-sensitive, I think you'd find
that no single wall wart would produce optimum results with a single
battery under all circumstances.
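
If you want to express that condition as a quick check, it's just an
inequality. Every number below is a placeholder you'd have to measure for
your own battery and wall wart:

# Toy check of the "safe unregulated wall wart" condition above: the
# wart's maximum current must not exceed what the static load plus the
# battery's own self-discharge/losses can absorb at about 13.7 V.
I_WART_MAX    = 0.300   # amps, wall wart's maximum output (placeholder)
I_STATIC_LOAD = 0.150   # amps, load permanently on the battery (placeholder)
I_SELF_DISCH  = 0.020   # amps, self-discharge + losses near 13.7 V (placeholder)

if I_WART_MAX <= I_STATIC_LOAD + I_SELF_DISCH:
    print("Excess current can be absorbed; float voltage should hold near 13.7 V")
else:
    print("Surplus current will push the float voltage higher; expect gassing")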

In the more general case, one of two things is very likely to be true:

- The wall wart is smaller than ideal, and isn't capable of
delivering enough current to pull the battery up to 13.7 volts in
"steady state" operation. The battery will probably charge, but
more slowly than would otherwise be the case.

- The wall wart is larger than ideal, and it pulls the battery up to
well above the optimal float voltage. The battery begins gassing,
and its life is shortened.

That's why a properly-regulated float-charging circuit is very
desirable. It allows for a rapid recharge if the battery is run down
(because you can use a nice, hefty DC supply) but ensures a stable
floating voltage once the battery reaches steady state. And, a single
such circuit can be used with a wide range of battery capacities - you
don't need to carefully hand-select a wall wart to match each specific
battery.

--
Dave Platt AE6EO
Hosting the Jade Warrior home page: http://www.radagast.org/jade-warrior
I do _not_ wish to receive unsolicited commercial email, and I will
boycott any company which has the gall to send me such ads!