#11 - November 27th 03, 08:03 PM
Dave Platt

The battery measured 12.7V both with and without the charger connected.
So the charger (putting out 13.7V and 500mA) doesn't have enough juice,
er current, to change this. So right now 500mA is being converted to
heat.


I think there may be another interpretation possible.

What does your charger design look like? When it's in a "no load"
situation, and when you're measuring 13.7 volts, is there actually
enough load on the regulator IC's output to ensure proper regulation?
If I recall correctly, an LM317 requires a minimum of 10-20 mA of
load on its output to regulate correctly - without this, the output
voltage creeps up above what you'd expect. It's possible that under
light-to-moderate load (say, 100 mA) your regulator's output voltage
is dropping well below 13.7 and might need to be adjusted.

If you haven't already done this, try the following: stick a
reasonable resistive load on the charger (maybe 30 ohms 5 watts) so
that you're actually drawing an appreciable fraction of the charger's
normal output, and then readjust to 13.7. Also, use an ammeter to
make sure that the regulator is actually working correctly and is
truly delivering the amount of current you expect.
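For reference, here's the arithmetic behind that dummy-load suggestion - plain Ohm's-law estimates, assuming an ideal resistor and the full 13.7 V across it:

```python
# Sanity check on the suggested resistive test load.
V_OUT = 13.7   # volts, target regulator output
R_LOAD = 30.0  # ohms, suggested test resistor

current = V_OUT / R_LOAD       # Ohm's law: I = V / R
power = V_OUT ** 2 / R_LOAD    # resistor dissipation: P = V^2 / R

print(f"Load current: {current * 1000:.0f} mA")    # 457 mA
print(f"Resistor dissipation: {power:.1f} W")      # 6.3 W
```

Note that at 13.7 V a 30-ohm resistor dissipates about 6.3 W, a bit over a 5-watt rating; a 10 W part (or a brief test) would give comfortable margin.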

Oh... did you heatsink the regulator? The regulator might be limiting
the current flow (by dropping the output voltage) in order to protect
itself.
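A quick estimate of how much heat the regulator has to shed. The ~18 V input is an assumption (it's the wall wart's open-circuit figure from the thread, and will sag under load, so this is a worst-case sketch):

```python
# Heat dissipated in a linear regulator's pass element:
# everything dropped between input and output, times the load current.
V_IN = 18.0    # volts into the regulator (assumed, open-circuit figure)
V_OUT = 13.7   # volts out
I_LOAD = 0.5   # amps, full rated output

p_regulator = (V_IN - V_OUT) * I_LOAD   # P = (Vin - Vout) * I
print(f"Regulator dissipation at full load: {p_regulator:.2f} W")  # 2.15 W
```

A couple of watts can be enough to make a bare TO-220 package run hot and go into thermal limiting, which is why the heatsink question matters.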

I don't think that 500 mA is being converted to heat. I think it's
actively charging the battery, which is probably at least somewhat
"run down". The time you'd see the power being dissipated as heat,
would be when the charger's output had risen up to 13.7 and the
battery was truly being "floated".

I suspect that you've looked at the situation shortly after connecting
the charger to the battery, while the charger was actively charging
the battery to overcome the previous amount of discharge. If you were
to leave the charger connected for a few hours or days, I believe
you'd see that the battery terminal voltage had risen to 13.7 volts,
and that the charger was delivering rather less than its maximum
amount of current. This would be the "battery is fully charged, and
is now being floated" state.

As an example: I have a 45-amp-hour glassmat battery, hooked to a
well-regulated charger (13.5 volts) which is powered from a 450 mA
solar panel. If I hook up the battery after a period of moderate use,
what I see is:

- Before hookup, the battery voltage is somewhere down around 12.3
volts.

- Upon hookup, the charger begins drawing maximum current from the
solar panel. The battery voltage jumps up to around 12.6 volts.
The charger turns on its "I am not limiting the voltage, as the load
is drawing more than my input can supply" light. [If I use a 3-amp
bench supply in place of the solar panel, the battery draws the
full 3 amps at least briefly.]

- Gradually, over a period of an hour or more, the battery voltage
rises upwards, and the current being drawn from the panel slowly
decreases.

- After a few hours, the battery voltage rises to 13.5. The charger
switches into "voltage regulation" mode.

- The current continues to drop off, flattening out to a few tens of
mA after a while and remaining there.
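The staged behavior above can be reproduced with a toy model: battery as a big capacitor behind an internal resistance (plus a small constant leak for self-discharge and standing loads), charger as a current-limited constant-voltage source. All component values below are assumptions tuned to give hour-scale behavior, not measured battery parameters:

```python
# Toy float-charge simulation: constant current, a knee at the
# regulation voltage, then a taper to a few tens of mA.
V_REG = 13.5      # charger regulation voltage (volts)
I_MAX = 0.45      # charger current limit (amps, the panel's maximum)
R_INT = 0.3       # assumed internal resistance (ohms)
C_EQ = 6000.0     # assumed equivalent capacitance (farads)
I_LEAK = 0.03     # assumed self-discharge / standing loss (amps)

v = 12.3          # open-circuit voltage at hookup
dt = 60.0         # seconds per simulation step
for step in range(20 * 60):                           # simulate 20 hours
    i = min(I_MAX, max(0.0, (V_REG - v) / R_INT))     # CC- or CV-limited
    v += (i - I_LEAK) * dt / C_EQ                     # dV = I*dt / C
    if step % 240 == 0:                               # report every 4 hours
        print(f"t={step*dt/3600:4.1f} h  V={v:5.2f}  I={i*1000:3.0f} mA")
```

The printout shows the same three phases described above: a constant-current climb for the first few hours, the knee at the regulation voltage, then the current flattening out to a few tens of mA.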

I believe that if you monitor your charger and battery for a period of
time, you will see a very similar pattern of behavior.

This raises the question: what then is the point in regulating the charge
voltage to 13.3V (or 14.2V at freezing temperatures)? Wouldn't a
charger regulated at say 12.9V do just as well at keeping a full charge?


If I recall correctly: the reason for using a slightly higher voltage
has to do with the way that electrical charge is distributed in a
battery. My recollection is that the charge consists (conceptually)
of two parts... a fairly evenly-distributed charge in the plates, and
a "surface charge" on the surfaces of the plates / crystals which is
present during charging.

The distributed charge is what gives you the 12.7 volts... it's the
"steady state" charge within the battery. When you start driving more
current into the battery, the "surface charge" appears (on the
surfaces of the lead sulphate plates and crystals) as the
electrochemical reactions begin to occur. If you stop driving current
in, the surface charge decays away over a period of a few minutes or
hours (or, quite rapidly if you start drawing current from the
battery) and the battery terminal voltage drops back to 12.7 (or
whatever its steady state voltage is).

The surface charge creates an additional voltage, which the charger
must overcome in order to force current into the battery. If you try
to use a 12.9-volt charging circuit, you won't get very much
additional power pushed into the battery before the surface charge
rises to 0.2 volts, the battery terminal voltage rises to 12.9 volts,
and the battery stops charging. If the battery had been somewhat
depleted (say, it was down to 12.3 volts), the surface charge will
still jump up fairly quickly and cut down the charging rate, and it'll
take a long time to "top up" the battery to full charge.
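The arithmetic of that stall is simple enough to spell out, using the rest voltage and setpoints given in the text:

```python
# Charging stops once the surface charge lifts the terminal voltage
# up to the charger's setpoint; the headroom above the rest voltage
# is all the surface charge the charger can push through.
V_REST_FULL = 12.7   # steady-state voltage of a charged battery

for v_charger in (12.9, 13.7):
    v_surface_max = v_charger - V_REST_FULL   # surface charge that stalls it
    print(f"{v_charger} V charger stalls once surface charge "
          f"reaches {v_surface_max:.1f} V")
```

A 12.9-volt charger is defeated by a mere 0.2 V of surface charge, while the 13.7-volt setting has a full volt of backpressure headroom to work against.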

The 13.7-volt setting is, to some extent, a compromise. It's high
enough to allow a battery to be trickle-charged up to full in a
reasonable amount of time (it's high enough to overcome quite a bit of
surface-charge backpressure), but it's not high enough to cause a
fully-charged battery to begin electrolyzing the water out of its cells.

This comes full circle on my original thread postulation. There is NO
point in regulating the voltage, just connect a properly sized wall wart
and you're done. The proof is right here.


The battery makers say you're in error - or, at least,
oversimplifying, and taking risks with your battery. Lots of people's
experience says likewise. Go ahead if you wish.

In certain very specific special cases, what you propose _may_ be
safe. These would be the cases where the wall wart's maximum output
current does not exceed the sum of [1] the static load on the battery,
and [2] the amount of self-discharge current and loss-by-heating which
would limit the battery's terminal voltage to no higher than about
13.7 volts. Because the self-discharge, and battery cell voltages are
somewhat temperature-sensitive, I think you'd find that no single
wall-wart would produce optimum results with a single battery under
all circumstances.

In the more general case, one of two things is very likely to be true:

- The wall wart is smaller than ideal, and isn't capable of
delivering enough current to pull the battery up to 13.7 volts in
"steady state" operation. The battery will probably charge, but
more slowly than would otherwise be the case.

- The wall wart is larger than ideal, and it pulls the battery up to
well above the optimal float voltage. The battery begins gassing,
and its life is shortened.

That's why a properly-regulated float-charging circuit is very
desirable. It allows for a rapid recharge if the battery is run down
(because you can use a nice, hefty DC supply) but ensures a stable
floating voltage once the battery reaches steady state. And, a single
such circuit can be used with a wide range of battery capacities - you
don't need to carefully hand-select a wall wart to match each specific
battery.

--
Dave Platt AE6EO
Hosting the Jade Warrior home page: http://www.radagast.org/jade-warrior
I do _not_ wish to receive unsolicited commercial email, and I will
boycott any company which has the gall to send me such ads!
#12 - November 27th 03, 11:59 PM
Bob Lewis (AA4PB)

You have a few potential problems with using an unregulated
charger to maintain or charge a lead-acid battery. The first problem
is that if the battery attempts to draw more current than the
unregulated supply can handle, you may overheat the transformer or
other components in the supply. The second problem is that if, on the
other hand, the supply can deliver more current than the maximum bulk
charge rating of the battery, then you may overheat and damage the
battery if it is connected to the supply when it is not fully charged.
The third problem results if the supply can output a voltage higher
than 13.8 volts and can also supply the necessary charging current.
The battery voltage will eventually climb to the supply voltage
(above 13.8 volts), continue to draw charging current, and boil the
water out of the cells, damaging the battery.

To be safe, you really need to regulate the voltage at 13.8V maximum
AND limit the current to protect the battery and the charger. If you
also want to get a fast charge on a discharged battery then you need a
multi-state charger that will apply a higher voltage (about 14.5 V) at
a limited current until the battery is almost fully charged and then
switch to 13.8 V to top it off and maintain a "float" charge.

I expect your battery did not reach the no-load supply voltage because
the supply was not capable of producing that voltage at the trickle
current needed by the battery.
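Bob's bulk-then-float sequence can be sketched as a tiny control function. The 14.5 V and 13.8 V setpoints are the ones from the post; the taper cutoff (switch to float once the charge current falls below roughly C/50) and the 45 Ah capacity are illustrative assumptions:

```python
# Minimal two-stage charger logic: hold the absorption voltage at a
# current limit until the battery is nearly full, then drop to float.
V_ABSORB = 14.5   # volts, bulk/absorption target (from the post)
V_FLOAT = 13.8    # volts, float/maintenance target (from the post)
CAP_AH = 45.0     # assumed battery capacity, for the taper cutoff

def charge_setpoint(v_batt: float, i_batt: float, floating: bool):
    """Return (voltage setpoint, new floating flag) for one control tick."""
    if not floating and v_batt >= V_ABSORB and i_batt < CAP_AH / 50:
        floating = True              # current has tapered: battery nearly full
    return (V_FLOAT if floating else V_ABSORB), floating

# Deep in bulk charge, then tapered off near full:
print(charge_setpoint(12.5, 2.0, False))   # -> (14.5, False)
print(charge_setpoint(14.5, 0.5, False))   # -> (13.8, True)
```

The `floating` flag is latched on purpose - once the charger drops to float it should stay there, rather than bouncing back to 14.5 V the moment the terminal voltage sags.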


#14 - November 28th 03, 05:32 AM
Andrew VK3BFA

"Bruce W...1" wrote in message ...
Not long ago and in another thread many of you gave me great advice on
how to make a car battery float charger. I wanted to just connect a
properly sized wall wart, but everyone recommended voltage regulation.
So I connected a voltage regulator (13.6V) to a 500mA wall wart. The
wall wart has an open-circuit voltage of 18V and is rated 500mA at 12 V.

Further background, I built this charger to prevent my having to start a
friends car once a week while they're on extended vacation.

Now two weeks later I check the battery. Its voltage is 12.7V. The
charger circuit measures 13.7V. And I measured the drain, from the
alarm and radio, it is 10mA.

The ambient temperature on average is about 40F.

What went wrong? Why is the battery only 12.7V instead of 13.7?

Lacking a better solution from you guys it seems we need more power,
ugh, ugh. 2A ought to do it.

Specs say that car batteries (at room temperature) are best regulated
at 13.3V. For 32 degrees F 14.2V is better.

Yet the failure analysis remains incomplete. Where did we go wrong?

Thanks for your help.



**** a Duck , Bruce,
this is becoming increasingly metaphysical - the mental effort you
(and everyone else) are putting into debating a car battery is
ludicrous. Let me make a few points.

1. If the battery is more than 4 years old it's probably stuffed or
close to it. Sad but true.

2. Go and buy a hydrometer (they are about $3 - people used them
before digital multimeters were invented) - have a look at the SG in
the cells. If it's green, it's OK. Check all cells; if 1 or 2 have a
very different SG then it's stuffed.

3. Do a load test on the thing: turn on all the lights and see how
much the voltage drops. Leave them on for half an hour; if it drops
much below 12 V then it's stuffed.

How much does a new battery cost anyway?.........

de VK3BFA Andrew
#16 - November 28th 03, 04:31 PM
Bruce W...1

wrote:

I don't know how you measured things - so I can't say for
sure - but you may not have a failure.

1) You need to measure the float charge voltage while the
charger is charging the battery. Don't know if you did that,
but 13.7 is good if you did.

2) The battery needs to be fully charged before connecting
the float charger. Don't know if it was. If the battery is
discharged and you connect your float charger and measure it,
you will see a voltage below 13.7. A discharged battery can
draw enough current to drop the output voltage of the wall
wart down below the 13.7 regulation voltage.

3) A battery removed from the float charge will show a lower
voltage than the float voltage. That is normal. So it is
possible that your charger is working properly and the battery
is being held at full charge.

=============================================================

The battery was fully charged when the float charging was started. The
battery is almost new. The float voltage measured 12.7V with the
charger connected. And the regulator is heat-sinked.

Someone outside of this thread who is more knowledgeable in this matter
than I told me the following.

A float voltage of 13.3V is required to maintain a fully charged state
(at room temperature). At lower voltages the battery loses charge,
regardless of the output of the charger. So if the charger doesn't have
enough current to keep it at 13.3V, as is the case here, then charge
will be lost. If this is true then I should see a lower float voltage
in the near future.

It's also become clear that regulating the voltage of an under-sized
charger is pointless, because the battery never reaches a high voltage
anyway.

Bob's point about overloading the charger is certainly valid. But right
now it's only pulling a tiny current because the voltage differential is
so small.

One conclusion can be drawn from all of this. The charger I built is
inadequate for long-term care. And the wall wart chargers that are sold
for float charging are not suitable for long-term charging if they can't
keep the battery at 13.3V. I'm guessing you need at least 2 Amps to do
this. However an under-sized wall wart can certainly reduce the rate of
discharge by compensating for external loads.

So what my home-brew charger is doing is just compensating for external
loads and not adding to the battery charge in any way.

A lead-acid battery is not damaged until it falls below 12.0V. How long
does it take a healthy battery to self-discharge to 12.0V? This might
take a year. I don't have a feel for this at lower temperatures.
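A rough back-of-the-envelope on that question. The 10 mA drain is the figure measured in the original post; the 45 Ah capacity, the ~4 %/month self-discharge rate, and treating 12.0 V as roughly half charge are all assumptions for illustration:

```python
# Estimate how long an idle battery takes to sag to ~12.0 V.
CAP_AH = 45.0              # assumed battery capacity (amp-hours)
SELF_DISCHARGE = 0.04      # assumed fraction of capacity lost per month
DRAIN_A = 0.010            # measured alarm/radio drain (amps)
AH_TO_12V = 0.5 * CAP_AH   # 12.0 V taken as roughly 50% state of charge

ah_per_month = CAP_AH * SELF_DISCHARGE + DRAIN_A * 24 * 30
months = AH_TO_12V / ah_per_month
print(f"~{months:.1f} months to reach ~12.0 V")   # -> ~2.5 months
```

With the 10 mA drain included, the estimate comes out far shorter than a year - the parasitic load, not self-discharge, dominates - which is consistent with the point that even an undersized charger earns its keep by offsetting external loads.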

My charger will probably get the battery through the winter, and certainly
if I start the car every six weeks or so. So I think I'll just leave it
at that. Thanks all for your help.

On another battery front, the gel cell in my computer UPS died of old
age. Rather than replacing the battery I reconnected the UPS to a 32Ah
gel cell which I keep around for emergency preparedness. This kills two
birds with one stone: it keeps the big battery charged and also gives
the UPS a whole lot of capacity. Now that I think about it, an old UPS
might make a dynamite car battery float charger.
#18 - November 29th 03, 04:17 PM
Bruce W...1

Dave Platt wrote:


You may have something here. It would sure explain a lot. I need to do
some more testing. Thanks.
  #19   Report Post  
Old November 29th 03, 04:17 PM
Bruce W...1
 
Posts: n/a
Default

Dave Platt wrote:

The battery measured 12.7V both with and without the charger connected.
So the charger (putting out 13.7V and 500mA) doesn't have enough juice,
er current, to change this. So right now 500mA is being converted to
heat.


I think there may be another interpretation possible.

What does your charger design look like? When it's in a "no load"
situation, and when you're measuring 13.7 volts, is there actually
enough load on the regulator IC's output to ensure proper regulation?
If I recall correctly, and LM317 requires a minimum of 10-20 mA of
load on its output to regulate correctly - without this, the output
voltage creeps up above what you'd expect. It's possible that under
light-to-moderate load (say, 100 mA) your regulator's output voltage
is dropping well below 13.7 and might need to be adjusted.

If you haven't already done this, try the following: stick a
reasonable resistive load on the charger (maybe 30 ohms 5 watts) so
that you're actually drawing an appreciable fraction of the charger's
normal output, and then readjust to 13.7. Also, use an ammeter to
make sure that the regulator is actually working correctly and is
truly delivering the amount of current you expect.

Oh... did you heatsink the regulator? The regulator might be limiting
the current flow (by dropping the output voltage) in order to protect
itself.

I don't think that 500 mA is being converted to heat. I think it's
actively charging the battery, which is probably at least somewhat
"run down". The time you'd see the power being dissipated as heat,
would be when the charger's output had risen up to 13.7 and the
battery was truly being "floated".

I suspect that you've looked at the situation shortly after connecting
the charger to the battery, while the charger was actively charging
the battery to overcome the previous amount of discharge. If you were
to leave the charger connected for a few hours or days, I believe
you'd see that the battery terminal voltage had risen to 13.7 volts,
and that the charger was delivering rather less than its maximum
amount of current. This would be the "battery is fully charged, and
is now being floated" state.

As an example: I have a 45-amp-hour glassmat battery, hooked to a
well-regulated charger (13.5 volts) which is powered from a 450 mA
solar panel. If I hook up the battery after a period of moderate use,
what I see is:

- Before hookup, the battery voltage is somewhere down around 12.3
volts.

- Upon hookup, the charger begins drawing maximum current from the
solar panel. The battery voltage jumps up to around 12.6 volts.
The charger turns on its "I am not limiting the voltage, as the load
is drawing more than my input can supply" light. [If I use a 3-amp
bench supply in place of the solar panel, the battery draws the
full 3 amps at least briefly.]

- Gradually, over a period of an hour or more, the battery voltage
rises upwards, and the current being drawn from the panel slowly
decreases.

- After a few hours, the battery voltage rises to 13.5. The charger
switches into "voltage regulation" mode.

- The current continues to drop off, flattening out to a few tens of
mA after a while and remaining there.

I believe that if you monitor your charger and battery for a period of
time, you will see a very similar pattern of behavior.

This raises the question: what then is the point in regulating the charge
voltage to 13.3V (or 14.2V at freezing temperatures)? Wouldn't a
charger regulated at, say, 12.9V do just as well at keeping a full charge?


If I recall correctly: the reason for using a slightly higher voltage
has to do with the way that electrical charge is distributed in a
battery. My recollection is that the charge consists (conceptually)
of two parts... a fairly evenly-distributed charge in the plates, and
a "surface charge" on the surfaces of the plates / crystals which is
present during charging.

The distributed charge is what gives you the 12.7 volts... it's the
"steady state" charge within the battery. When you start driving more
current into the battery, the "surface charge" appears (on the
surfaces of the plates and the lead-sulphate crystals) as the
electrochemical reactions begin to occur. If you stop driving current
in, the surface charge decays away over a period of a few minutes or
hours (or, quite rapidly if you start drawing current from the
battery) and the battery terminal voltage drops back to 12.7 (or
whatever its steady state voltage is).

The surface charge creates an additional voltage, which the charger
must overcome in order to force current into the battery. If you try
to use a 12.9-volt charging circuit, you won't get very much
additional power pushed into the battery before the surface charge
rises to 0.2 volts, the battery terminal voltage rises to 12.9 volts,
and the battery stops charging. If the battery had been somewhat
depleted (say, it was down to 12.3 volts), the surface charge will
still jump up fairly quickly and cut down the charging rate, and it'll
take a long time to "top up" the battery to full charge.

The 13.7-volt setting is, to some extent, a compromise. It's high
enough to allow a battery to be trickle-charged up to full in a
reasonable amount of time (it's high enough to overcome quite a bit of
surface-charge backpressure), but it's not high enough to cause a
fully-charged battery to begin electrolyzing the water out of its cells.

This comes full circle on my original thread postulation. There is NO
point in regulating the voltage, just connect a properly sized wall wart
and you're done. The proof is right here.


The battery makers say you're in error - or, at least,
oversimplifying, and taking risks with your battery. Lots of
people's experience says likewise. Go ahead if you wish.

In certain very specific special cases, what you propose _may_ be
safe. These would be the cases where the wall wart's maximum output
current does not exceed the sum of [1] the static load on the battery,
and [2] the amount of self-discharge current and loss-by-heating which
would limit the battery's terminal voltage to no higher than about
13.7 volts. Because the self-discharge rate and battery cell voltages
are somewhat temperature-sensitive, I think you'd find that no single
wall wart would produce optimum results with a single battery under
all circumstances.
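To put rough numbers on that special case: suppose the battery carries a small always-on load of 30 mA and absorbs about 20 mA of float current when full (both figures are assumed round numbers for illustration, not measurements). Then the largest unregulated wart that could be "safe" is tiny compared to a typical one:

```python
# Rough sizing check for the "unregulated wall wart" special case.
# Both load figures are assumed round numbers for illustration.

static_load_ma    = 30   # always-on accessory drain, mA (assumed)
self_discharge_ma = 20   # float current a full battery absorbs, mA (assumed)

# The wart's maximum output must not exceed what the system can absorb
# at ~13.7 V, or the battery voltage will climb past the safe float point.
safe_wart_ma = static_load_ma + self_discharge_ma
print(f"max safe unregulated wart current: ~{safe_wart_ma} mA")
print(f"a typical 500 mA wart exceeds that by {500 / safe_wart_ma:.0f}x")
```

Under these assumptions even a modest 500 mA wart is an order of magnitude too big to float the battery unregulated.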

In the more general case, one of two things is very likely to be true:

- The wall wart is smaller than ideal, and isn't capable of
delivering enough current to pull the battery up to 13.7 volts in
"steady state" operation. The battery will probably charge, but
more slowly than would otherwise be the case.

- The wall wart is larger than ideal, and it pulls the battery up to
well above the optimal float voltage. The battery begins gassing,
and its life is shortened.

That's why a properly-regulated float-charging circuit is very
desirable. It allows for a rapid recharge if the battery is run down
(because you can use a nice, hefty DC supply) but ensures a stable
floating voltage once the battery reaches steady state. And, a single
such circuit can be used with a wide range of battery capacities - you
don't need to carefully hand-select a wall wart to match each specific
battery.

--
Dave Platt AE6EO
Hosting the Jade Warrior home page: http://www.radagast.org/jade-warrior
I do _not_ wish to receive unsolicited commercial email, and I will
boycott any company which has the gall to send me such ads!


=================================================================

You may have something here. It would sure explain a lot. I need to do
some more testing. Thanks.
  #20   Report Post  
Old December 1st 03, 04:38 AM
 
Posts: n/a
Default



"Bruce W...1" wrote:

wrote:

I don't know how you measured things - so I can't say for
sure - but you may not have a failure.

1) You need to measure the float charge voltage while the
charger is charging the battery. Don't know if you did that,
but 13.7 is good if you did.

2) The battery needs to be fully charged before connecting
the float charger. Don't know if it was. If the battery is
discharged and you connect your float charger and measure it,
you will see a voltage below 13.7. A discharged battery can
draw enough current to drop the output voltage of the wall
wart down below the 13.7 regulation voltage.

3) A battery removed from the float charge will show a lower
voltage than the float voltage. That is normal. So it is
possible that your charger is working properly and the battery
is being held at full charge.

=============================================================

The battery was fully charged when the float charging was started. The
battery is almost new. The float voltage measured 12.7V with the
charger connected. And the regulator is heat-sinked.

Someone outside of this thread who is more knowledgeable in this matter
than I told me the following.

A float voltage of 13.3V is required to maintain a fully charged state
(at room temperature). At lower voltages the battery loses charge,
regardless of the output of the charger. So if the charger doesn't have
enough current to keep it at 13.3V, as is the case here, then charge
will be lost. If this is true then I should see a lower float voltage
in the near future.

It's also become clear that regulating the voltage of an under-sized
charger is pointless, because the battery never reaches a high voltage
anyway.

Bob's point about overloading the charger is certainly valid. But right
now it's only pulling a tiny current because the voltage differential is
so small.

One conclusion can be drawn from all of this. The charger I built is
inadequate for long-term care. And the wall wart chargers that are sold
for float charging are not suitable for long-term charging if they can't
keep the battery at 13.3V. I'm guessing you need at least 2 Amps to do
this. However an under-sized wall wart can certainly reduce the rate of
discharge by compensating for external loads.

So what my home-brew charger is doing is just compensating for external
loads and not adding to the battery charge in any way.

A lead-acid battery is not damaged until it falls below 12.0V. How long
does it take a healthy battery to self-discharge to 12.0V? This might
take a year. I don't have a feel for this at lower temperatures.
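A back-of-envelope estimate supports the "year or more" guess. Using two common rules of thumb that are not from this thread (roughly 4% of remaining charge lost per month at room temperature, and ~12.0 V open-circuit corresponding to very roughly 25% state of charge), the math looks like this:

```python
# Back-of-envelope: months for a healthy lead-acid battery to
# self-discharge to ~12.0 V.  Both constants are generic rules of
# thumb (assumptions), not figures from this thread, and both get
# worse at higher temperatures.

RATE_PER_MONTH = 0.04   # fraction of remaining charge lost per month
TARGET_SOC = 0.25       # ~12.0 V open-circuit, by rough rule of thumb

soc, months = 1.0, 0
while soc > TARGET_SOC:
    soc *= (1.0 - RATE_PER_MONTH)   # exponential decay of remaining charge
    months += 1

print(f"~{months} months to self-discharge to ~12.0 V")
```

That comes out well over a year for a healthy battery with no parasitic loads; any always-on drain in the car will shorten it considerably, which is where a small float charger earns its keep.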

My charger will probably get the battery thru the winter, and certainly
if I start the car every six weeks or so. So I think I'll just leave it
at that. Thanks all for your help.

On another battery front, the gel cell in my computer UPS died of old
age. Rather than replacing the battery I reconnected the UPS to a 32Ah
gel cell which I keep around for emergency preparedness. This kills two
birds with one stone, it keeps the big battery charged and also gives
the UPS a whole lot of capacity. Now that I think about it, an old UPS
might make a dynamite car battery float charger.



Some points:
1) The input to the regulator must be about 2 volts above the
regulated voltage level. So, if your regulator is set for
13.7, the DC input to the regulator must be about 15.7.
Under no load, what is the DC voltage at the input to the
regulator? What is it under full load?

2) A wall wart's output voltage will sag under load - the
heavier the load, the greater the sag. The ones that don't
sag have the regulator built in. How much current is being
drawn from the regulator when it is connected to the battery?
What is the wall wart voltage sag, no load to full load?

3) If as you mentioned the voltage differential is so small
that very little current is being drawn from the charger, then
there should not be a differential of 13.7 to 12.7, no load to
full load as measured at the charger terminals that clip
on to the battery. The problem is that the phrase "very little
current" is undefined. We need the actual numbers.

Bottom line - it sounds like your wall wart may be too wimpy
for this application. Also, it would be a good idea to
post the details of the circuit. For example, do you have
a diode in the output between the regulator and the battery?
If not, how do you protect the LM317, and how do you prevent
the battery from discharging through the charger?
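If a series diode is used for that protection, its forward drop has to be added to the regulator's set point. A quick sketch using the standard LM317 set-point formula, Vout = 1.25 * (1 + R2/R1) (ignoring the small ADJ-pin current); the resistor values are illustrative:

```python
# Compensating an LM317 set point for a series protection diode.
# Vout = 1.25 * (1 + R2/R1), per the standard LM317 formula (the
# small ADJ-pin current term is neglected).  R1 is the classic
# datasheet value; the diode drop is the usual silicon approximation.

V_REF = 1.25      # LM317 reference voltage, volts
V_DIODE = 0.7     # silicon diode forward drop, volts (approximate)
V_FLOAT = 13.7    # desired voltage at the battery terminals

v_set = V_FLOAT + V_DIODE            # regulator must be set this much higher
R1 = 240.0                           # ohms
R2 = R1 * (v_set / V_REF - 1.0)      # solve the set-point formula for R2

print(f"regulator set point: {v_set:.1f} V")
print(f"R2 of about {R2:.0f} ohms for R1 = {R1:.0f} ohms")
```

In practice you'd measure the actual diode drop at the charging current and trim the divider (or use a pot for R2), since 0.7 V is only an approximation.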

In your reply you mentioned:
It's also become clear that regulating the voltage of an under-sized
charger is pointless, because the battery never reaches a high voltage
anyway.


It's not the regulator that's pointless, it's using an
under-sized charger in the first place, and expecting it
to keep the battery at ~13.7. When you expect a charger
to keep a battery at ~13.7, regulation is required. When
you don't care what the battery voltage is, no regulation
is required. Of course, that's no charger at all - it could
allow the voltage to go anywhere, and it disagrees with what
battery manufacturers recommend - voltage regulation for
trickle charge.