#1
November 26th 03, 07:03 AM
 Bruce W...1 Posts: n/a
Follow-up: Car battery charger

Not long ago and in another thread many of you gave me great advice on
how to make a car battery float charger. I wanted to just connect a
properly sized wall wart, but everyone recommended voltage regulation.
So I connected a voltage regulator (13.6V) to a 500mA wall wart. The
wall wart has an open-circuit voltage of 18V and is rated 500mA at 12 V.

For background: I built this charger so I wouldn't have to start a
friend's car once a week while they're on extended vacation.

Now, two weeks later, I check the battery. Its voltage is 12.7V. The
charger circuit measures 13.7V. I also measured the drain from the
alarm and radio: it is 10mA.

The ambient temperature on average is about 40F.

What went wrong? Why is the battery only 12.7V instead of 13.7?

Lacking a better solution from you guys it seems we need more power,
ugh, ugh. 2A ought to do it.

Specs say that car batteries (at room temperature) are best regulated
at 13.3V; for 32 degrees F, 14.2V is better.
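Those two setpoints can be turned into a rough temperature-compensated target by straight-line interpolation. A sketch only: the linear model and the 70 F figure for "room temperature" are my assumptions, not battery-maker data.

```python
# Rough linear interpolation between the two float-voltage setpoints
# quoted above (13.3 V at room temperature, 14.2 V at 32 F).
# Assumptions: "room temperature" = 70 F, and the relationship is linear.

def float_setpoint(temp_f):
    """Return an interpolated float voltage (V) for a temperature in F."""
    t1, v1 = 32.0, 14.2   # freezing setpoint from the post
    t2, v2 = 70.0, 13.3   # assumed room-temperature setpoint
    slope = (v2 - v1) / (t2 - t1)   # volts per degree F (negative)
    return v1 + slope * (temp_f - t1)

print(round(float_setpoint(40), 2))  # the ~40 F average in the post -> 14.01
```

By this crude model, the 40 F garage described here would actually want a float voltage a bit above 14 V, not 13.7 V.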

Yet the failure analysis remains incomplete. Where did we go wrong?

#2
November 27th 03, 05:52 AM
 Rick Frazier Posts: n/a

Bruce:

Did you measure the voltage with the charger connected, or after you removed
it?
A standard 12V lead-acid automotive battery has a nominal voltage of 12.6
volts. Even after a float charge, once the charging current is removed, the
voltage you measured would be considered normal for a charged battery.

While the trickle charger is connected to the battery, any current in excess
of that needed to keep the battery fully charged is converted to heat. You
will only read the charging voltage when you measure with the charger
actually connected; the rest goes to keeping the battery at top charge.

Unless you are measuring the voltage with the charger connected, you
probably don't need to have one with more current.

--Rick

"Bruce W...1" wrote:

Not long ago and in another thread many of you gave me great advice on
how to make a car battery float charger. I wanted to just connect a
properly sized wall wart, but everyone recommended voltage regulation.
So I connected a voltage regulator (13.6V) to a 500mA wall wart. The
wall wart has an open-circuit voltage of 18V and is rated 500mA at 12 V.

For background: I built this charger so I wouldn't have to start a
friend's car once a week while they're on extended vacation.

Now, two weeks later, I check the battery. Its voltage is 12.7V. The
charger circuit measures 13.7V. I also measured the drain from the
alarm and radio: it is 10mA.

The ambient temperature on average is about 40F.

What went wrong? Why is the battery only 12.7V instead of 13.7?

Lacking a better solution from you guys it seems we need more power,
ugh, ugh. 2A ought to do it.

Specs say that car batteries (at room temperature) are best regulated
at 13.3V; for 32 degrees F, 14.2V is better.

Yet the failure analysis remains incomplete. Where did we go wrong?

#4
November 27th 03, 05:09 PM
 Bruce W...1 Posts: n/a

Rick Frazier wrote:

Bruce:

Did you measure the voltage with the charger connected, or after you removed
it?
A standard 12V lead-acid automotive battery has a nominal voltage of 12.6
volts. Even after a float charge, once the charging current is removed, the
voltage you measured would be considered normal for a charged battery.

While the trickle charger is connected to the battery, any current in excess
of that needed to keep the battery fully charged is converted to heat. You
will only read the charging voltage when you measure with the charger
actually connected; the rest goes to keeping the battery at top charge.

Unless you are measuring the voltage with the charger connected, you
probably don't need to have one with more current.

--Rick

=============================================================

The battery measured 12.7V both with and without the charger connected.
So the charger (putting out 13.7V and 500mA) doesn't have enough juice,
er current, to change this. So right now 500mA is being converted to
heat.

This begs the question, what then is the point in regulating the charge
voltage to 13.3V (or 14.2V at freezing temperatures)? Wouldn't a
charger regulated at say 12.9V do just as well at keeping a full charge?

This comes full circle on my original thread postulation. There is NO
point in regulating the voltage, just connect a properly sized wall wart
and you're done. The proof is right here.
#6
November 27th 03, 07:33 PM

On Wed, 26 Nov 2003 01:03:11 -0500 "Bruce W...1"
wrote:

Now two weeks later I check the battery. Its voltage is 12.7V. The
charger circuit measures 13.7V. And I measured the drain, from the
alarm and radio, it is 10mA.

That means that there is 100 Ohms between the PS and the battery. It's
likely that this is the resistance of the meter that you used to
measure the 10mA, and that the actual current without the current
meter was more.

But I still don't understand how you could read 13.7V at the PS and
12.7V at the battery unless there is a significant resistance between
the two. Note that this resistance could be in the ground leg, too.
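That 100-ohm figure is straight Ohm's law applied to the numbers quoted above; as a quick check (these are the post's numbers, nothing measured here):

```python
# Back-of-envelope check of the 100-ohm figure: 1 V dropped between
# the supply (13.7 V) and the battery (12.7 V), at the measured 10 mA,
# implies R = V / I = 100 ohms somewhere in the charging path.
v_supply = 13.7    # volts at the charger output
v_battery = 12.7   # volts at the battery terminals
i_measured = 0.010 # 10 mA, as measured in the post

r_series = (v_supply - v_battery) / i_measured
print(round(r_series, 1))  # -> 100.0 ohms
```

If the real current was higher than 10 mA (say, because the ammeter's burden resistance suppressed it), the implied series resistance drops proportionally.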

OTOH, holding the battery voltage at 12.7 will be just fine for long-term
storage. Higher voltages will keep it topped up at full charge,
but they also do some long-term damage and convert water to hydrogen
and oxygen via electrolysis.

You're really better off at the lower voltage, and 12.7V is just fine.

#8
November 27th 03, 08:10 PM
 [email protected] Posts: n/a

"Bruce W...1" wrote:

Not long ago and in another thread many of you gave me great advice on
how to make a car battery float charger. I wanted to just connect a
properly sized wall wart, but everyone recommended voltage regulation.
So I connected a voltage regulator (13.6V) to a 500mA wall wart. The
wall wart has an open-circuit voltage of 18V and is rated 500mA at 12 V.

For background: I built this charger so I wouldn't have to start a
friend's car once a week while they're on extended vacation.

Now, two weeks later, I check the battery. Its voltage is 12.7V. The
charger circuit measures 13.7V. I also measured the drain from the
alarm and radio: it is 10mA.

The ambient temperature on average is about 40F.

What went wrong? Why is the battery only 12.7V instead of 13.7?

Lacking a better solution from you guys it seems we need more power,
ugh, ugh. 2A ought to do it.

Specs say that car batteries (at room temperature) are best regulated
at 13.3V; for 32 degrees F, 14.2V is better.

Yet the failure analysis remains incomplete. Where did we go wrong?

I don't know how you measured things - so I can't say for
sure - but you may not have a failure.

1) You need to measure the float charge voltage while the
charger is charging the battery. Don't know if you did that,
but 13.7 is good if you did.

2) The battery needs to be fully charged before connecting
the float charger. Don't know if it was. If the battery is
discharged and you connect your float charger and measure it,
you will see a voltage below 13.7. A discharged battery can
draw enough current to drop the output voltage of the wall
wart down below the 13.7 regulation voltage.

3) A battery removed from the float charge will show a lower
voltage than the float voltage. That is normal. So it is
possible that your charger is working properly and the battery
is being held at full charge.
#10
November 27th 03, 09:03 PM
 Dave Platt Posts: n/a

The battery measured 12.7V both with and without the charger connected.
So the charger (putting out 13.7V and 500mA) doesn't have enough juice,
er current, to change this. So right now 500mA is being converted to
heat.

I think there may be another interpretation possible.

What does your charger design look like? When it's in a "no load"
situation, and when you're measuring 13.7 volts, is there actually
enough load on the regulator IC's output to ensure proper regulation?
If I recall correctly, an LM317 requires a minimum of 10-20 mA of
load on its output to regulate correctly - without this, the output
voltage creeps up above what you'd expect. It's possible that under
load the output is dropping well below 13.7 and might need to be adjusted.

If you haven't already done this, try the following: stick a
reasonable resistive load on the charger (maybe 30 ohms 5 watts) so
that you're actually drawing an appreciable fraction of the charger's
normal output, and then readjust to 13.7. Also, use an ammeter to
make sure that the regulator is actually working correctly and is
truly delivering the amount of current you expect.

Oh... did you heatsink the regulator? The regulator might be limiting
the current flow (by dropping the output voltage) in order to protect
itself.

I don't think that 500 mA is being converted to heat. I think it's
actively charging the battery, which is probably at least somewhat
"run down". The time you'd see the power being dissipated as heat,
would be when the charger's output had risen up to 13.7 and the
battery was truly being "floated".

I suspect that you've looked at the situation shortly after connecting
the charger to the battery, while the charger was actively charging
the battery to overcome the previous amount of discharge. If you were
to leave the charger connected for a few hours or days, I believe
you'd see that the battery terminal voltage had risen to 13.7 volts,
and that the charger was delivering rather less than its maximum
amount of current. This would be the "battery is fully charged, and
is now being floated" state.

As an example: I have a 45-amp-hour glassmat battery, hooked to a
well-regulated charger (13.5 volts) which is powered from a 450 mA
solar panel. If I hook up the battery after a period of moderate use,
what I see is:

- Before hookup, the battery voltage is somewhere down around 12.3
volts.

- Upon hookup, the charger begins drawing maximum current from the
solar panel. The battery voltage jumps up to around 12.6 volts.
The charger turns on its "I am not limiting the voltage, as the load
is drawing more than my input can supply" light. [If I use a 3-amp
bench supply in place of the solar panel, the battery draws the
full 3 amps at least briefly.]

- Gradually, over a period of an hour or more, the battery voltage
rises upwards, and the current being drawn from the panel slowly
decreases.

- After a few hours, the battery voltage rises to 13.5. The charger
switches into "voltage regulation" mode.

- The current continues to drop off, flattening out to a few tens of
mA after a while and remaining there.

I believe that if you monitor your charger and battery for a period of
time, you will see a very similar pattern of behavior.
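The charge-then-taper behavior described above can be mimicked with a crude first-order (exponential) model. A sketch only: the starting current, float current, and time constant below are assumptions picked to match the qualitative shape, not measurements from any real battery.

```python
import math

# Illustrative only: a first-order model of the float-charge tail.
# Assumed values: charger starts near its ~450 mA limit, settles to
# "a few tens of mA", with a 2-hour time constant.
i_start = 0.45     # amps, initial current at hookup
i_float = 0.03     # amps, steady-state float current
tau_hours = 2.0    # assumed time constant of the decay

def charge_current(t_hours):
    """Charging current decaying exponentially toward the float current."""
    return i_float + (i_start - i_float) * math.exp(-t_hours / tau_hours)

for t in (0, 1, 2, 4, 8):
    print(f"t={t}h  I={charge_current(t) * 1000:.0f} mA")
```

The point of the model is just the shape: a big initial gulp of current, a taper over hours, and a small residual float current once the terminal voltage reaches the regulation setpoint.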

This begs the question, what then is the point in regulating the charge
voltage to 13.3V (or 14.2V at freezing temperatures)? Wouldn't a
charger regulated at say 12.9V do just as well at keeping a full charge?

If I recall correctly: the reason for using a slightly higher voltage
has to do with the way that electrical charge is distributed in a
battery. My recollection is that the charge consists (conceptually)
of two parts... a fairly evenly-distributed charge in the plates, and
a "surface charge" on the surfaces of the plates / crystals which is
present during charging.

The distributed charge is what gives you the 12.7 volts... it's the
"steady state" charge within the battery. When you start driving more
current into the battery, the "surface charge" appears (on the
surfaces of the lead sulphate plates and crystals) as the
electrochemical reactions begin to occur. If you stop driving current
in, the surface charge decays away over a period of a few minutes or
hours (or, quite rapidly if you start drawing current from the
battery) and the battery terminal voltage drops back to 12.7 (or
whatever its steady state voltage is).

The surface charge creates an additional voltage, which the charger
must overcome in order to force current into the battery. If you try
to use a 12.9-volt charging circuit, you won't get very much
additional power pushed into the battery before the surface charge
rises to 0.2 volts, the battery terminal voltage rises to 12.9 volts,
and the battery stops charging. If the battery had been somewhat
depleted (say, it was down to 12.3 volts), the surface charge will
still jump up fairly quickly and cut down the charging rate, and it'll
take a long time to "top up" the battery to full charge.
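The effect of the surface charge on the two setpoints can be put into numbers. All the surface-charge values here are made up for illustration; only the 12.3 V depleted-battery figure comes from the example above.

```python
# Hypothetical illustration: the surface charge acts as a back-voltage
# the charger must overcome. Surface-charge values are invented;
# the "stalled" threshold of 0.05 V is likewise an arbitrary cutoff.
v_rest = 12.3  # a somewhat depleted battery, per the example above

for v_set in (12.9, 13.7):
    for v_surface in (0.0, 0.2, 0.6):
        headroom = v_set - (v_rest + v_surface)
        state = "charging" if headroom > 0.05 else "stalled"
        print(f"setpoint {v_set} V, surface charge {v_surface} V: "
              f"{headroom:+.1f} V headroom ({state})")
```

With a 12.9 V setpoint, a modest surface charge eats the entire headroom and charging stalls; at 13.7 V there is still voltage left to push current in.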

The 13.7-volt setting is, to some extent, a compromise. It's high
enough to allow a battery to be trickle-charged up to full in a
reasonable amount of time (it's high enough to overcome quite a bit of
surface-charge backpressure), but it's not high enough to cause a
fully-charged battery to begin electrolyzing the water out of its cells.

This comes full circle on my original thread postulation. There is NO
point in regulating the voltage, just connect a properly sized wall wart
and you're done. The proof is right here.

The battery makers say you're in error - or, at least,
oversimplifying, and taking risks with your battery. Lots of peoples'
experience says likewise. Go ahead if you wish.

In certain very specific special cases, what you propose _may_ be
safe. These would be the cases where the wall wart's maximum output
current does not exceed the sum of [1] the static load on the battery,
and [2] the amount of self-discharge current and loss-by-heating which
would limit the battery's terminal voltage to no higher than about
13.7 volts. Because the self-discharge and battery cell voltages are
somewhat temperature-sensitive, I think you'd find that no single
wall-wart would produce optimum results with a single battery under
all circumstances.

In the more general case, one of two things is very likely to be true:

- The wall wart is smaller than ideal, and isn't capable of
delivering enough current to pull the battery up to 13.7 volts in
"steady state" operation. The battery will probably charge, but
more slowly than would otherwise be the case.

- The wall wart is larger than ideal, and it pulls the battery up to
well above the optimal float voltage. The battery begins gassing,
and its life is shortened.

That's why a properly-regulated float-charging circuit is very
desirable. It allows for a rapid recharge if the battery is run down
(because you can use a nice, hefty DC supply) but ensures a stable
floating voltage once the battery reaches steady state. And, a single
such circuit can be used with a wide range of battery capacities - you
don't need to carefully hand-select a wall wart to match each specific
battery.

--
Dave Platt AE6EO
I do _not_ wish to receive unsolicited commercial email, and I will
boycott any company which has the gall to send me such ads!
