Hi All,
What is the best practice for LiFePO4 charging in terms of absorption and float voltages?
Talking about two types of systems: those with standalone chargers, and those running temporarily as standalone, for example a Victron system with the GX device disconnected or failed for some reason.
I am a fan of pushing the cells to 3.45V and setting my absorption voltage to 55.2V (for a 16S battery bank). However, there are different approaches when it comes to the float voltage. Is it best to set the float voltage the same as the absorption voltage for LiFePO4 battery banks, as explained in this video by Andy, or is it better to lower the float voltage to 3.35V per cell?
I am assuming that it is best to set absorption = float for standalone systems to get the maximum out of the solar, and to set float lower than absorption for temporarily standalone systems (no GX device for some time) so we can keep the system safer during that time.
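To make the arithmetic behind those numbers explicit, here is a minimal sketch (plain Python, no charger tool or API assumed) converting per-cell targets into pack-level charger settings for a series bank:

```python
# Convert per-cell voltage targets to pack-level charger settings
# for a series LiFePO4 bank. The values match the examples in the
# question above (16S, 3.45 V absorption, 3.35 V float).

def pack_voltage(cells_in_series: int, volts_per_cell: float) -> float:
    """Pack voltage is simply the per-cell target times the series count."""
    return round(cells_in_series * volts_per_cell, 2)

print(pack_voltage(16, 3.45))  # 55.2 V absorption setting
print(pack_voltage(16, 3.35))  # 53.6 V float setting
```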
What battery are you referring to? Each brand has its own recommendation and I tend to set up my system accordingly. Freedom Won asks for different settings than BSL, both being 16-cell batteries. I guess you see where I am going with this.
I have installed batteries that require a 55V CVL, others 57.6V, and yet others 55.8V. It's all over the place; looking at your specific battery might shed light on the correct answer.
I agree with Andy on this.
I think that the float charge voltage is historical, carried over from lead-acid batteries.
More important for Li-Ion batteries is charging your batteries to 100% so that the BMS balances the cells. See The low-down on managing LifePO4 batteries
The latest wisdom on the topic seems to be to keep a lower average (around 50%-ish) while still charging it to over 3.45V per cell to ensure it does not get out of balance.
A battery that is in active use as a storage device for solar (i.e. a typical ESS) already runs like that. It fills up to 100% every day, or a few times a week, around the late afternoon, and then it starts discharging again as you move into the evening, spending the next 12 hours at a lower SOC. Therefore the battery spends most of its time at a lower average, and you get essentially exactly what you need.
Whether it's 3.45V, 3.5V, 3.55V, etc., how long do you hold it at that voltage to achieve 100%?
It is interesting that the same chemistry is treated differently by different pack manufacturers! Then again there might (although I doubt it) be small chemistry differences that justify treating them differently.
The designer of the pack decides. If you are being super conservative, you call it full at 3.4V. Some BSL batteries do that (54.5V charge voltage on a 16-cell pack). If you want to squeeze the last 0.1% out of it, you call it full at 3.6V. The marketing team wants to put a larger number on the box. You meet somewhere in the middle.
The cell is full as soon as it reaches the charge voltage. It is not like lead acid, which has to spend a few hours bubbling away to fully reverse the chemical reaction. But in a series pack, your cells don’t reach that voltage at the same time, so it takes a little while for the balancer to pull the lower cells up to the same voltage. How long this takes depends entirely on how well balanced the battery is, but typically an hour or two is all you need. Once the battery has been balanced once, it goes faster every time.
It’s a good question.
I reckon the reason we don’t ask this question more often is because the BMS is doing this for us.
So as long as your BMS is set up for your cells (and even better if the battery was manufactured with the BMS), there is no need to get out your multimeter.
The battery bank is a custom-made 16S using 105Ah LiFePO4 cells. The BMS is a JBD 16S, 100A (similar to the Overkill Solar one).
Yes, as @plonkster said, once the system is running in ESS mode (the same as in my case), you do not need to worry about float voltage.
My question was about the situation where the chargers in the system are running standalone (in the Victron case, for example, when no GX is present). Specifically: is there a point in configuring the chargers (Multi and MPPT) with the absorption voltage equal to the float voltage, or is it best to keep the float lower? We are talking about LiFePO4 in particular.
Victron's lithium “standalone” settings are to absorb at 14.2V (3.55V per cell) and to drop to 13.5V (3.375V per cell) as a “float” voltage. Letting the cells sit slightly lower is always a good idea for longevity.
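Going the other way, the per-cell figures quoted here fall straight out of dividing the 12V (4-cell) pack setting by the series count. A quick sketch, again just arithmetic, not any Victron tool:

```python
# Derive per-cell voltage from a 12 V (4S) pack setting, matching the
# Victron standalone lithium defaults quoted above. Plain arithmetic,
# not a Victron API.

def per_cell(pack_volts: float, cells_in_series: int = 4) -> float:
    """Per-cell voltage is the pack setting divided by the series count."""
    return round(pack_volts / cells_in_series, 3)

print(per_cell(14.2))  # absorption: 3.55 V per cell
print(per_cell(13.5))  # float: 3.375 V per cell
```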
I think, though I am not sure, that the balancer only works at the higher “absorption” voltage, so this is also a way to give the battery a well defined balancing (which bleeds off some energy and sacrifices efficiency) before settling to a lower point where the balancer is off.
But, stick with the spec sheet. Best thing to do. If you like tinkering and doing your own thing, I’d go for 3.55V absorb, 3.4V float. That should be fine.
Does the new Pylontech algorithm for the GX go to float after getting to 52.4V? I see mine doesn't move until I start using the battery again in the early evening.
No. All it does is avoid high-cell alarms by dimming the charge voltage if a cell gets too close to 3.5V. It still aims for a static voltage of around 52.5V (in practice it will even out at around 52.3V).