Battery degradation over time. What is acceptable?

When my system was installed, the batteries were Revov’s (an early model, not like the ones advertised now). Then last year I noticed what looked like a drop in performance of the batteries and made contact with Revov with the installer copied in. Long story short: I now have a Freedom Won 10/8.

Revov did various things including taking the battery away for testing. It passed, but read their warranty. They also said the settings were wrong (installer had followed their instructions, and I know better than to fiddle with that stuff). They started saying that the batteries had been charged at over 125A. This was a problem because my inverter can’t charge at that rate. It then emerged that they had swopped the BMS out. In fact over the 3.5 years I had those batteries, there were 3 BMS. At one point they showed me log entries that suggested the battery had been run with incorrect settings, but those log entries predated the installation… so who knows what useful information was being provided by that particular BMS.

Anyway… in all the disputes and toing and froing, the original question went unanswered.

I had mailed Revov two graphs from the SEMS portal, pointed out what looked like a drop in performance, and asked if this was normal. No pressure. No aggressive/sarcastic wording. If they’d said yes, that’s about normal, I’d probably have chalked it up as a lesson learned, but they never got around to answering it.

So… just so I have one less thing to worry about at night, maybe somebody here can voice an opinion.

Here’s a graph from May 2020. The batteries have been at my property for about 11 months.

Notice the sudden kick up in load (yellow trace) at 4:30. That’s my heat pump. SOC (green trace) at 4:30 is 60%; when the heat pump is done it is 53%, and the drop is linear. A 7% drop in SOC whilst the heat pump runs.

Now look at the second graph.
This is from May 2022. The heat pump now kicks in later because I’m working from home. SOC when the pump starts is 53%. No other heavy loads - just the pump and the fridges ticking over (as they do all night). SOC now drops very quickly: after 10 minutes it is 36% - a 17% drop already, and the pump is still running (and the system is starting to get some help from PV & grid).

So, I know that battery performance drops with time. I remind you of my original question: Is this it? Is this usual degradation?

I didn’t get an answer at the time. I would value opinions now.

PS: Hats off to my installer who couldn’t control Revov, but did interact with them on my behalf, and eventually arranged the deal whereby I ended up with the Freedom Won.


Not related, just a titbit to share for fun …

I did that once, the ±125A charge, seeing as I have 280Ah cells and the BMS can take 300A or whatnot.

Set the inverter to recharge the batts at the max it can do using Eskom, the 70A, with the MPPTs making up the rest.

It worked! I hit that ±125A and was quite impressed with myself too.
Now I know I can recharge “moerse vinnig” (seriously fast). :grin:

How did that happen? The details, please - this is seriously impressive. Kudos to the installer!

I don’t feel at liberty to disclose details, but he did say to me that whilst his warranty is limited, he tries to never leave a customer (new or not so new) in the lurch.


Plenty of variables:
My heat pump pulls more power to reach 56°C vs 53°C (the pump needs to reach a higher pressure), so any change in the temperature requirement changes consumption.
If you upped the temp by one or two degrees, this could easily explain it. Then there is also the temp of the geyser and the amount of water; every degree represents more power that will be needed.

My experience is simply from running my own DIY battery (second life cells).
A battery’s usable capacity runs from the highest cell peaking (3.6V - which could also be the smallest-capacity cell) to the lowest cell reaching cutoff (2.7V - potentially not fully charged when the highest cell peaked).
The SOC is hopefully measured in this case via a smart shunt?
What is the capacity of your battery? If it is 5kWh, then the difference is between 7% (350Wh) and 17% (850Wh); if it is 10kWh, double those numbers.
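The SOC-to-energy arithmetic above is simple to sketch. A quick illustration (the pack sizes are the guesses from this post, not known values):

```python
def soc_drop_to_wh(capacity_wh: float, soc_drop_pct: float) -> float:
    """Convert a drop in SOC percentage points to energy in Wh."""
    return capacity_wh * soc_drop_pct / 100

# Assuming a 5 kWh pack:
print(soc_drop_to_wh(5000, 7))    # 350.0 Wh
print(soc_drop_to_wh(5000, 17))   # 850.0 Wh
# For a 10 kWh pack, double those numbers:
print(soc_drop_to_wh(10000, 17))  # 1700.0 Wh
```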

If I had to guess, I would say that a badly balanced pack can easily recalibrate its capacity to justify the “degradation” you are seeing. The BMS is probably not able to balance the cells, so you can only use the portion between the highest cell peaking and the lowest cell cutting off, which could easily be 20-30% less than a battery that was top-balanced originally.

The target temperature definitely didn’t increase - it’s been 55°C for years. OK, the starting temp could vary. But for about a year those batteries shouldered that early morning run. The time changed because of working from home. When I had to return and set it back to a 5 am start, you could really see that sharp drop.

And that, to me, was the concern. If the pump ran longer you’d see the same rate of SOC drop, just for longer, yes?

Though run time didn’t change because in the morning the timer limits it to one hour.

I still lean towards “no” as the answer.

If the temperature outside is colder the pump will run at a higher pressure to achieve the output temperature required and also run for longer to extract the heat as there is less energy to extract.

There are simply too many variables:
Water temp (geyser)
Ambient temp (season)
It makes for a difficult argument trying to prove degradation when the inputs are variable.

If you want to test the capacity, rather take a known load like a kettle/geyser element for a fixed time with no other loads. Then convert the percentage drop back to Wh and see if it corresponds with your battery’s overall capacity.
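That test reduces to one calculation. A sketch, assuming a resistive load of known wattage and a trustworthy SOC reading (the numbers are illustrative, and inverter losses will make the implied capacity read a little low):

```python
def implied_capacity_wh(load_w: float, runtime_h: float,
                        soc_drop_pct: float) -> float:
    """Estimate total battery capacity from a known-load run.

    energy_used = load_w * runtime_h; if that caused soc_drop_pct
    of SOC to disappear, capacity = energy_used / (soc_drop_pct/100).
    """
    energy_used_wh = load_w * runtime_h
    return energy_used_wh / (soc_drop_pct / 100)

# Example: a 2 kW element for 30 minutes drops SOC by 22%
print(round(implied_capacity_wh(2000, 0.5, 22)))  # ~4545 Wh
```

Compare that figure against the nameplate capacity; a large shortfall that a second run reproduces is worth chasing.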

I still think that the most likely explanation is poorly balanced cells. I have a Daly BMS and second life BYD cells. I constantly have pains with the calibration resetting itself (to 100%) and one cell not getting a full charge. I probably need to invest in a cell balancer and a victron smart shunt.

I think 7-10% degradation over 3 years is normal for new cells, but unbalanced cells could easily appear as 20% (or more) degradation. If one cell only receives an 80% charge, then the whole pack will appear to have 20% degradation.
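The “one cell at 80%” point follows from how a series string behaves: discharge stops when the emptiest cell hits cutoff, so the weakest cell caps the whole pack. A toy model (cell values invented for illustration):

```python
def usable_pack_fraction(cell_charge_fracs):
    """In a series string, discharge stops when the least-charged
    cell reaches cutoff, so usable capacity tracks that cell."""
    return min(cell_charge_fracs)

# 15 well-balanced cells plus one that only ever reaches 80%:
cells = [1.0] * 15 + [0.8]
print(usable_pack_fraction(cells))  # 0.8 -> pack looks 20% "degraded"
```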

Truth be told, I think Bobster and his installer did very well with a FreedomWon replacement, as serious doubt has entered the equation and that will “eat one up” going forward.

What good is coming from this, though, is how to actually determine whether a bank has degraded, and ideas on what parameters one should use to determine that - or is it as simple as the cells being out of balance?

Cells out of balance are a HUGE consideration. A BMS must have an app where one can see each cell’s voltage (min/max is not enough) and the general overall data. You know what, it must give you all the data the warranty is based on. This thing where the view is very limited, if there is one at all - or worse, “supplier only” - has to change. That is like running blind in a snowstorm.

This idea has been in my mind for years now … if the Delta exceeds, say, 0.1V (a user setting), throttle the charge amps / lower the volts to give the BMS more time to balance better.
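The throttling idea could be expressed as a simple control rule. A sketch only - the threshold and scaling are the imagined user settings from this post, not any vendor’s API:

```python
def throttled_charge_current(max_current_a: float, cell_delta_v: float,
                             delta_threshold_v: float = 0.1,
                             min_fraction: float = 0.2) -> float:
    """Reduce charge current as the cell voltage delta grows,
    giving the BMS balancer more time to pull cells together."""
    if cell_delta_v <= delta_threshold_v:
        return max_current_a  # balanced enough: charge at full rate
    # Scale down in proportion to the overshoot, with a floor.
    over = cell_delta_v / delta_threshold_v
    return max(max_current_a / over, max_current_a * min_fraction)

print(throttled_charge_current(70, 0.05))  # 70 (delta under threshold)
print(throttled_charge_current(70, 0.2))   # 35.0 (delta 2x threshold)
```

A real implementation would also need hysteresis so the current doesn’t oscillate around the threshold.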

As the years move on, collectively we are learning a ton about LiFePO4 banks and their quirks.

And they do have quirks … as awesome as they are.


@TheTerribleTriplet will agree with me here: data is king. If you can pull data from your BMS to see what the cells are doing, then you can pinpoint the problems.

So check if you can get a cable to get on to the BMS and see what the problem is.

SNAP!!! :rofl:

This is what I’m talking about … new bank, to get it balanced using the Delta.

Left is the Delta hitting nearly 0.3V; originally, the bank was top-balanced. Then over time, by altering the amps/volts of the inverter (all automated), one can help the BMS to get in control, and stay in control, faster.

I don’t use this yet, run “driverless” trusting the cells and Victron algorithms to keep things in line. But if I see the cells getting older, I will implement this in my system. Currently, I end up with a Delta of 0.004V at 100% SOC once all is said and done.

All manufacturers could use this concept … I will accept royalties. :wink:


I agree. But I’ll also point out that if you look at the two screenshots, there is not a big difference in load.

If the load had been noticeably higher, I’d not have troubled Revov. I’d have tried to understand why the load had increased.

Yes. But also there is a question mark hanging over this whole business: Is that drop in performance that I observed usual for a 3 year old battery? As I think I said, if the installer or Revov told me that this was in line with what they see at other sites, I’d have marked it down to experience.

So somewhere in the back of my mind, pretty quiet but not gone, is the thought that maybe the FW will behave the same way in 3 years’ time.

So the real question remains unanswered.

Do they have wot, 3000/5000/6000 cycles?
3 years = 1095 cycles, so on 3k cycles, one third down.

Personally, I’m quite sure that the SOC, being partially volts driven, was out of whack due to a cell, or cells being out of balance. Have seen that myself a few times, SOC is out of whack because of volts.

Today I run a SmartShunt for my system’s SOC. The BMS shunt, at times during charging, is out by like 30%, especially when like 3-4kW is going into the batts, due to volts. In the end, both get to the same “conclusion” at 100%.


I had to find the original guarantee, as the batteries I had are nothing like the ones Revov sell now and there’s no data about them on their website.

They promised 1 cycle per day for 10 years (they don’t define “cycle”). So 3650 cycles, thus one third down (though I had no cycle count from the BMS).

The first thing I would do is check delta SoC vs kWh over one cycle. The kWh info is available from VRM portal in fairly granular slots. Pretty sure the SoC will be as well. Then you calculate your own SoH for the batteries (current available Ah / original available Ah).
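That SoH check is just energy out divided by the SOC it consumed, compared against nameplate. A sketch, assuming you can export kWh discharged and start/end SOC for one cycle from VRM (the figures below are illustrative, not from this system):

```python
def state_of_health(kwh_discharged: float, soc_start_pct: float,
                    soc_end_pct: float, nameplate_kwh: float) -> float:
    """SoH estimate: full-pack capacity implied by one partial
    cycle, divided by the original nameplate capacity."""
    soc_delta = (soc_start_pct - soc_end_pct) / 100
    implied_capacity_kwh = kwh_discharged / soc_delta
    return implied_capacity_kwh / nameplate_kwh

# 3.6 kWh taken out while SOC fell from 95% to 50% on a 10 kWh pack:
print(state_of_health(3.6, 95, 50, 10))  # 0.8 -> 80% SoH
```

Averaging over several cycles helps, since a single SOC reading can itself be skewed by unbalanced cells.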


You have an advantage over me here: You know what you’re doing :smile:

Anyway… batteries are gone now, but this is still useful information. And you’ve given a clue. I suspected that the problem was that the batteries and BMS were mismatched.

When the system was first installed, it was running with the inverter sensing the battery voltage and deciding what to do.

This setup was unstable. The batteries several times just shut down for no apparent reason. The first couple of times I thought “OK… maybe this is what solar life is about”. It was a problem because there was no way for an owner like me to start the batteries, except to use a feature on the Goodwe that would kick-start them (the manual tells you to stand well back because a big surge of power is going to be sent down the cables - the sort of wording you used to get on fireworks boxes).

BUT that feature only worked when there was sufficient PV. So if the batteries shut down late afternoon, there was nothing I could do except wait for the sun to come up (the kick start worked every time). But if there was load shedding that night, we were exposed. (by sheer fluke, this set of circumstances never occurred)

Eventually the failures became more frequent, I decided that life with PV has to be better than this, and queried the situation with the installer. They discussed the problem with Revov. The BMS was swopped out, and also another piece of equipment was introduced. The BMS had only an RS-485 comm port. So there is a box that can be bought that interfaces between 485 and CAN. This was put in place, and now there were comms between the BMS and inverter.

The system stability improved quite markedly once all this had been done.

And the BMS consistently (for over a year) reported SOH = 103%. That figure never changed.

Here’s a good example of an eTower top balancing over 5 days after being commissioned:

5 Days? That’s a slow balance.

Normally balancing will take weeks, so 5 days is pretty fast.
On a 100Ah cell or more it can take a while.
