Here, hold my beer ... have you tried this before?

The 150Ah cells are out; 12 x 100Ah 2nd-life cells plus 4 x 150Ah cells are in the temporary bank.

So I thought, let me charge each cell. They came out at 95% SOC, as per the BMS.

Then, using an EV Peak, I charged each one individually up to 3.65V.

Most cells start charging at 3.2A, a couple at 4.2A, and they take hours to recharge!

Why!?

So methinks - don’t know HOW I got to this idea, but I did - let’s push the envelope a bit.

Set inverter like so:
[image: inverter settings]

Then I set ESS to this … and I waited …
[image: ESS settings]

… and waited …

Keeping an eye on it for HOURS …

… waiting …

Till it got here:
[image]

That was on 4 x 150Ah cells fully recharged using the EV Peak … it took out what I said it could, but nowhere did I dare go below ±2.9V per cell. Nope, not even for a case of beer, see.

Initial conclusion: the BMS SOC and shunt are useless out of the box.

I’m so going to follow @Phil.g00 's instructions and calibrate the BMS to the BMV … IF the cell volts behave that is.

Here is the hold-my-beer part … I did the same on the main system, also set it to 0% SOC …
Started here:

Ended here … bet the 2nd life cells were fine …

But the sun took over …
[image]

Conclusion: Do NOT use cells that are not A-Grade, matched A-Grade cells that is.

Must add, these 150Ah cells are good for camping, but not worth a damn for a commissioned grid-tied solar system.

I’m charging each one individually, then in batches of 4. I then watch how the individual cells behave when the 500VA feeds back into the house at night, recharging during the day on a schedule.

What, some may think, a grid-tied MultiPlus 500VA!?

The main system throttles back so nothing goes out to the street.

And an MP does stop feeding back if Eskom goes off, as the MP is not on the same circuit as the one the main system feeds. Jip, I tried that. :wink:

Cause see, why would one have a lithium battery bank camping system and NOT use it 24/7/365? :wink:

But I’m contradicting myself: 150Ah cells in a grid-tied camping system … yes, they discharge at 150W and recharge at 12V, 16A, i.e. roughly 0.1C. The cells are idling, not like the 30–60A they would see on a 48V system.

I’ve had a cell down to below 2.6V. The BMS didn’t stop me. It seems to have recovered fine, and the manufacturer did not seem perturbed when I reported it. Cannot mention names. But what I can say is that 2.9V doesn’t really imply any real danger yet.

Forgive me for clarifying this, but I presume these are LiFePO4 cells (which are the current flavour of the month),
and their nominal voltage is 3.2V per cell…

Jip, LiFePO4, 2.8V the minimum according to the specs.


I have a question for the gurus here, @Phil.g00 @Louisvdw @JacoDeJongh @Gman and I’m told @Stanley

Methinks, this is where the “rubber hits the tar”, where most brand names have done their “research” on what settings to use …

My understanding is that at 2.8V per cell, the cells are properly empty: zero SOC.

The settings of the BMS:
What grabs my attention is the 3.1V for 20% SOC.

Now, having removed the 150Ah cells, recharged each one individually with an EV-Peak charger, and matched similar cells together, I now have nicely bottom- and top-balanced cells.

The Question:
For the LIFE of me, I cannot fathom why 16% BMS SOC can be at 13.13V, or 3.28V per cell.
PS: FWIW, cells 1 and 2 are the “problem” cells on the old 48V bank. Interesting.

Ok, but what were the cell voltages under load?
The lowest it went was 3.24V … 18% SOC.

On the 48V system, I see similar.
PS: You can see the 2nd-life 100Ah vs 150Ah cells. :slight_smile:

BMS vs BMV:
Have not calibrated the BMV/BMS … and took the snap a bit later, but they are “closer”.
[image]

FWIW, before, as a test, I ran both systems down to an ESS Min SOC of 0, the BMS giving the SOC. The first time, it ran for HOURS on a zero SOC. During those tests, I got closer to 3.1V, i.e. that elusive 20% SOC … with a ton of warnings from Victron. So many that I had to switch off the warnings. :laughing:

So it seems to me, setting a Min ESS SOC of zero effectively means 20% on the BMS, i.e. 3.1V.

Inverter settings, both the same: too many warnings, cannot get here. :slight_smile:
[image]

What am I not “getting”?

Cause the BMS looks at both voltage AND counts amp-hours with a shunt/Hall sensor.

SOC is always an estimate. At the edges you know for sure, in the middle it is always a bit of a guess.

Manufacturers also pretty much decide what voltage they consider full or empty. You can push higher or lower, but that generally shortens the life of the battery, so most of them will use 2.8V as the 0% mark (although you can go lower), and 3.6V or thereabouts as 100% (although, again, you can go higher, but you gain very little).

So my advice is: Stop worrying about it so much :slight_smile:

Edit: Or let me put it this way. If the cell is at 3.4V, it can be anything from 50% to 90% full depending on other factors. So the BMS will likely use pure amp-hour counting in this region. But a cell that is under 3.2V cannot possibly be more than 50% full, so the BMS will in all likelihood use the voltage as a hint to make a lower guess, but it is still somewhat dependent on amp-hour counting.

I’ve seen batteries happily run at 0% for an hour, because the amp-hour counting reached zero before the voltage dropped out…
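
A minimal sketch of that idea in Python (the thresholds are illustrative assumptions, not any particular BMS’s actual algorithm):

```python
# Hybrid SOC estimate: coulomb counting does the work in the flat middle
# of the LiFePO4 curve; voltage only overrides near the edges.
# All voltage thresholds here are illustrative assumptions.

def update_soc(soc, current_a, dt_h, capacity_ah, cell_v):
    # Coulomb counting: positive current = charging.
    soc += (current_a * dt_h) / capacity_ah * 100.0
    soc = max(0.0, min(100.0, soc))

    if cell_v < 3.2:    # below ~3.2V the cell cannot be more than ~50% full
        soc = min(soc, 50.0)
    if cell_v <= 2.8:   # treat 2.8V as the 0% mark
        soc = 0.0
    if cell_v >= 3.6:   # treat ~3.6V as the 100% mark
        soc = 100.0
    return soc
```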


The BMS (and also the BMV) calculates SOC by integrating the current. It has been told the total Ah of the cells, and maybe a few voltages to make corrections, such as: at 3.5V it must be 100% full, etc. Now in practice, new cells typically have a higher Ah capacity than the manufacturer states, and to some extent will also have a higher Ah rating under low load than high load. So on a 150Ah battery, the BMS will report 0% SOC once you have used 150Ah, even if the cells are actually not empty.
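
To put made-up numbers on that last point (the 160Ah figure below is an assumption for illustration):

```python
# Made-up numbers: the BMS integrates against its programmed 150Ah,
# but the cells actually deliver 160Ah under a light load.
rated_ah, actual_ah = 150.0, 160.0
load_a = 10.0                       # constant 10A discharge

bms_zero_h = rated_ah / load_a      # BMS reports 0% after 15.0 hours
empty_h = actual_ah / load_a        # voltage only collapses after 16.0 hours

print(f"BMS says 0% at {bms_zero_h:.1f}h; cells actually empty at {empty_h:.1f}h")
# The battery 'runs at 0%' for a full hour, exactly as described above.
```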


I agree with plonkster and Stanley. Referring to the image of your BMS settings with those cell voltages, the thing to remember is that these are cell settings and not battery settings. I think you are getting confused and comparing apples with pears.

The cell voltages ARE NOT the same as the battery values. You cannot look at one cell and then say the battery SOC should be this. This is why a BMS will always be more accurate than a BMV.

Scenario:
Your BMS and BMV will both measure the current flow. If you have a 100Ah 4-cell battery, both will say you have used 50% when 50Ah has been used from full. Easy.

But now one of your 4 cells is a bit out. While the other 3 have cell voltages between the 40%–60% marks (3.3V–3.35V), the odd cell has a voltage of 3.2V, which is below the 40% cell setting. The BMV is clueless and will continue saying 50% SOC, while the BMS has more data to work with, so it will calculate the SOC as 50% - (10% x 1/4) = 47.5% SOC. {Note: this is a fictional calculation - I am not saying your BMS will subtract the 10% difference divided by the number of cells under that difference.} The BMS has now used your cell settings to influence the battery SOC calculation.

You need to know the formula and all the parameter values your BMS uses to calculate your battery SOC if you want to compare values.
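
To make that fictional calculation concrete (this is just the illustrative formula above in code, not how any real BMS works):

```python
# Fictional per-cell correction from the scenario above: start from the
# coulomb-counted pack SOC, then pull it down for each cell whose voltage
# band maps to a lower SOC, weighted by its share of the cells.
def pack_soc(counted_soc, cell_band_socs):
    n = len(cell_band_socs)
    correction = sum(max(0.0, counted_soc - band) for band in cell_band_socs) / n
    return counted_soc - correction

# Three cells sit in the 40-60% band (call it 50%), one reads as ~40%:
print(pack_soc(50.0, [50.0, 50.0, 50.0, 40.0]))  # -> 47.5
```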


What triggered this all was the fact that I had a 150Ah bank which, at 3.4V x 16 = 54.4V, x 150Ah = ±8.16kWh, yet on “average” I never even got close to 6kWh per night out of it … when the cells were still happily in line.
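
Worth a quick back-of-envelope check on that, though. Assuming a 20% minimum SOC and roughly 90% conversion efficiency (both assumptions, not measurements), 6kWh per night is already close to the ceiling of an 8.16kWh bank:

```python
# Back-of-envelope with assumed numbers: 20% min SOC, ~90% efficiency.
nominal_kwh = 54.4 * 150 / 1000      # 8.16 kWh, as calculated above
usable_kwh = nominal_kwh * 0.80      # stopping at 20% SOC
delivered_kwh = usable_kwh * 0.90    # inverter/cable losses (assumed)
print(f"{delivered_kwh:.2f} kWh")    # ~5.88 kWh to the loads
```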

AHA, now THAT makes sense yes!

But that is assuming one has programmed the BMS to match the cells, right?
Like a BMV shunt matched to a BMS shunt in terms of current …

How does ±5 hours sound on a 12V bank running at zero SOC as per the BMS?
Only got the lowest cell down to like 3.1V in the end. :laughing:
Working with brand names vs a wide-open door with a ton of settings, I don’t think we are comparing apples with apples.

In chatting to @Gman, methinks the fact that one can calibrate the current as well as the volts per cell, based on the manufacturer’s specs for the cells and taking actual per-cell measurements into account, opens a lot of doors to get more accurate readings from the BMS for the cells one has, which in turn makes it a better match for, say, a Victron or an MLT inverter.

The brand names have done this exercise painstakingly, not only to match the inverter brands better, but also to protect their cells for the warranty period.

To just “leave it be, stop worrying”, which I’m not, is not the answer I’m looking for. I think a bit of deeper thought would be better, to match these specific Jaibaida BMSes to the cells, especially the new 280Ah cells arriving soon.

Methinks we need to find the ideal settings for the cells we use … like when the 280Ah cells are installed.

When THAT happens, this whole discussion is moot. Cells MUST be in balance ALL the time.

Seeing as I have personal experience with cell voltages going out of whack and a SOC that is nowhere close to accurate … that lesson has been paid for dearly. :wink:

As a matter of fact, of the 16 x 150Ah cells, not ONE was properly charged after the cells went out of whack. It took many, many days, each cell done individually, to get them all fully charged again.

Bad bad experience.

Here is some more reading on this platform: LiFePO4 Voltage Chart? | DIY Solar Power Forum

Here is a link to some more info: LiFePO4 Battery Voltage - Google Sheets

I spotted in TTT’s post that on the Capacity parameter you can add info at the % voltage. Same with my BMS, but mine is in increments of 10%, 20%, 30% and so on till 100%. I think this helps the BMS work out the SOC per cell, which at the end of the day is calculated across all the cells together to get the pack SOC. Some of the BMS units have a setting that adjusts the BMS settings to whatever cell chemistry you are using.

This is also good reading for all interested parties. See page 14.

The Orion BMS calculates a battery pack’s state of charge (SOC) primarily by coulomb counting, or by keeping track of how much current has entered or left the battery pack. This method requires the use of a current sensor and generally tracks the state of charge of the battery pack quite well, provided that the capacity of the battery is known and the current sensor is accurate. While coulomb counting is an accurate method, there are several things that can cause this calculation to become inaccurate. These things include inaccurate current sensors, cells with a different capacity than expected (e.g. from low temperature or weak cells), or the BMS memory being reset or reprogrammed.

To deal with these issues, the BMS has an SOC correction algorithm which compares measured open circuit cell voltages to known state of charge points. These points are called “drift points” and are programmed into the BMS when it is set up. Drift points are specific voltages that are known to correlate to a specific state of charge and will vary from chemistry to chemistry. If the open circuit cell voltage is measured to be at one of these specific “drift points,” the BMS knows what the state of charge of the battery is supposed to be. In the event that the BMS’s calculated state of charge is higher or lower at one of these points, the BMS will adjust the calculated state of charge to the correct value.

Drift points are usually selected at locations along the cell’s discharge graph where the cell’s state of charge is obvious, in a manner to avoid drifting incorrectly. For iron-phosphate cells, this means that really only the upper 10-15% and lower 10-15% of the cell can be used for drift points, due to the flat shape of the discharge curve. For other chemistries, additional points throughout the full range of state of charge may be possible, improving the accuracy of the drifting.
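
In code form, the drift-point idea might look something like this (a sketch; the voltage/SOC pairs are illustrative guesses, not Orion’s actual table):

```python
# Sketch of drift-point SOC correction: if the OPEN-CIRCUIT cell voltage
# lands on a known drift point, snap the coulomb-counted SOC to it.
# The (voltage, SOC) pairs below are illustrative, not from any manual.
DRIFT_POINTS = [(2.95, 5.0), (3.20, 12.0), (3.40, 95.0), (3.45, 99.0)]
TOLERANCE_V = 0.01

def correct_soc(counted_soc, open_circuit_v, resting):
    if not resting:              # voltage is only meaningful with no load
        return counted_soc
    for volts, known_soc in DRIFT_POINTS:
        if abs(open_circuit_v - volts) <= TOLERANCE_V:
            return known_soc     # drift the estimate to the known point
    return counted_soc           # flat region: keep trusting the counter
```

Note how all the usable points sit in the top and bottom 10-15%, exactly because of the flat LiFePO4 curve.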


Oh, I once had a BYD battery that got confused about its SOC and reset itself to 50% (even though it was empty at the time). So during the recharge it reached 99% very quickly, but then (because the cell voltages were too low) the algorithm would not allow it to go to 100%, so the battery literally sat at 99% for hours while charging at full speed…

SOC is always an estimate. Especially between 15% SOC and 85% SOC.

Jip, I’ve seen that too.

EDIT: So far I’ve seen all the shiite that can go wrong … I own that bottle of Johnnie “Walker”, having walked right next to him drinking the “cool-aid” he “sells”.

FWIW, once I recharged all the 150Ah cells one by one, I now see a 0.003–0.012V difference when charging/discharging on the 12V system … this batch using the “faulty” cells.

Lesson learnt here:
If one’s cells go out of whack, decommission the bank, put in a 2nd-hand lead-acid bank or what what … and take the time to recharge the cells one by one.

My “Test Bench”:
Step 1: Use the inverter with “The Bulb” (helps the BMS balance faster) for the bulk charge to +95% SOC.
Step 2: Switch off the inverter, disable the BMS balancing, and top up each cell with the EV-Peak.
Step 3: Run the cells, grid-tied, for a few days to make sure the cells are “matched”, i.e. they stay in “sync” all the time - 100% - 20% - 100% - 20%, over and over, watching the charge/discharge cell voltages. The 100% is where the cells “lose” it … so that is what I get “fixed”: the top balancing.

The iPad is there to see what is going on and to program the BMS.

So worth it to have a Bluetooth AND an RS485 port on the same BMS.

I used this when I configured my Ant BMS.

Thank you @GVS, you and @Gman both sent me the same info. I know Gman is quite particular with his BMS and cells too.

I’ve set the 12v BMS SOC as per the Volts indicated … let’s see how this goes.

FWIW, what I’m looking at: drawing 150W from the 12V 150Ah bank, I want to see how long the batteries consistently run down to 20% BMS SOC, with the cells staying in balance while being charged/discharged.

The thinking being: the kWh delivered is the total summation of ALL the parameters involved in discharging a battery bank.
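
For reference, a quick sanity check on what that test should give (assuming a 12.8V nominal bank voltage and ignoring losses; both assumptions):

```python
# Expected runtime with assumed numbers: 12.8V nominal, 150Ah, 100% -> 20% SOC.
bank_wh = 12.8 * 150                 # 1920 Wh nominal
usable_wh = bank_wh * 0.80           # stopping at 20% SOC -> 1536 Wh
load_w = 150.0
print(f"{usable_wh / load_w:.1f} hours")   # ~10.2 hours, before losses
```

Anything much short of ±10 hours would point at capacity or calibration rather than the load.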

Just make sure the BMS knows the theoretical capacity of the bank, i.e. you should be able to set the Ah capacity to 150 in the BMS settings somewhere.

First thing I do, yes.