I decided to do a capacity test on the new A123 Systems battery using the SBMS120. About two days ago, when the battery was fully charged, I disconnected all PV panels and reset the energy counters, then just waited until the battery was empty. That happened today around 14:30, still early enough to recharge the battery enough for what is needed tonight.
I took a few photos before I got to 0% SOC, and even at 0% I needed to wait a bit, since I defined the battery as 180Ah in the parameter settings. As you can see in the last photo, the usable capacity is slightly higher, at around 188Ah (4.9kWh), and at that point the lowest cell was still around 3V under a moderate 35A load.
The default low voltage disconnect is set at 2.8V, but I did not want to go there, and there would not have been much extra energy in there anyway.
It is an extremely decent battery, and I love the very low internal resistance; it is just not something I can recommend to others because of the amount of work needed to build a pack out of these cells, and because they come from unknown sources (you cannot just buy new cells from A123, so you end up with these not-so-great-looking cells from unknown sources).
Time for a battery capacity test.
It has been about 1 year since I installed my A123 Systems battery that powers my entire off-grid house.
The photos are in the order they were taken. I started in the morning by charging the battery (I even added a few more panels on PV2 to charge the battery faster), and after the first time it got to 100% fully charged I disconnected the panels on PV2 so the charge current would be lower, then added a 25A DC load just to speed up the discharge a bit so charging would start again at around 10 to 12A; that happened one more time before starting the discharge test.
My settings are 3.50V over-voltage, so when the highest cell gets there the charging is stopped, and End of Charge is set at 3.48V. The charge re-enable voltage is 130mV lower (a fixed setting inside the ISL94203), so the highest cell needed to drop to 3.35V before charging would start again. It seems there is about 2Ah between fully charged and the point where charging is re-enabled, thus for this battery just a bit over 1% SOC.
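The hysteresis arithmetic above can be written out as a minimal Python sketch (the 130mV offset is the fixed ISL94203 value mentioned above, and the 180Ah figure is the capacity from my parameter settings):

```python
# Charge re-enable hysteresis sketch, using the values from the text above.
END_OF_CHARGE_V = 3.48      # End of Charge cell voltage setting
REENABLE_OFFSET_V = 0.130   # fixed re-enable offset inside the ISL94203
BATTERY_AH = 180.0          # battery capacity as set in parameter settings
HYSTERESIS_AH = 2.0         # observed charge between full and re-enable

reenable_v = END_OF_CHARGE_V - REENABLE_OFFSET_V
hysteresis_soc = HYSTERESIS_AH / BATTERY_AH * 100

print(f"charge re-enable voltage: {reenable_v:.2f}V")   # 3.35V
print(f"hysteresis as SOC: {hysteresis_soc:.1f}%")      # ~1.1%
```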
Now the battery is fully charged (indication is 99%) and the PV panels are physically disconnected from the SBMS120, so there is no more charging; the test is started by resetting the energy counters.
Now you can see how the battery voltage progressed, listed as SOC -- highest cell voltage -- delta to lowest cell (load current):
99% -- 3.458V -- 10mV (1.3A)
89% -- 3.288V -- 7mV (88A)
80% -- 3.286V -- 10mV (87A)
80% -- 3.303V -- 4mV (28A) (a few seconds after the reading above).
70% -- 3.263V -- 12mV (87A)
60% -- 3.263V -- 7mV (50A)
50% -- 3.257V -- 7mV (50A)
39% -- 3.249V -- 7mV (50A)
30% -- 3.230V -- 7mV (50A)
20% -- 3.205V -- 8mV (50A)
10% -- 3.170V -- 9mV (50A)
1% -- 3.108V -- 12mV (50A)
0% -- 2.998V -- 34mV (31A) (point where the test was stopped).
The entire discharge test took just a little bit less than 4h, so the average discharge current was around 48A (0.25C) with a peak at 88A (0.5C), thus absolutely no trouble for this battery.
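As a sanity check on the average current figure, a short sketch using the measured 186.22Ah capacity (the 3.9h duration is my assumption for "a little bit less than 4h", and the 188Ah pack capacity is only used for the approximate C-rate):

```python
# Rough check of the average discharge rate from the test above.
capacity_ah = 186.22   # measured discharge capacity
duration_h = 3.9       # assumed value for "a little bit less than 4h"
battery_ah = 188.0     # approximate usable pack capacity, for the C-rate

avg_current = capacity_ah / duration_h
print(f"average current: {avg_current:.0f}A (~{avg_current / battery_ah:.2f}C)")
# average current: 48A (~0.25C)
```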
Now, while at 3.5V this battery was actually fully charged (there is no capacity gain in going to 3.6V), on the lower end I stopped the discharge at about 3V, so there was still something left down to 2.8V, but not much. Official tests go down to 2.5V, but again not much extra capacity is to be had by going there, as the voltage drops fairly fast below 3V.
The test a year ago was done a bit differently: after the cells were fully charged I disconnected the PV and continued normal house discharge usage, so that discharge test took about 2 days instead of just 4h for this one. The average load was a bit lower, so that test had a bit of an advantage over this one, but not much.
If I ignore the difference in test procedure and possible small measurement tolerances and just compare the numbers, the test made a year ago resulted in a discharge capacity of 187.89Ah vs 186.22Ah for the test made today, so the delta in capacity is (187.89Ah - 186.22Ah) / 187.89Ah = 0.9%,
so less than 1% degradation, and even if that is accurate it is an excellent number, as it would mean a battery life of 20+ years before reaching 20% degradation from the original capacity.
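The degradation estimate and the 20+ year extrapolation can be reproduced directly (a sketch that assumes the fade rate stays constant and linear, which is the simplification used above):

```python
# Year-over-year capacity fade and linear extrapolation to 20% loss.
capacity_last_year_ah = 187.89
capacity_today_ah = 186.22

fade_per_year = (capacity_last_year_ah - capacity_today_ah) / capacity_last_year_ah
years_to_20_percent = 0.20 / fade_per_year   # assumes constant linear fade

print(f"degradation: {fade_per_year * 100:.1f}%/year")   # ~0.9%/year
print(f"years to 20% fade: {years_to_20_percent:.1f}")   # 20+ years
```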
Now in the last photo you can see how much the battery was used and what my house used.
So over 8544h (about 356 days) the house used a total of 1122.349kWh + 373.944kWh = 1496.293kWh, which is an average of 4.2kWh/day or 126kWh/month.
Out of all that, 806.714kWh went through the battery, so on an average day 2.27kWh came out of the battery, and that is about 47% of the battery capacity of around 4.8kWh.
There are of course multiple small cycles each day, but it is equivalent to about 356 cycles at 47% DOD (a very rough approximation). The idea is that using the battery this way makes no real impact on battery degradation; essentially all of the degradation is calendar aging (small chemical reactions inside the battery).
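The accounting above can be reproduced from the energy counter values (a sketch using the numbers from the last photo; the 4.8kWh battery capacity is the approximate usable figure quoted earlier):

```python
# House energy and equivalent-cycle accounting from the SBMS120 counters.
hours = 8544
days = hours / 24                       # ~356 days
house_kwh = 1122.349 + 373.944          # total house consumption
battery_kwh = 806.714                   # energy that went through the battery
battery_capacity_kwh = 4.8              # approximate usable battery capacity

daily_house = house_kwh / days          # ~4.2 kWh/day
daily_battery = battery_kwh / days      # ~2.27 kWh/day
avg_dod = daily_battery / battery_capacity_kwh  # ~47% DOD equivalent

print(f"house: {daily_house:.1f} kWh/day, battery: {daily_battery:.2f} kWh/day")
print(f"rough equivalent: {days:.0f} cycles at {avg_dod * 100:.0f}% DOD")
```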
The largest impact by far on battery calendar degradation is battery temperature: as an approximation, each 10C increase in temperature doubles the rate of degradation.
My battery is always at a nice +18C to +22C for about half the year (winter) and at +22C to +26C for the other half, so ideal temperatures.
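The "doubles every 10C" approximation can be expressed as a simple relative-rate formula (a sketch; the 25C reference temperature is my assumption, and only the relative factors between temperatures matter):

```python
# Relative calendar-aging rate under the "doubles every 10C" approximation.
def relative_aging_rate(temp_c: float, ref_c: float = 25.0) -> float:
    """Aging rate relative to the rate at ref_c, doubling every 10C."""
    return 2 ** ((temp_c - ref_c) / 10.0)

# My battery sits between roughly 18C and 26C year round:
for t in (18, 22, 26, 35):
    print(f"{t}C -> {relative_aging_rate(t):.2f}x the 25C rate")
```

So a battery kept at 35C would age about twice as fast as one at 25C under this rule, while mine spends most of the year slightly below the reference.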
So battery temperature (which for this LiFePO4 is basically the same as ambient, as the cells generate almost no self-heat in typical off-grid usage) is the most important factor in battery life, and cycling is completely irrelevant for this type of cell.
There is quite a bit of detail in here so that in a year's time I can repeat this test almost exactly, in order to remove as many variables as possible.