LiFePO4 wear factor - Charge current vs. load current?


Barry Timm

Dec 24, 2025, 2:07:55 PM
to electrodacus
My understanding from past posts is that high charge currents into an LFP cell are a significant wear factor due to high internal heat generation, especially in consumer-grade batteries without any cooling features.

I also thought I understood that load current is much less of a wear factor than the equivalent charge current. Let's say a 1C load current is less of a wear factor on a specific battery than the same 1C charge current into that same battery.

Do I understand this correctly, and if so, can anyone help me understand why there is a difference between the charge and load wear factors for the same current levels?

Many Thanks!

Barry.

Dacian Todea (electrodacus)

Dec 24, 2025, 2:56:49 PM
to electrodacus
Barry,

There is no issue with either charge or discharge current rate in relation to degradation.
Degradation depends on temperature, number of cycles, and calendar aging; charge/discharge rate has no significant direct impact.
Charging or discharging at 1C will have the same effect on temperature increase, as internal resistance is the same in both cases.
If we are talking about solar, then there is an ideal max charge and discharge rate, but those have nothing to do directly with limiting degradation.
The ideal max charge rate is 0.2C and max discharge 0.4C, and normally you will not exceed that in typical solar applications anyway.
 
The most popular cells by far are the EVE (or other brands) 280 to 320Ah cells. They have an internal impedance of around 0.25mOhm (measured at 1kHz), but typical inverters draw at 100 to 120Hz, some loads are pure DC, and the DC internal resistance is higher, around 0.4 to 0.5mOhm. Charging will be pure DC current, assuming PV panels are used.
0.2C for these cells is around 60A, so 0.5mOhm * 60A = 30mV of internal voltage drop.
That means 30mV * 60A = 1.8W of heat, and for a cell this size, even in the middle of a pack, that will not contribute to any significant temperature increase, at most 2 to 3°C above ambient, so cell degradation is not impacted.
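
A minimal Python sketch of that arithmetic (the 0.5mOhm DC resistance and the ~300Ah capacity are the rough figures from above, not datasheet values):

R_DC = 0.0005   # assumed DC internal resistance, ohms (~0.5 mOhm)
I = 60.0        # 0.2C for a ~300Ah cell, amps

v_drop = I * R_DC       # internal voltage drop: 0.03 V
p_heat = I**2 * R_DC    # dissipated heat, P = I^2 * R: 1.8 W
print(f"drop: {v_drop*1000:.0f} mV, heat: {p_heat:.1f} W")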

If you double this charge current, the amount of heat increases by a factor of 4, so 7.2W of internal heat, and that can have a significant impact on battery temperature. But 0.4C (120A for these cells) as a load is fine, since most of the time a load is ON for short periods, usually minutes, rather than the hours that charging takes.
Also, a continuous 0.4C discharge would mean the battery is fully discharged in 2.5 hours, and that is not how a typical solar system is used anyway.
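
The factor of 4 follows from P = I^2 * R; a quick check with the same assumed 0.5mOhm, 300Ah cell:

R_DC = 0.0005    # assumed DC internal resistance, ohms
CAP_AH = 300     # representative cell capacity, Ah

for c_rate in (0.2, 0.4):
    i = c_rate * CAP_AH
    print(f"{c_rate}C = {i:.0f}A -> {i**2 * R_DC:.1f} W internal heat")
print(f"continuous 0.4C runtime: {1/0.4:.1f} hours")
# 0.2C = 60A -> 1.8 W; 0.4C = 120A -> 7.2 W; runtime 2.5 hours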

So a few examples (see the sketch after this list):
a) small 12V system, 4 of these cells in series (~4kWh battery): ideal PV charging will be 60A * 12V = 720W, so around 800 - 1000W of solar PV, and a peak load of 1500W, so a 2000W inverter will be fine.
b) 24V 8s of these same cells (~8kWh): ideal PV charging around 1600W and load around 3200W.
c) 24V 8s2p (~16kWh): ideal PV charging 3200W and load as high as 6000W (fairly hard to even find 24V inverters larger than this; usually it will be 2x 3000W inverters in parallel or series).
For a larger battery, at 4p the load can be as high as 12kW, and that is already about the limit of what you can build on a 24V system.
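
A sizing sketch generalizing those examples. The 3.2V nominal cell voltage is my assumption (the post rounds to 12V/24V bus voltages), so the outputs differ slightly from the round numbers above:

CELL_AH = 300   # representative ~300Ah cell
CELL_V = 3.2    # assumed nominal LFP cell voltage

def pack_limits(series, parallel=1):
    v = series * CELL_V
    ah = CELL_AH * parallel
    # 0.2C ideal max charge, 0.4C ideal max load, per the rates above
    return v * ah / 1000, 0.2 * ah * v, 0.4 * ah * v

for label, s, p in (("12V 4s", 4, 1), ("24V 8s", 8, 1), ("24V 8s2p", 8, 2)):
    kwh, chg_w, load_w = pack_limits(s, p)
    print(f"{label}: ~{kwh:.1f} kWh, charge {chg_w:.0f} W, load {load_w:.0f} W")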

If you are thinking of a small, super portable (lightweight) system that needs to handle high charge/discharge rates, then you will likely look at cells with lower internal impedance; there are much more capable cells, but they will cost more.

There is also the issue of internal voltage drop. If 3.55V is considered fully charged, that will be true for these EVE cells at 0.2C, as there is only a 0.03V internal drop. But at 1C these same cells will have 300A * 0.5mOhm = 150mV of internal drop, so the BMS will think they are fully charged when they are not. To get accurate SOC and properly charge these cells, you will either need to increase the limit to 3.65V, which ensures they are in reality at 3.5V and thus fully charged, or use a more complex two-stage charge with CC then CV stages.
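
The same arithmetic as a sketch, assuming the 0.5mOhm figure and taking 3.5V as the true fully-charged cell voltage:

R_DC = 0.0005   # assumed DC internal resistance, ohms
TARGET_V = 3.5  # actual cell voltage considered fully charged

for c_rate in (0.2, 1.0):
    i = c_rate * 300          # current for a ~300Ah cell, amps
    drop = i * R_DC           # internal voltage drop at this rate
    print(f"{c_rate}C ({i:.0f}A): {drop*1000:.0f} mV drop -> "
          f"terminal limit ~{TARGET_V + drop:.2f} V")
# 0.2C: 30 mV drop -> ~3.53 V; 1C: 150 mV drop -> 3.65 V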
In any case, degradation comes not directly from the charge rate but from the internal heating of the cells.

Barry Timm

Dec 24, 2025, 3:33:14 PM
to electrodacus
Many thanks for the detailed reply and info. 
Yes, I was aware that internal heat soak is the key wear factor rather than current rate (directly), but the latter leads to the former (given that no specific cooling features typically exist in consumer-grade batteries).

I was not aware that the heat buildup was equivalent between charging and loads (for the same current). I had (incorrectly) understood from previous posts that charge current led to higher heat soak into the cells; now I understand that this is not the case, and that the real difference is the likely usage patterns of loads vs. charge currents.
OK... that helps clarify things for me, so thanks again!