OK
Adjustment of the Discharge Stop SOC setting hasn't worked quite the way I'd hoped, but it has provided a bit more insight into what's going on.
Yesterday's adjustment was simply to increase the Discharge Stop SOC from 10% to 12%.
The hope was that the SOC limit which triggers force charging of the battery was a fixed level, and that protecting an additional 2% of battery charge would extend the time it took for the charge to fall to the trigger point, i.e. provide greater headroom above the force charge point.
Unfortunately it appears that rather than being a fixed % value, it's a fixed % below whatever the DS SOC is set to.
So before the change the event sequence was:
Discharge stopped at 10%.
The battery would lose charge until the level reached 9%.
Force charge would kick in and trickle charge at 300W.
When the charge reached 13%, force charge would stop.
With the DS SOC set to 12% the sequence is simply shifted up 2%: force charge triggers at 11% and charging stops at 15%.
The cycle repeats as soon as the charge level returns to 11%.
The net effect is that the grid charge events continue to take place at roughly the same frequency (the apparent logic is sketched below).
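For anyone who wants to play with the numbers, here's a minimal Python sketch of what the firmware appears to be doing. To be clear, it's inferred from my logs, not documented behaviour: the trigger/stop offsets (DS SOC -1% / +3%) are observations, and the 1%-per-step drift is a simplification that lumps the discharge-to-load and the self-discharge together.

```python
DS_SOC = 12            # Discharge Stop SOC setting (%)
TRIGGER = DS_SOC - 1   # force charge appears to kick in here (%)
STOP = DS_SOC + 3      # ...and to stop again here (%)

def force_charge_active(soc, was_charging):
    """Hysteresis as observed: start at TRIGGER, run until STOP."""
    if was_charging:
        return soc < STOP
    return soc <= TRIGGER

# Trace the cycle: SOC drifts down 1% per step until the trigger,
# then trickle-charges 1% per step up to the stop level, and repeats.
soc, charging = DS_SOC, False
for _ in range(12):
    charging = force_charge_active(soc, charging)
    print(f"SOC {soc:2d}%  force charge {'ON' if charging else 'off'}")
    soc += 1 if charging else -1
```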
It does raise a question, however, for any other Li-ion battery users out there.
Is the loss of charge I'm seeing, ~1% over 4-6hrs at around the 10% charge level, normal or greater than would be expected?
The logged data shows no drain on the battery, so apparently it's just charge loss.
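For scale, the back-of-envelope sum, assuming a 9.5kWh usable pack (the capacity is an assumption for the example; substitute your own):

```python
capacity_kwh = 9.5   # ASSUMED usable pack capacity - swap in your own
soc_loss_pct = 1.0   # observed SOC drop
hours = 5.0          # midpoint of the 4-6hr window

energy_wh = capacity_kwh * 1000 * soc_loss_pct / 100
print(f"{energy_wh:.0f} Wh lost, ~{energy_wh / hours:.0f} W equivalent drain")
# -> 95 Wh lost, ~19 W equivalent drain
```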
The other thing that comes out of the exercise is that if, as it appears, this is quite rightly a BMS sequence to protect the battery from being fully discharged, the implementation is not optimised for running in load-first priority.
As noted, the battery gets charged to a level 3% above the Discharge Stop SOC (13% in the original case, 15% now).
As soon as the force charge stops, the inverter sees the battery is above the DS SOC and immediately discharges the excess back to the load.
The battery is quickly returned to the DS SOC level, which is again only 1% above the force charge trigger.
That's not a very efficient 3% contribution to load; Watts in to Watts out makes it (relatively) pricey.
It would be much better to keep the extra 3% as a buffer above the force charge trigger than to convert it back into power for the load (needs a better algorithm).
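The sort of change I have in mind would be tiny. Something like the sketch below, though to be clear this is a hypothetical tweak of mine, not a parameter the inverter actually exposes:

```python
def discharge_floor(ds_soc, charge_stop, last_charge_was_forced):
    """Hypothetical tweak, not a real inverter setting.

    If the last charge event was a grid force charge, hold the discharge
    floor at the level just charged to, so the top-up sits as buffer above
    the trigger (consumed only by self-discharge) rather than being
    converted straight back into load power.
    """
    return charge_stop if last_charge_was_forced else ds_soc
```

With the observed numbers that would leave 4% of headroom above the trigger after each force charge instead of 1%, so at ~1% loss per 4-6hrs the grid charge events ought to be roughly a quarter as frequent.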
I appreciate that as longer, sunnier days come along this will probably be insignificant in the grand scheme of things, but it's still a bit annoying.
Could be a bit more efficient and make winter days a tad more palatable.
I've looked in the inverter's advanced settings and there doesn't appear to be any other parameter which would impact the BMS routine.