Why is 80% charging recommended?

My Nissan Leaf Forum

cgaydos said:
New question: is there any research on whether battery life is affected by using level 1 vs level 2 charging?
L1 and L2 are both slow charges with respect to their effect on battery life, so no difference.
 
cgaydos said:
... Suppose you have a trip scheduled that will use about 75% of the battery. Your options are to charge to 80% and then discharge down to 5% or charge to 100% and discharge down to 25%. For simplicity let's assume there is no option to recharge during this trip. ...
I'd have a very non-technical answer to that one. I'm going to charge to 100% so that I have the freedom to take a side trip or run the heater if I so desire. What good is having a car you have to baby so much that you can't enjoy life? For the record, I DO charge to 80% most of the time, but that gets me home with 3 or 4 bars. If I started getting home with less than 2 full bars, I'd switch to 100% charging.
 
drees said:
Stoaty said:
I would say that avoiding going below LBW is a reasonable thing to strive for, but the main thing after going fairly low on charge is to wait an hour for the pack to cool down, then charge up to the sweet spot (30-40% SOC).
I'm not sure I agree with that (at least if you don't have to go anywhere). From what my research has shown, storage of lithium batteries at low SOC (even nearly completely empty - some lithium chemistries can even be stored empty) is not detrimental, but discharging the pack at high rates when the SOC is low is harder on the pack.

To maximize life, try to avoid high power draw as the battery gets very low. But don't necessarily worry about charging soon unless you need to go off on another drive before your next scheduled charge, or perhaps if you did hit turtle and don't expect to be charging for at least a couple days.

GregH
I've wondered about this... We know the higher cell voltages cause reduced capacity over long periods of time, but to what end? Is it worse to spend 4 hours at full (4.1V) or 12 hours at 80% (4.05V)? Wouldn't 12 hours at 40% (3.75V) be A LOT better than either 4.1V or 4.05V? But how low can (should) you go? Is it really ok to let the car sit below VLBW (<3.6V)? VLBW at 5 or 6 Gids could be as little as 3.0V! Is that ok? VLBW at 24 Gids could still be 3.6V which doesn't sound that bad..

I'm talking about resting voltages.. discharge at low voltage is another issue. It seems logical that high discharges at low SOCs (when impedance is high) could be a bad thing, yet the car does nothing until the very end to discourage high discharge! On the top end they go overboard limiting regen so as never to exceed 4.1V but on the low end you could be at 3.2V and still have full power! Does this mean that high discharges at low SOCs aren't a bad thing? (other than obviously heating the pack due to impedance)..?

I tend to drive more and more mellow as the SOC gets lower.. I like to keep the pack voltage over 345V (3.6V) but if I do go VLBW I try to drive really soft even though the car is willing to give me full power.. I've let the car rest at 340-350V before charging thinking that the more time at lower SOC, the better... Is this incorrect?
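As an aside, Greg's pack-to-cell conversions check out if you assume 96 cell-pairs in series, which is how the 2011 LEAF pack is arranged. A quick sketch (the series cell count is my assumption, not something stated in this thread):

```python
# Pack-to-cell voltage conversion for the LEAF.
# Assumes 96 cell-pairs in series (2011 LEAF layout) - an assumption,
# not a figure taken from this discussion.
CELLS_IN_SERIES = 96

def cell_voltage(pack_volts: float) -> float:
    """Average per-cell voltage for a given pack voltage."""
    return pack_volts / CELLS_IN_SERIES

for pack in (345.0, 340.0):
    print(f"{pack:.0f} V pack = {cell_voltage(pack):.2f} V per cell")
```

345 V works out to roughly 3.59 V per cell, matching the 3.6 V figure quoted above.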
 
GregH said:
I've wondered about this... We know the higher cell voltages cause reduced capacity over long periods of time, but to what end? Is it worse to spend 4 hours at full (4.1V) or 12 hours at 80% (4.05V)? Wouldn't 12 hours at 40% (3.75V) be A LOT better than either 4.1V or 4.05V? But how low can (should) you go? Is it really ok to let the car sit below VLBW (<3.6V)? VLBW at 5 or 6 Gids could be as little as 3.0V! Is that ok? VLBW at 24 Gids could still be 3.6V which doesn't sound that bad..

I'm talking about resting voltages.. discharge at low voltage is another issue. It seems logical that high discharges at low SOCs (when impedance is high) could be a bad thing, yet the car does nothing until the very end to discourage high discharge! On the top end they go overboard limiting regen so as never to exceed 4.1V but on the low end you could be at 3.2V and still have full power! Does this mean that high discharges at low SOCs aren't a bad thing? (other than obviously heating the pack due to impedance)..?

I tend to drive more and more mellow as the SOC gets lower.. I like to keep the pack voltage over 345V (3.6V) but if I do go VLBW I try to drive really soft even though the car is willing to give me full power.. I've let the car rest at 340-350V before charging thinking that the more time at lower SOC, the better... Is this incorrect?

Well, the best thing is to keep it in a walk-in cooler at about 40-60% SOC. But at some point there are diminishing returns. Personally, if it's between LBW and 80% I'm not gonna worry, and if I need/want 100% I'll go there, though usually timed to within a few hours of departure. Normal 80% is set to an end-timer as well. That's about as far as I am willing to babysit the battery, and I think it gets the low-hanging fruit. I might be persuaded to put some active ventilation in the garage to get evening and nighttime temps lower sooner in the summer heat.
 
GregH said:
It seems logical that high discharges at low SOCs (when impedance is high) could be a bad thing, yet the car does nothing until the very end to discourage high discharge!
Oh yes it does! You just have to remember that the driver is an important component of the system. That flashing GOM is a very effective motivator to conserve battery power so, I suspect like most drivers, I drive more gently below LBW and much more gently below VLBW.

Ray
 
surfingslovak
Good point, Ray. I believe Greg said in one of the prior conversations that he would like to see drivetrain power being limited much earlier than it currently is in the LEAF.

I think that from an engineering perspective, this could make a lot of sense. For example, when the low battery warning chimes, motor power could be limited by 20% or 30%. There should still be plenty of power to drive the vehicle safely, but it wouldn't be possible to do jackrabbit starts anymore. That should be on the short list of things to avoid at low SOC anyway.

The on-board computer could help guide the driver a bit here. For example, the power rings could retract and decrease gradually from VLB all the way to turtle mode. I'm not sure whether this is something Nissan has considered, or why it might have been rejected if they did think about it.
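A sketch of what that gradual limiting could look like. The SOC thresholds and power fractions below are illustrative assumptions on my part, not Nissan's actual values:

```python
def power_limit(soc_pct: float) -> float:
    """Fraction of full motor power allowed at a given state of charge.

    Illustrative only: all thresholds and limits here are assumed,
    not the LEAF's real calibration.
    """
    LBW, VLB, TURTLE = 17.0, 8.0, 2.0  # assumed warning thresholds, % SOC
    if soc_pct >= LBW:
        return 1.0                     # full power above the low-battery warning
    if soc_pct >= VLB:
        return 0.75                    # cut power ~25% once LBW chimes
    if soc_pct > TURTLE:
        # taper linearly from 75% at VLB down to 20% at turtle
        return 0.20 + 0.55 * (soc_pct - TURTLE) / (VLB - TURTLE)
    return 0.20                        # turtle mode: crawl power only

for soc in (50, 12, 5, 1):
    print(f"{soc:3d}% SOC -> {power_limit(soc):.0%} of full power")
```

The linear taper gives the driver steadily shrinking power rings rather than one hard cutoff at turtle.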

That said, there is a similar discussion in this thread, and in several others. In terms of pack voltage, I believe that manganese dissolution into the electrolyte is the main factor driving battery degradation at high SOC. The graph depicted below is from a research paper on LMO cells. It illustrates nicely why storage at low SOC can be beneficial.


[Graph: Mn dissolution into the electrolyte vs. DOD]
 
RegGuheert
Thanks for sharing that again, surfingslovak! That plot sure makes me think that storage at 80% is much worse than storage at 100%. This agrees with some other research data we have seen which indicated more degradation at 80% SOC than at 100%. It does make me wonder why Nissan chose that particular SOC as a charging option.

That said, there is no data on that chart between 80% and 100%. Considering that in a LEAF 100% SOC is really only 94%, perhaps the truth is that 94% results in more manganese dissolution than 80%. There is not enough data in that chart to tell.

Finally, I will point out that the graph is not zero-scaled. The dissolution at 80% is twice what it is at 40%. But that doubling is for a given amount of time, so after twice that amount of time at that temperature you might expect 4X the dissolution, etc.
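The rate-times-time arithmetic above, as a sketch. The relative rates are just read-off-the-chart assumptions, and it presumes dissolution accumulates linearly in time, which the chart alone doesn't establish:

```python
def total_dissolution(relative_rate: float, hours: float) -> float:
    """Total dissolved Mn, assuming a constant dissolution rate over time."""
    return relative_rate * hours

RATE_40, RATE_80 = 1.0, 2.0  # assumed relative rates at 40% and 80% DOD (2:1)
REF_HOURS = 100.0            # arbitrary reference interval

baseline = total_dissolution(RATE_40, REF_HOURS)   # 40% DOD, reference time
worst = total_dissolution(RATE_80, 2 * REF_HOURS)  # double rate, double time
print(worst / baseline)                            # -> 4.0, as noted above
```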

Personally, my preference would be for zero dissolution of manganese into the electrolyte. Do I get a vote?
 
RegGuheert said:
Thanks for sharing that again, surfingslovak! That plot sure makes me think that storage at 80% is much worse than storage at 100%. This agrees with some other research data we have seen which indicated more degradation at 80% SOC than at 100%. It does make me wonder why Nissan chose that particular SOC as a charging option.
There is no way there is more degradation at 80% SOC than 100% SOC for the Leaf battery pack! Nissan picked 80% for "long life mode". They limited the maximum SOC to 95% because 100% is even worse for the battery. Remember it is time and temperature at high SOC that is bad for the battery.
 
RegGuheert said:
Personally, my preference would be for zero dissolution of manganese into the electrolyte. Do I get a vote?
Absolutely! I'm not an electrochemist, although I know at least one who reads here frequently. Perhaps he will chime in one of these days. Seeing degradation for what it is, you would expect that there are efforts to mitigate such undesirable changes in battery composition, which in turn should help extend battery life. The addition of sulfur in the NEC paper ydnas7 originally referenced here on the forum would be a perfect example. Good on you for reading the graph carefully. It's a bit surprising to see this type of data presentation in a scientific paper. I suppose they count on engaging an attentive audience that will catch such details?

 
Stoaty said:
RegGuheert said:
Thanks for sharing that again, surfingslovak! That plot sure makes me think that storage at 80% is much worse than storage at 100%. This agrees with some other research data we have seen which indicated more degradation at 80% SOC than at 100%. It does make me wonder why Nissan chose that particular SOC as a charging option.
There is no way there is more degradation at 80% SOC than 100% SOC for the Leaf battery pack! Nissan picked 80% for "long life mode". They limited the maximum SOC to 95% because 100% is even worse for the battery. Remember it is time and temperature at high SOC that is bad for the battery.
Sorry, I missed this. Yes, the graph looks a bit suspicious above 80%. I don't have an explanation for that, and would expect more degradation close to 100% SOC as well. Let's see if I can find the source.
 
Stoaty said:
There is no way there is more degradation at 80% SOC than 100% SOC for the Leaf battery pack! Nissan picked 80% for "long life mode". They limited the maximum SOC to 95% because 100% is even worse for the battery. Remember it is time and temperature at high SOC that is bad for the battery.
I agree - one has to keep in mind that the chart is the result of a test at 60°C - who knows how it applies to the cells we have in our LEAF specifically. For sure, Nissan wouldn't go through the trouble of creating a "long-life mode" unless it actually produced an improvement in calendar life.

GregH said:
I've wondered about this... We know the higher cell voltages cause reduced capacity over long periods of time, but to what end? Is it worse to spend 4 hours at full (4.1V) or 12 hours at 80% (4.05V)? Wouldn't 12 hours at 40% (3.75V) be A LOT better than either 4.1V or 4.05V? But how low can (should) you go? Is it really ok to let the car sit below VLBW (<3.6V)? VLBW at 5 or 6 Gids could be as little as 3.0V! Is that ok? VLBW at 24 Gids could still be 3.6V which doesn't sound that bad..
I'm sure that Nissan/NEC has the data to tell us, but unless someone else ponies up for a bunch of cells to do their own testing, we can only make educated guesses. For sure, it seems that, in general, around 40% SOC is a conservative/safe storage level - there don't seem to be significant benefits to storage at lower SOC. I know that there are a number of lithium chemistries which have no problem with storage at very low SOC (and calendar life is better), but I don't know for sure whether that applies to the batteries in the LEAF. I have to think that Nissan protects us from putting the pack into any state where significant degradation may happen, given that it's quite possible that one might turtle a car and not be able to charge it for some time.

GregH said:
I'm talking about resting voltages.. discharge at low voltage is another issue. It seems logical that high discharges at low SOCs (when impedance is high) could be a bad thing, yet the car does nothing until the very end to discourage high discharge! On the top end they go overboard limiting regen so as never to exceed 4.1V but on the low end you could be at 3.2V and still have full power! Does this mean that high discharges at low SOCs aren't a bad thing? (other than obviously heating the pack due to impedance)..?
A cell has a different impedance for charging and discharging. Think of a battery as a sponge - at high SOC, the impedance to discharge is low - those electrons are ready to jump over to the cathode (hence the high voltage). At low SOC, the anode has lots of room for electrons so it's easy to charge it up.

GregH said:
I tend to drive more and more mellow as the SOC gets lower.. I like to keep the pack voltage over 345V (3.6V) but if I do go VLBW I try to drive really soft even though the car is willing to give me full power.. I've let the car rest at 340-350V before charging thinking that the more time at lower SOC, the better... Is this incorrect?
I personally would not hesitate to let the car sit until the next timer cycle before charging again at VLBW, but I might be tempted to charge the battery to VLBW or LBW if I turtled it and wasn't planning on charging it for more than a day.
 
surfingslovak said:
RegGuheert said:
Personally, my preference would be for zero dissolution of manganese into the electrolyte. Do I get a vote?
Absolutely! I'm not an electrochemist, although I know at least one who reads here frequently. Perhaps he will chime in one of these days. Seeing degradation for what it is, you would expect that there are efforts to mitigate such undesirable changes in battery composition, which in turn should help extend battery life. The addition of sulfur in the NEC paper ydnas7 originally referenced here on the forum would be a perfect example.
There are other approaches as well. Here is a paper from 2002 which eliminates the manganese dissolution by replacing the electrolyte salt with lithium bis(oxalato)borate. Since we do not see such batteries in use, clearly there must be other issues with doing that, one of which is increased cell resistance.

Another effort to do this comes from a company I have posted about previously called Leyden Energy. They replace the electrolyte with an imide salt. This salt is apparently aggressive toward the aluminum current collector, which in turn is replaced by a carbon-based one. The result is a cell with very good high-temperature characteristics, but apparently this one also suffers from high cell resistance, although they do not say. They are also working on a silicon-based anode to significantly increase energy density. But I see their website no longer mentions small EVs as an application, so I suspect the resistance must be pretty high. (They used to give battery specs, and power was notably absent. Maybe cost is also an issue.)

The point is that this problem is entirely solvable and has been for some time, but it seems the industry feels the medicine is worse than the disease.
 
surfingslovak said:
In terms of pack voltage, I believe that manganese dissolution into the electrolyte is the main factor driving battery degradation at high SOC. The graph depicted below is from a research paper on LMO cells. It illustrates nicely why storage at low SOC can be beneficial.


[Graph: Mn dissolution into the electrolyte vs. DOD]


Note, the horizontal axis is NOT SOC. It is DOD (depth of discharge).
 
surfingslovak
My apologies, I should have posted the source earlier. It's an abstract from the Journal of the Electrochemical Society. I thought I had the full paper as well, but no such luck. The DOD vs SOC question is valid, and I believe that it's something that was lost in translation.

A quick search yielded another paper co-sponsored by the Lawrence Livermore lab. I quote:

The in situ monitoring of Mn dissolution from the spinel LiMn2O4 electrode was carried out under various conditions. The ring currents exhibited maxima corresponding to the EOC and EOD, with the largest peak at the EOC. The results show that dissolution of Mn from spinel LiMn2O4 occurs during charge/discharge cycling, especially in a charged state at 4.1 V and in a discharged state at 3.1 V. The larger peak at EOC demonstrated that Mn dissolution took place mainly at the top of charge. At elevated temperatures, the ring cathodic currents were also larger due to the increase of the Mn dissolution rate.
 
Nubo said:
Note, the horizontal axis is NOT SOC. It is DOD (depth of discharge).
Wow! I completely misread that! Thanks!

Based on that, I don't see a huge difference between 80% and 100%, but I'm pretty sure I don't want to store the LEAF down around LBW.
 