Stoaty
cgaydos said:
New question: is there any research on whether battery life is affected by using level 1 vs level 2 charging?

L1 and L2 are both slow charges with respect to their effect on battery life, so no difference.
cgaydos said:
... Suppose you have a trip scheduled that will use about 75% of the battery. Your options are to charge to 80% and then discharge down to 5%, or charge to 100% and discharge down to 25%. For simplicity let's assume there is no option to recharge during this trip. ...

I'd have a very non-technical answer to that one. I'm going to charge to 100% so that I have the freedom to take a side trip or run the heater if I so desire. What good is having a car you have to baby so much that you can't enjoy life? For the record, I DO charge to 80% most of the time, but that gets me home with 3 or 4 bars. If I started getting home with less than 2 full bars, I'd switch to 100% charging.
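One way to see the trade-off in cgaydos's scenario is to compare the average state of charge the pack sits at during each option. The sketch below is my own simplification, not from the thread, and assumes a linear discharge over the trip:

```python
# Compare the two charging strategies for a trip that uses 75% of the
# battery. The "average SOC" metric is an illustrative assumption:
# lower average SOC generally means less time spent at the high cell
# voltages associated with faster calendar degradation.

def avg_soc(start_pct, end_pct):
    """Mean state of charge (%) over a linear discharge from start to end."""
    return (start_pct + end_pct) / 2

option_a = avg_soc(80, 5)    # charge to 80%, drive down to 5%
option_b = avg_soc(100, 25)  # charge to 100%, drive down to 25%

print(f"80% -> 5%:   average SOC {option_a:.1f}%")   # 42.5%
print(f"100% -> 25%: average SOC {option_b:.1f}%")   # 62.5%
```

By this crude measure the 80%-to-5% option keeps the pack at a lower average SOC, though as the later posts note, deep discharge under high power draw has costs of its own.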
drees said:
Stoaty said:
I would say that avoiding going below LBW is a reasonable thing to strive for, but the main thing after going fairly low on charge is to wait an hour for the pack to cool down, then charge up to the sweet spot (30-40% SOC).

I'm not sure I agree with that (at least if you don't have to go anywhere). From what my research has shown, storage of lithium batteries at low SOC (even nearly completely empty - some lithium chemistries can even be stored empty) is not detrimental, but discharging the pack at high rates when the SOC is low is harder on the pack.
To maximize life, try to avoid high power draw as the battery gets very low. But don't worry about charging again right away unless you need to head out on another drive before your next scheduled charge, or perhaps if you hit turtle and don't expect to be charging for at least a couple of days.
GregH said:
I've wondered about this... We know the higher cell voltages cause reduced capacity over long periods of time, but by how much? Is it worse to spend 4 hours at full (4.1V) or 12 hours at 80% (4.05V)? Wouldn't 12 hours at 40% (3.75V) be A LOT better than either 4.1V or 4.05V? But how low can (should) you go? Is it really ok to let the car sit below VLBW (<3.6V)? VLBW at 5 or 6 Gids could be as little as 3.0V! Is that ok? VLBW at 24 Gids could still be 3.6V, which doesn't sound that bad.
I'm talking about resting voltages.. discharge at low voltage is another issue. It seems logical that high discharges at low SOCs (when impedance is high) could be a bad thing, yet the car does nothing until the very end to discourage high discharge! On the top end they go overboard limiting regen so as never to exceed 4.1V but on the low end you could be at 3.2V and still have full power! Does this mean that high discharges at low SOCs aren't a bad thing? (other than obviously heating the pack due to impedance)..?
I tend to drive more and more mellow as the SOC gets lower.. I like to keep the pack voltage over 345V (3.6V) but if I do go VLBW I try to drive really soft even though the car is willing to give me full power.. I've let the car rest at 340-350V before charging thinking that the more time at lower SOC, the better... Is this incorrect?
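GregH's pack-voltage figures can be converted to the per-cell voltages he quotes by dividing by the number of series cell-pairs. The sketch below assumes the commonly cited LEAF layout of 96 cell-pairs in series, which matches his 345 V ≈ 3.6 V/cell arithmetic:

```python
# Convert LEAF pack voltage to an approximate average per-cell voltage.
# Assumption: 96 cell-pairs in series (the commonly cited LEAF pack
# layout), so per-cell voltage is simply pack voltage / 96.

CELLS_IN_SERIES = 96

def cell_voltage(pack_volts):
    """Average cell voltage for a given total pack voltage."""
    return pack_volts / CELLS_IN_SERIES

for pack in (394, 345, 340):
    print(f"{pack} V pack -> {cell_voltage(pack):.2f} V per cell")
```

This reproduces the figures in the posts above: roughly 4.1 V/cell near full, and about 3.6 V/cell at the 345 V threshold GregH tries to stay above.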
GregH said:
It seems logical that high discharges at low SOCs (when impedance is high) could be a bad thing, yet the car does nothing until the very end to discourage high discharge!

Oh yes it does! You just have to remember that the driver is an important component of the system. That flashing GOM is a very effective motivator to conserve battery power, so, I suspect like most drivers, I drive more gently below LBW and much more gently below VLBW.
RegGuheert said:
Thanks for sharing that again, surfingslovak! That plot sure makes me think that storage at 80% is much worse than storage at 100%. This agrees with some other research data we have seen which indicated more degradation at 80% SOC than at 100%. It does make me wonder why Nissan chose that particular SOC as a charging option.

There is no way there is more degradation at 80% SOC than 100% SOC for the Leaf battery pack! Nissan picked 80% for "long life mode". They limited the maximum SOC to 95% because 100% is even worse for the battery. Remember, it is time and temperature at high SOC that is bad for the battery.
RegGuheert said:
Personally, my preference would be for zero dissolution of manganese into the electrolyte. Do I get a vote?

Absolutely! I'm not an electrochemist, although I know at least one who is reading here frequently. Perhaps he will chime in one of these days. Seeing degradation for what it is, you would expect that there are efforts to mitigate such undesirable changes in battery composition, which in turn should help extend battery life. The addition of sulfur in the NEC paper ydnas7 originally referenced here on the forum would be a perfect example. Good on you for reading the graph carefully. It's a bit surprising to see this type of data presentation in a scientific paper. I suppose they count on an attentive audience that will catch such details.
These newly developed batteries resolve durability issues by utilizing original organic sulfur compound additive agents to suppress solvent deterioration and to form a stable protective film on the electrode during charge/discharge cycles. According to baseline assessments (*3) of the newly developed electrolyte solution, the increase in resistance was reduced by more than half, lifespan increased by 1.5 to 3 times and capacity reduction from repeated recharging was significantly restrained.
Stoaty said:
There is no way there is more degradation at 80% SOC than 100% SOC for the Leaf battery pack! Nissan picked 80% for "long life mode". They limited the maximum SOC to 95% because 100% is even worse for the battery. Remember it is time and temperature at high SOC that is bad for the battery.

Sorry, I missed this. Yes, the graph looks a bit suspicious above 80%. I don't have an explanation for that, and would expect more degradation close to 100% SOC as well. Let's see if I can find the source.
Stoaty said:
There is no way there is more degradation at 80% SOC than 100% SOC for the Leaf battery pack! Nissan picked 80% for "long life mode". They limited the maximum SOC to 95% because 100% is even worse for the battery.

I agree - one has to keep in mind that that chart is the result of a test at 60°C - who knows how it applies to the cells we have in our LEAF specifically. For sure, Nissan wouldn't have gone through the trouble of creating a "long-life mode" unless it actually produced an improvement in calendar life.
GregH said:
I've wondered about this... We know the higher cell voltages cause reduced capacity over long periods of time... But how low can (should) you go? Is it really ok to let the car sit below VLBW (<3.6V)?

I'm sure that Nissan/NEC has the data to tell us, but unless someone else ponies up for a bunch of cells to do their own testing, we can only make educated guesses. For sure, it seems that in general storage around 40% SOC is a conservative/safe storage SOC - there doesn't seem to be significant benefit to storage at lower SOC. I know that there are a number of lithium chemistries which have no problem with storage at very low SOC (and calendar life is better), but I don't know for sure if that applies to the batteries in the LEAF. I have to think that Nissan protects us from putting the pack into any state where significant degradation may happen, given that it's quite possible that one might turtle a car and not be able to charge it for some time.
GregH said:
It seems logical that high discharges at low SOCs (when impedance is high) could be a bad thing, yet the car does nothing until the very end to discourage high discharge!

A cell has a different impedance for charging and discharging. Think of a battery as a sponge - at high SOC, the impedance to discharge is low - those electrons are ready to jump over to the cathode (hence the high voltage). At low SOC, the anode has lots of room for electrons, so it's easy to charge it up.
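The impedance point can be sketched with a toy V = OCV − I·R model: the same current draw causes more voltage sag and more resistive heating when internal resistance is higher at low SOC. The OCV and resistance numbers below are made-up illustrative values, not measured LEAF data:

```python
# Toy model of why high current draw at low SOC is harder on a pack:
# terminal voltage sags as V = OCV - I*R, and internal resistance R
# tends to rise at low SOC. Heat dissipated in the cell is I^2 * R.
# All numbers below are illustrative assumptions, not LEAF measurements.

def terminal_voltage(ocv, resistance_mohm, current_a):
    """Cell terminal voltage under load, with resistance in milliohms."""
    return ocv - current_a * resistance_mohm / 1000.0

# (open-circuit volts, internal resistance in milliohms) per cell
cases = [("high SOC", 4.05, 1.0), ("low SOC", 3.45, 3.0)]

for label, ocv, r_mohm in cases:
    v = terminal_voltage(ocv, r_mohm, 100)        # 100 A draw
    heat_w = 100**2 * r_mohm / 1000.0             # I^2 * R per cell
    print(f"{label}: sag {ocv - v:.2f} V, heat {heat_w:.0f} W per cell")
```

With these assumed numbers the same 100 A draw sags the cell three times as much at low SOC and triples the resistive heating, which matches the intuition in the posts above about easing off the accelerator below LBW.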
GregH said:
I tend to drive more and more mellow as the SOC gets lower... I've let the car rest at 340-350V before charging thinking that the more time at lower SOC, the better... Is this incorrect?

I personally would not hesitate to let the car sit until the next timer cycle before charging again at VLBW, but I might be tempted to charge the battery to VLBW or LBW if I turtled it and wasn't planning on charging it for more than a day.
surfingslovak said:
Seeing degradation for what it is, you would expect that there are efforts to mitigate such undesirable changes in battery composition... The addition of sulfur in the NEC paper ydnas7 originally referenced here on the forum would be a perfect example.

There are other approaches as well. Here is a paper from 2002 which eliminates the manganese dissolution by replacing the electrolyte salt with lithium bis(oxalato)borate (LiBOB). Since we do not see such batteries in use, clearly there must be other issues with doing that, one of which is increased cell resistance.
surfingslovak said:
In terms of pack voltage, I believe that manganese dissolution into the electrolyte is the main factor driving battery degradation at high SOC. The graph depicted below is from a research paper on LMO cells. It illustrates nicely why storage at low SOC can be beneficial.
The in situ monitoring of Mn dissolution from the spinel LiMn2O4 electrode was carried out under various conditions. The ring currents exhibited maxima corresponding to the end of charge (EOC) and end of discharge (EOD), with the largest peak at the EOC. The results show that dissolution of Mn from spinel LiMn2O4 occurs during charge/discharge cycling, especially in a charged state at 4.1 V and in a discharged state at 3.1 V. The larger peak at EOC demonstrated that Mn dissolution took place mainly at the top of charge. At elevated temperatures, the ring cathodic currents were also larger due to the increase of the Mn dissolution rate.
Nubo said:
Note, the horizontal axis is NOT SOC. It is DOD (depth of discharge).

Wow! I completely misread that! Thanks!