KtG
Posts: 5
Joined: Sun Sep 09, 2012 5:09 pm

Re: LEAF voltage measurement accuracy impact on capacity

Thu Sep 13, 2012 7:46 pm

RegGuheert wrote:
KtG wrote:The Battery controller has more varied data about voltage than it has about temperature. One can see the voltage of each module individually with Consult. This is good. We techs use this info not just for diagnosis, but for balancing a new module prior to installation. The voltage is displayed in mV.

There are 4 (if I remember right) temp probes: 2 on the rear bank, one for each side bank of modules. I'm pretty sure the battery controller can display them individually on Consult, though I've never looked.

I felt more temp sensors were needed the first time I saw the battery. But I'm a tech, not an engineer. I wish engineers asked for my input.
Thanks, again! That's good information!

Yes, I've noticed from reading the service manual that the cell voltages are displayed in millivolts. That much resolution implies an accurate measurement was made.

Hopefully Nissan made the system truly accurate rather than just deciding that high resolution would be sufficient to keep the pack balanced, which it certainly is.
Not that I'm disagreeing at all with your theory, but I've every confidence in the design of the voltage monitoring. I can even ID the physical locations of each module based on the info Consult gives.

The ability to control voltage in/out of each module is limited, though. Resolution there is by stack only, as each stack has its own bus ribbon that connects to the main relays. Short of putting a microcontroller in each module (hello, costs!), there isn't much of a way to get past that.

Still, with limited temp sensing it's all a moot point. Even if the BMS saw a temp spike in one module and a corresponding voltage change, there isn't much it can do besides set a DTC.

I'd say it's obvious they (the engineers, but more likely a bean counter) made a boo-boo. But sometimes one makes a SWAG, because you can only add so many sensors before you muddy the data. I'd love AFR sensors on every cylinder of my gas engines (and aftermarket systems exist for that), but the cost vs gain is nutty. Same thing for having intake air temp data in the port right by the valve rather than 3' up the air stream. Only development motors get that level of sensing, and only then when absolutely needed.

I'd like to think that Nissan had temp probes all over this battery in testing, and in production placed the 4 where they best capture the picture painted by the 100 sensors. That's how I would have done it. But again, not an engineer. I'll try to make time to get the Consult on our Leaf Shuttle and see what I can see for individual temps. I'm willing to bet the rear stack is hottest, and is where the problem lies.

Ingineer
Posts: 2741
Joined: Fri Oct 15, 2010 1:09 pm
Delivery Date: 13 Jul 2011
Leaf Number: 6969
Location: Berkeley, California
Contact: Website

Re: LEAF voltage measurement accuracy impact on capacity

Tue Sep 18, 2012 10:46 pm

I've not monitored my Leaf that often, but initially and as recently as yesterday (9/17/12) I did check the accuracy. According to my calibrated 100k count Fluke 45, it's still well within a half volt of the voltage reported by the Battery ECU/LBC (Lithium Battery Controller).

There is cross-checking going on in the LBC (module sums vs total pack), so I would imagine that if something went awry you'd get a code P30F4. It's highly unlikely that 2 different measurement systems could drift the same way without being caught in self-diagnosis.
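
To make that concrete, here's a minimal sketch of that kind of plausibility check, assuming the LBC simply compares the sum of the 96 cell-pair readings against the independently measured total pack voltage. The 0.5% tolerance and all of the names below are illustrative guesses, not Nissan's actual code:

    # Illustrative only: compare the summed cell-pair readings against the
    # separately measured total pack voltage and flag a fault on disagreement.
    CELL_PAIRS = 96  # 2011 LEAF pack: 96 cell pairs in series

    def pack_voltage_plausible(cell_pair_volts, total_pack_volts, tolerance=0.005):
        """True if the sum of cell-pair voltages agrees with the independently
        measured pack voltage to within the given fractional tolerance."""
        assert len(cell_pair_volts) == CELL_PAIRS
        return abs(sum(cell_pair_volts) - total_pack_volts) <= tolerance * total_pack_volts

    readings = [4.08] * CELL_PAIRS                  # ~391.7 V summed
    print(pack_voltage_plausible(readings, 391.5))  # True: the two systems agree
    print(pack_voltage_plausible(readings, 385.0))  # False: would warrant a DTC such as P30F4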

There's nothing to say the chemistry in the pack hasn't somehow been altered in a way that shifts the knee voltages, though.

-Phil
Easily Learn Electricity HERE! - - - - Website: http://evseupgrade.com/ - - - - Like us on Facebook: EVSE Upgrade

surfingslovak
Vendor
Posts: 3809
Joined: Mon Jun 13, 2011 1:35 pm

Re: LEAF voltage measurement accuracy impact on capacity

Tue Sep 18, 2012 11:03 pm

For what it's worth, I believe that we saw higher voltages after turtle mode in degraded Leafs than I would have expected. It was around 320V from what I have seen. My Leaf would typically hit turtle around 308V, but I did not let the car rest and measure later. Has anyone done that with a relatively new Leaf?

The three-bar-loser Leaf I drove in Phoenix (courtesy of wiltingleaf) behaved rather strangely after the low battery warning. It's tough to say if it was because the owner never runs a deep cycle and the software starts misjudging remaining battery capacity or if the discharge curve changes significantly as the battery degrades. I'm under the impression that Leaf owners in Phoenix get squeezed in whatever range they have left above the low battery warning, which many don't want to hit. Driving with flashing numbers or dashes, and without a Gid meter can be anxiety inducing. It appears that the range below the low battery warning expands with time, and the range above the warning diminishes rather quickly.

mdh
Posts: 122
Joined: Tue Aug 02, 2011 3:09 pm
Delivery Date: 10 Aug 2011

Re: LEAF voltage measurement accuracy impact on capacity

Wed Sep 19, 2012 12:43 am

surfingslovak wrote:For what it's worth, I believe that we saw higher voltages after turtle mode in degraded Leafs than I would have expected. It was around 320V from what I have seen. My Leaf would typically hit turtle around 308V, but I did not let the car rest and measure later. Has anyone done that with a relatively new Leaf?

The three-bar-loser Leaf I drove in Phoenix (courtesy of wiltingleaf) behaved rather strangely after the low battery warning. It's tough to say if it was because the owner never runs a deep cycle and the software starts misjudging remaining battery capacity or if the discharge curve changes significantly as the battery degrades. I'm under the impression that Leaf owners in Phoenix get squeezed in whatever range they have left above the low battery warning, which many don't want to hit. Driving with flashing numbers or dashes, and without a Gid meter can be anxiety inducing. It appears that the range below the low battery warning expands with time, and the range above the warning diminishes rather quickly.
I think this makes sense, because the lower range becomes more delicate/vulnerable as the capacity degrades. In my mind, their TMS/BMS is heavily driven by machine learning, and as you know... machine learning is all about training and retraining. It works great in retrospect, but struggles with fast-moving training sets or, in our case, fast-moving or unpredictable environmental swings. As such, it must use a set of guard bands because it has nothing else to save it from challenging conditions/situations. What do I know... I am likely over-thinking it.

Herm
Posts: 3765
Joined: Sun May 23, 2010 3:08 pm
Delivery Date: 29 Aug 2012
Location: Timbuktu, Mali

Re: LEAF voltage measurement accuracy impact on capacity

Wed Sep 19, 2012 12:45 am

palmermd wrote:If it were a simple voltmeter problem, why would it take them 2 months to let us know? A new voltmeter in the car is a very inexpensive solution for Nissan.
Probably not inexpensive at all, since you have 48 voltage measurements to do at a millivolt scale, with lots of connectors, cables, and so on... and all this just to accurately access the little bit of energy left below the knee of the battery curve. Not a good investment. Running the car below the knee will always be risky and provide low returns.

Herm
Posts: 3765
Joined: Sun May 23, 2010 3:08 pm
Delivery Date: 29 Aug 2012
Location: Timbuktu, Mali

Re: LEAF voltage measurement accuracy impact on capacity

Wed Sep 19, 2012 12:46 am

mdh wrote: In my mind, their TMS/BMS is heavily driven by machine learning, and as you know... machine learning is all about training and retraining. It works great in retrospect, but struggles with fast-moving training sets or, in our case, fast-moving or unpredictable environmental swings.
I think you are dead on.

DaveEV
Forum Supporter
Posts: 6239
Joined: Fri Apr 23, 2010 3:51 pm
Location: San Diego

Re: LEAF voltage measurement accuracy impact on capacity

Wed Sep 19, 2012 2:06 am

surfingslovak wrote:For what it's worth, I believe that we saw higher voltages after turtle mode in degraded Leafs than I would have expected. It was around 320V from what I have seen. My Leaf would typically hit turtle around 308V, but I did not let the car rest and measure later. Has anyone done that with a relatively new Leaf?
That's interesting. 2 possible reasons I can think of:

1. Some modules are weaker than the others and the LEAF uses the voltage of the weakest module (or cell-pair?) to determine when the party ends.
2. With it being Arizona, the pack will be warmer. With a warm pack, voltage will sag less under load. Of course, the test was run in the morning during pretty normal temps that one can easily recreate in the Bay Area in the summer.
surfingslovak wrote:It's tough to say if it was because the owner never runs a deep cycle and the software starts misjudging remaining battery capacity or if the discharge curve changes significantly as the battery degrades. I'm under the impression that Leaf owners in Phoenix get squeezed in whatever range they have left above the low battery warning, which many don't want to hit. Driving with flashing numbers or dashes, and without a Gid meter can be anxiety inducing. It appears that the range below the low battery warning expands with time, and the range above the warning diminishes rather quickly.
If I had to guess, I'd guess that someone who rarely charges to 100% and rarely gets to LBW would make things really tough for the LBC to determine the actual health of the battery - just keeping the pack well above LBW should have a similar effect. Historically, lithium battery capacity monitors have needed to have the battery drained down fairly far periodically to get a good reading of actual capacity, thanks to the shallow discharge voltage curve typical of lithium batteries. I think we did see some evidence of the LBC learning on TickTock's car, which immediately went back to 11 bars after the test (unfortunately in the wrong direction, but that was expected).
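
To illustrate why the deep discharge matters, here's a toy sketch of a capacity estimator that coulomb-counts between two points whose SOC is reasonably well known (a 100% charge and LBW, say). This is not Nissan's algorithm, and every number and name in it is made up; it just shows how a shallow cycle gives the estimator almost nothing to work with:

    # Toy capacity estimator: amp-hours counted between two SOC anchors.
    def estimate_capacity_ah(ah_discharged, soc_start, soc_end):
        """Estimate usable capacity from the charge removed between two points
        whose SOC is known with some confidence (e.g. 100% charge and LBW)."""
        delta_soc = soc_start - soc_end
        if delta_soc < 0.5:
            # Shallow cycle: a small SOC error turns into a large capacity error.
            raise ValueError("cycle too shallow for a trustworthy estimate")
        return ah_discharged / delta_soc

    # Deep cycle: 100% down to roughly 15% (near LBW), 55 Ah counted out
    print(estimate_capacity_ah(55.0, 1.00, 0.15))  # ~64.7 Ah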

The question is - how far down do you have to run the pack to give the LBC enough information to correct itself - and is doing it once enough? One hates to cycle the pack more than one really needs to...

RegGuheert
Posts: 6419
Joined: Mon Mar 19, 2012 4:12 am
Delivery Date: 16 Mar 2012
Leaf Number: 5926
Location: Northern VA

Re: LEAF voltage measurement accuracy impact on capacity

Wed Sep 19, 2012 2:18 am

Ingineer wrote:I've not monitored my Leaf that often, but initially and as recently as yesterday (9/17/12) I did check the accuracy. According to my calibrated 100k count Fluke 45, it's still well within a half volt of the voltage reported by the Battery ECU/LBC (Lithium Battery Controller).
I was hoping you would see this thread and report your findings! Thanks for the information!

So for your car it seems you are seeing accuracy of around 0.1%, which should be sufficient to do the job. Hopefully all of the LEAFs are approximately this good. Given that the voltage difference between a 100% charge and an 80% charge is only 1.2%, not much error can be tolerated.
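
Spelling out the arithmetic (with a nominal ~4.08V per cell pair at 100%, so roughly 392V across the pack; the exact figures are assumptions and don't change the conclusion much):

    # Back-of-the-envelope check of the figures above (nominal values assumed).
    full_pack_v = 4.08 * 96                 # ~391.7 V at a 100% charge
    error_pct = 0.5 / full_pack_v * 100     # Phil's half-volt agreement
    window_v = full_pack_v * 0.012          # the ~1.2% gap between 100% and 80%
    print(f"measurement error: {error_pct:.2f}%")                     # ~0.13%
    print(f"100%-to-80% window: {window_v:.1f} V")                    # ~4.7 V
    print(f"error as a share of that window: {0.5 / window_v:.0%}")   # ~11%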

Have you ever tested another LEAF besides yours?

Also, have you ever tested the voltage accuracy with the pack temperature elevated? I'm just wondering if things could get worse in a climate like Phoenix.
Ingineer wrote:There is cross-checking going on in the LBC (module sums vs total pack), so I would imagine that if something went awry you'd get a code P30F4. It's highly unlikely that 2 different measurement systems could drift the same way without being caught in self-diagnosis.
So these two systems are largely independent of each other? If so, that sounds like a good check, since it provides a way for the car to identify voltage measurement problems. It will be interesting to see if anyone ever gets that code.
Ingineer wrote:There's nothing to say the chemistry in the pack hasn't somehow been altered in a way that shifts the knee voltages, though.
Perhaps. Temperature is also likely part of it and perhaps pack balancing plays a role here, too.

I have a couple of related questions about voltage measurements that perhaps you or someone else can answer:

1) How does the LEAF make measurements during charging? There are really three questions/concerns here:
a) There is resistance in the pack wiring, and the internal resistance of the cells also changes over their life. As a result, the resting voltage should be different from the charging voltage. Given the accuracy requirements on the measurement system and the potential impact on battery life, it would seem that measurements would need to be done after a resting period. Does the system use an estimate of all of these resistances in order to correct the charge termination voltage? (A rough sketch of that kind of correction follows after these questions.) If so, how much error does this approach introduce? Does charging accuracy degrade as the cells degrade, due to different rates of increase in the cell resistances? Is the four-hour delay between charge termination and cell balancing related to measuring these cell resistances, or is it more designed to ensure all of the cells are nearly isothermal, or perhaps both?
b) My understanding is that this battery is a top-balanced system and that the shunts are capable of about 1A. If the pack charges at 8A normally, the shunts will be quite limited unless the charging tapers off drastically. Does the charge terminate whenever the *highest* cell reaches 4.1V, or does it target the *average* cell? Given that surfingslovak reports seeing only a small difference in charge-termination voltages, it seems like it may target the average. Do you know what the range of cell-pair voltages is at the end of a 100% charge (worst case, for a poorly-balanced pack)? How about at the end of an 80% charge?
c) Perhaps KtG's excellent suggestion that the pack is not sufficiently isothermal for proper charging is right on. And others have questioned Nissan's decision to place cells in multiple orientations. KtG believes the rear stack gets the hottest. I wonder if the top modules in the horizontal stacks get the hottest. The question is: How much temperature variation do you see between the four temperature probes during pack charging? Is it 2C or 20C? Something in between? Certainly having a long resting period before initiating charging gives the best possibility for an isothermal pack, at least at the beginning of charging.
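
Here is the rough sketch promised in (a), which also touches on (b): an IR correction of the voltage measured during charging (resting voltage is roughly the measured voltage minus I*R) and a termination check against the highest corrected cell pair. The 4.1V target, the resistance values, and every name here are assumptions for illustration only, not Nissan's actual charge logic:

    # Illustrative only: IR-corrected charge termination against the highest cell pair.
    TERMINATION_V = 4.10   # assumed per-cell-pair target for a 100% charge

    def rest_voltage(measured_v, charge_current_a, cell_resistance_ohm):
        """Back the IR rise out of a cell-pair voltage measured while charging."""
        return measured_v - charge_current_a * cell_resistance_ohm

    def should_terminate(cell_pair_volts, current_a, resistances_ohm):
        """Terminate when the *highest* IR-corrected cell pair hits the target.
        Targeting the *average* instead would let the strongest cells overshoot."""
        corrected = [rest_voltage(v, current_a, r)
                     for v, r in zip(cell_pair_volts, resistances_ohm)]
        return max(corrected) >= TERMINATION_V

    # Example: 96 cell pairs under an 8 A charge, ~2 milliohm each, one pair high
    volts = [4.11] * 95 + [4.13]
    res = [0.002] * 96
    print(should_terminate(volts, 8.0, res))   # True: 4.13 - 8*0.002 = 4.114 >= 4.10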

I must say that I am a big proponent of Nissan's decision to forgo a TMS for the battery system. Clearly that is not workable for hot areas like Phoenix with the current generation of batteries, but I feel it is a good solution for future battery designs which incorporate heat-tolerant cells. But I am now wondering how often some cells end up sitting at or near 100% after an 80% charge due to pack imbalance and temperature variations. I am becoming more and more convinced that keeping the LEAF between about 25% and 60% SOC when not in use is a key to a long life.
RegGuheert
2011 Leaf SL Demo vehicle
10K mi. on 041413; 20K mi. (55.7Ah) on 080714; 30K mi. (52.0Ah) on 123015; 40K mi. (49.8Ah) on 020817; 50K mi. (47.2Ah) on 120717; 60K mi. (43.66Ah) on 091918.
Enphase Inverter Measured MTBF: M190, M215, M250, S280

RegGuheert
Posts: 6419
Joined: Mon Mar 19, 2012 4:12 am
Delivery Date: 16 Mar 2012
Leaf Number: 5926
Location: Northern VA

Re: LEAF voltage measurement accuracy impact on capacity

Wed Sep 19, 2012 3:19 am

drees wrote:
surfingslovak wrote:For what it's worth, I believe that we saw higher voltages after turtle mode in degraded Leafs than I would have expected. It was around 320V from what I have seen. My Leaf would typically hit turtle around 308V, but I did not let the car rest and measure later. Has anyone done that with a relatively new Leaf?
That's interesting. 2 possible reasons I can think of:

1. Some modules are weaker than the others and the LEAF uses the voltage of the weakest module (or cell-pair?) to determine when the party ends.
2. With it being Arizona, the pack will be warmer. With a warm pack, voltage will sag less under load. Of course, the test was run in the morning during pretty normal temps that one can easily recreate in the Bay Area in the summer.
Personally, I think it is the first explanation. I see this as evidence that some of the cells in these degraded LEAFs are worse than others, perhaps significantly worse. I think it would be interesting to run the CELL VOLTAGE LOSS INSPECTION test in the EVB service manual and plot a histogram of the 96 cell-pair voltages to see what the distribution looks like for these degraded batteries. It might also be telling to see a chart showing module voltage from that test versus location in the pack. Hopefully we will be able to access such data using a LEAFscan.
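
If that per-cell-pair data can be pulled out as a simple list of millivolt readings (via Consult or, eventually, a LEAFscan dump), the plots would only take a few lines. The readings below are fabricated purely to show the shape of the analysis:

    # Histogram and position chart of 96 cell-pair voltages (fabricated data).
    import random
    import matplotlib.pyplot as plt

    random.seed(1)
    cell_mv = [random.gauss(3780, 8) for _ in range(96)]   # fake readings, mV
    cell_mv[60:72] = [v - 35 for v in cell_mv[60:72]]      # pretend the rear stack sags

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
    ax1.hist(cell_mv, bins=20)
    ax1.set_xlabel("cell-pair voltage (mV)")
    ax1.set_ylabel("count")
    ax1.set_title("Distribution across the pack")

    ax2.plot(range(1, 97), cell_mv, marker="o", linestyle="")
    ax2.set_xlabel("cell-pair position in pack")
    ax2.set_ylabel("voltage (mV)")
    ax2.set_title("Voltage vs. location")
    plt.tight_layout()
    plt.show()
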
RegGuheert
2011 Leaf SL Demo vehicle
10K mi. on 041413; 20K mi. (55.7Ah) on 080714; 30K mi. (52.0Ah) on 123015; 40K mi. (49.8Ah) on 020817; 50K mi. (47.2Ah) on 120717; 60K mi. (43.66Ah) on 091918.
Enphase Inverter Measured MTBF: M190, M215, M250, S280

Ingineer
Posts: 2741
Joined: Fri Oct 15, 2010 1:09 pm
Delivery Date: 13 Jul 2011
Leaf Number: 6969
Location: Berkeley, California
Contact: Website

Re: LEAF voltage measurement accuracy impact on capacity

Wed Sep 19, 2012 11:23 am

The biggest problem with the Leaf's BMS (in my opinion) is the use of the Hall-effect current sensor. These are not very accurate for coulomb counting and are subject to accuracy-degrading effects such as centerline drift, the earth's magnetic field, temperature, etc. The inaccuracy of this is why "some gids are more equal than others". Nissan compensates for this inaccuracy by applying corrections to the SoC, sampling voltage and using it in formulas that also take into account temperature, internal resistance, aging, etc. This is why you can sometimes gain/lose SoC suddenly after power cycling. It will apply changes all at once if the car is power cycled, but if in use, it will apply a correction in the form of a drift, which appears as faster/slower SoC counting than the real energy out/in.
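
To illustrate that jump-versus-drift behavior, here's a toy model with invented numbers and names; it isn't the LBC's code, just the shape of the correction described above:

    # Toy model: coulomb-counted SoC corrected toward a voltage-based estimate.
    class SocEstimator:
        def __init__(self, soc, capacity_ah, drift_fraction=0.02):
            self.soc = soc                        # displayed/used SoC, 0..1
            self.capacity_ah = capacity_ah
            self.drift_fraction = drift_fraction  # max correction per step while driving

        def count_coulombs(self, amps, hours):
            """Integrate the (imperfect) Hall-effect current reading."""
            self.soc -= amps * hours / self.capacity_ah

        def correct(self, voltage_based_soc, power_cycled):
            """Pull the counted SoC toward the voltage/temperature-based estimate."""
            error = voltage_based_soc - self.soc
            if power_cycled:
                self.soc += error                 # jump all at once
            else:
                step = max(-self.drift_fraction, min(self.drift_fraction, error))
                self.soc += step                  # shows up as faster/slower SoC counting

    est = SocEstimator(soc=0.80, capacity_ah=60.0)
    est.count_coulombs(amps=30.0, hours=1.0)       # counted down to 0.30
    est.correct(voltage_based_soc=0.35, power_cycled=False)
    print(round(est.soc, 2))                       # 0.32: correction arrives as drift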

I was able to meet with the Nissan engineers from Japan last December, including the battery system engineer (I had a one-on-one with him). Their explanation for why we have no direct SoC display in the car was basically that they were afraid to show it and have these corrections occasionally make it "jump", which would "confuse the customer". The Battery Systems Engineer told me that cost was the reason they used the Hall-effect current sensor rather than a more accurate galvanic shunt.

It's looking like there is some degradation in these hot-climate packs, but it appears that the BMS (LBC) is not dealing with it properly, and is not only indicating incorrect loss figures, but also possibly not allowing for full use (charging) of the pack's real capacity.

Keep in mind, Nissan did a lot of testing, but the bulk of it was accelerated life testing, which attempts to simulate a much longer real-world use scenario. Unfortunately, sometimes there is no substitute for real-world life testing, and it sounds like there are some unexpected results that the BMS software is not equipped to deal with.

Also remember that large automakers, especially Japanese ones, are very methodical about changing things, and it takes a long time to properly implement a fix. If that fix involves software in a critical system (the LBC, for example), it will take many hours of testing before they will even consider releasing it. I believe they will fix this, but it will be done on their terms, which means it will take some time before we see a solution.

-Phil
Easily Learn Electricity HERE! - - - - Website: http://evseupgrade.com/ - - - - Like us on Facebook: EVSE Upgrade
