The Anatomy of a Gid

mwalsh said:
turbo2ltr said:
FairwoodRed said:
This unit was found by Gary Giddings by sniffing the car's CAN bus, and it was named GID in his honor.

Not really true... I just didn't publicize it..

http://www.mynissanleaf.com/viewtopic.php?f=8&t=2794

Oh dear. You know what this means don't you...it means you're the Martin Eberhard of the Gid world. And that Gary is the Elon Musk. But without the fame, and the money, and the women! ;)

Can we call them "Turbo-Gids"?
 
The reports that one GID corresponds to 80 watt-hours of energy
being "put into" the battery might be correct, but some of that energy
never gets stored in the battery; it is lost as heat while the cells are
charging.

Also, not all of the stored energy makes it back out of the battery,
again due to heating of the cells as current flows out of the cells.

So, 80 Wh applied to the battery might produce 75 Wh of usable
energy out of the battery pack, but certainly not 80 Wh.

However we observe that the GIDs follow the amount of energy
(fuel) in the battery, not the "State of Charge" (SOC) of the battery.

A small battery can be "full" and have a high (near 100%) SOC, but
not contain very much energy. When a "Big" battery is used, it can
hold a lot of energy. A smaller, or "shrunken" battery, even when full,
will not hold so much energy.

So, the GID value is, for now, the best "fuel" gauge we have.
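
As a rough illustration of using the gid count as a fuel gauge, here is a small Python sketch. The 80 Wh/gid figure is the nominal value discussed above; the ~94% discharge efficiency (roughly 75 Wh usable out of every 80 Wh in) and the 4.0 miles/kWh driving efficiency are assumptions for illustration only, not measured LEAF values.

[code]
# Sketch: treat the gid count as a fuel gauge.
# 80 Wh/gid is the nominal value discussed above; the discharge efficiency
# and miles/kWh figures are assumptions, not measured LEAF values.

WH_PER_GID = 80.0            # nominal energy "put into" the pack per gid
DISCHARGE_EFFICIENCY = 0.94  # assumed: ~75 Wh usable back out of every 80 Wh in

def usable_energy_kwh(gids):
    """Estimate usable energy (kWh) remaining for a given gid reading."""
    return gids * WH_PER_GID * DISCHARGE_EFFICIENCY / 1000.0

def estimated_range_miles(gids, miles_per_kwh=4.0):
    """Estimate remaining range from gids and an assumed driving efficiency."""
    return usable_energy_kwh(gids) * miles_per_kwh

for g in (281, 140, 27):  # full charge, roughly half, and "near LBW"
    print("%3d gids ~ %4.1f kWh usable ~ %4.1f miles at 4.0 mi/kWh"
          % (g, usable_energy_kwh(g), estimated_range_miles(g)))
[/code]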
 
TonyWilliams said:
mwalsh said:
turbo2ltr said:
Not really true... I just didn't publicize it..

http://www.mynissanleaf.com/viewtopic.php?f=8&t=2794

Oh dear. You know what this means don't you...it means you're the Martin Eberhard of the Gid world. And that Gary is the Elon Musk. But without the fame, and the money, and the women! ;)

Can we call them "Turbo-Gids"?

How about TGs!?
 
mwalsh said:
I'm off to find that one thread where Gids were reported behaving oddly in colder weather. :)
I don't know which thread that might be, but I have observed less range per gid with a cooler battery. Driving up the mountain I live on, involving a ~5000' climb, I track my charge closely. I shoot for a little under 10% charge (in gids) per 1000' climbed on CA-330. Now that our LEAF's battery temperature is lower, my gid usage is noticeably higher, sometimes 11% per 1000' climbed.
 
abasile said:
mwalsh said:
I'm off to find that one thread where Gids were reported behaving oddly in colder weather. :)
I don't know which thread that might be, but I have observed less range per gid with a cooler battery. Driving up the mountain I live on, involving a ~5000' climb, I track my charge closely. I shoot for a little under 10% charge (in gids) per 1000' climbed on CA-330. Now that our LEAF's battery temperature is lower, my gid usage is noticeably higher, sometimes 11% per 1000' climbed.
Can't say where the thread is, but the data is here. From the time I started logging last October, I saw a steady rise in the gid count for a 100% charge although my range did not increase (continued to slowly decline).
 
turbo2ltr said:
FairwoodRed said:
This unit was found by Gary Giddings by sniffing the car's CAN bus, and it was named GID in his honor.

Not really true... I just didn't publicize it..

http://www.mynissanleaf.com/viewtopic.php?f=8&t=2794

Ahem. First publication on MNL of behavior of EV-CAN msg 0x5BC that I could find:
http://www.mynissanleaf.com/viewtopic.php?f=44&t=4131&start=30#p98570

I know I would have spent less time looking for it if someone else had published it first. :)
 
OK. I don't think it's a simple polynomial translation anymore. I've been experimenting with polynomials up to 7th order and cannot seem to get anything better than gids even if I split it into two sections at the knee. I think someone said that Nissan does integrate the actual charge and does "corrections" periodically. I can only imagine these "corrections" are based on the pack volts somehow.

A more targeted formula may work better than a polynomial. Any of the battery geeks out there know of any papers suggesting a formula translating charge to pack volts? It does look a bit like a natural log. The idea is if I can get a good enough formula of charge vs. voltage, we could calibrate it to current battery conditions during charging and use that to indicate remaining capacity. Can try a big look-up table but an equation would be ideal.
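
For what it's worth, here is a sketch of the kind of targeted fit being asked about: pick a charge-to-voltage model (a log-plus-linear guess below, since the curve "looks a bit like a natural log") and let a nonlinear fitter find the coefficients. The model form and the sample points are placeholders, not real LEAF data; logged CAN data would go in their place.

[code]
# Sketch: fit a guessed charge-to-pack-volts model instead of a raw polynomial.
# Model form and data points are placeholders, not measured LEAF values.

import numpy as np
from scipy.optimize import curve_fit

def pack_volts(charge_frac, v0, a, b):
    # guessed form: log rise from empty plus a gentle linear slope on the plateau
    return v0 + a * np.log(charge_frac) + b * charge_frac

# placeholder (charge fraction, pack volts) pairs -- substitute logged data here
charge = np.array([0.05, 0.10, 0.20, 0.40, 0.60, 0.80, 0.95, 1.00])
volts  = np.array([345., 355., 365., 375., 382., 388., 393., 395.])

params, _ = curve_fit(pack_volts, charge, volts, p0=[390.0, 5.0, 10.0])
fit_error = volts - pack_volts(charge, *params)
print("fitted v0, a, b:", params)
print("worst-case error (V):", np.max(np.abs(fit_error)))
[/code]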
 
GroundLoop said:
turbo2ltr said:
FairwoodRed said:
This unit was found by Gary Giddings by sniffing the car's CAN bus, and it was named GID in his honor.

Not really true... I just didn't publicize it..

http://www.mynissanleaf.com/viewtopic.php?f=8&t=2794

Ahem. First publication on MNL of behavior of EV-CAN msg 0x5BC that I could find:
http://www.mynissanleaf.com/viewtopic.php?f=44&t=4131&start=30#p98570

I know I would have spent less time looking for it if someone else had published it first. :)

The key point is 'published': once the at-large community had isolated the EV-CAN message, Turbo2ltr published a spreadsheet with much more than just that message.

http://www.mynissanleaf.com/viewtopic.php?f=44&t=4131&start=40#p98628

Turbo had discovered the EV-CAN but had not disclosed it. Ingineer may also have known about the EV-CAN before the publication, and has several other CAN findings, like SOC to 4 or 5 decimal places and battery temperatures. When the LEAFSCAN makes its debut, we may learn more about CAN messages.

Obviously all this 'detective' work is costly for the individual investigator, and I can understand why Turbo and Ingineer try to recover some of their investment. It's a sticky topic for open source -- who pays for the information and development.

Sorry for the run-on dialog, but I appreciate Turbo2ltr, Ingineer, GroundLoop, Gary, Lincomatic, Chris, TickTock, and many others who help us understand and decode the LEAF CAN message system. Someday I hope CAN will become more open and vehicles more programmable, so instead of hiding the GOM, we can try to develop our own tune parameters to match the prediction for each individual driver. But considering how we have to OK sending data, I'm not sure I will live long enough to see an SDK for a vehicle like a LEAF. Maybe if someone can show how programming a vehicle could promote sales without the legal boat anchors, it will come to pass. I would enjoy my LEAF even more if it would drive me to work!
 
had a pretty large temperature drop last night so ran LEAF to "near" LBW (27 GID)

and plugged in. put in almost 20 kWh, BUT it charged to 281 (which dropped to 280 after the car was on about 2-3 minutes), gaining 254 GID, which equates to 78.7 Wh per GID.

soooo, any explanation would be good
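
For reference, the arithmetic works out like this (treating the "almost 20 kWh" as exactly 20 and the gid gain as 281 - 27 = 254):

[code]
# Reproducing the numbers above: wall energy divided by gids gained.
wall_kwh = 20.0          # "almost 20 kWh" from the wall meter
gids_gained = 281 - 27   # charged from near-LBW (27) to 281

print("%.1f Wh per gid, measured at the wall" % (wall_kwh * 1000.0 / gids_gained))
# -> 78.7 Wh per gid
[/code]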
 
DaveinOlyWA said:
had a pretty large temperature drop last night so ran LEAF to "near" LBW (27 GID)

and plugged in. put in almost 20 kWh, BUT it charged to 281 (which dropped to 280 after the car was on about 2-3 minutes), gaining 254 GID, which equates to 78.7 Wh per GID.

soooo, any explanation would be good
× 91% charging efficiency = 71.7 Wh/gid, which is within the unofficially reported ±10% variation (OK, slightly over).
 
TickTock said:
DaveinOlyWA said:
had a pretty large temperature drop last night so ran LEAF to "near" LBW (27 GID)

and plugged in. put in almost 20 kWh, BUT it charged to 281 (which dropped to 280 after the car was on about 2-3 minutes), gaining 254 GID, which equates to 78.7 Wh per GID.

soooo, any explanation would be good
× 91% charging efficiency = 71.7 Wh/gid, which is within the unofficially reported ±10% variation (OK, slightly over).

would that not be backwards? 80 watt-hours at 91% efficiency means 87.9 Wh from the wall, correct?
 
TickTock said:
I thought the 20 kWh was from the wall. If this is actually into the battery then you are almost spot on (the 80 Wh nominal target is into the battery).

it is from the wall. the previous reading was a "hair past 3750"

[Attachments: meter.jpg, 280 GID.jpg]

now, the car did nearly the same trip the day before with temps about 10º warmer and recorded 64.6 miles @ 4.3 miles/kWh from a 275 GID reading, and recharged at 19 kWh.
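
Putting the two days' numbers side by side, with the 91% wall-to-battery charging efficiency quoted above treated as an assumed figure:

[code]
# Assumed wall-to-battery charging efficiency from the discussion above.
CHARGE_EFFICIENCY = 0.91

# Colder day: 27 -> 281 gids on ~20 kWh from the wall.
wh_per_gid_into_pack = 20.0 * 1000.0 * CHARGE_EFFICIENCY / (281 - 27)
print("colder day: ~%.1f Wh per gid into the pack" % wh_per_gid_into_pack)  # ~71.7

# Warmer day: 64.6 miles at 4.3 miles/kWh (dash figure), recharged with 19 kWh.
# If the dash miles/kWh reflects energy drawn from the battery:
battery_kwh_used = 64.6 / 4.3
print("warmer day: ~%.1f kWh out of the battery, 19 kWh from the wall (~%.0f%%)"
      % (battery_kwh_used, 100.0 * battery_kwh_used / 19.0))
[/code]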
 
TickTock said:
OK. I don't think it's a simple polynomial translation anymore. I've been experimenting with polynomials up to 7th order and cannot seem to get anything better than gids even if I split it into two sections at the knee. I think someone said that Nissan does integrate the actual charge and does "corrections" periodically. I can only imagine these "corrections" are based on the pack volts somehow.

A more targeted formula may work better than a polynomial. Any of the battery geeks out there know of any papers suggesting a formula translating charge to pack volts? It does look a bit like a natural log. The idea is if I can get a good enough formula of charge vs. voltage, we could calibrate it to current battery conditions during charging and use that to indicate remaining capacity. Can try a big look-up table but an equation would be ideal.
(Much longer than I planned on... and I'm fairly certain you've seen most of this before.)

If you assume that the LEAF battery pack works like a laptop computer battery pack, perhaps some clues fall out. Speaking to laptop batteries (and one 5 kWh, 15-cell Li-ion battery pack I've worked with)....

Most of the time, the battery voltage is fairly constant and doesn't tell you anything about the SOC. Voltage does change somewhat with temperature (I think voltage rises as temperature rises, possibly not helpful), or when the battery is near the extremes of SOC during charge and discharge. In general, battery voltage is fairly constant across capacity given a constant current (charge, open circuit, or discharge).

On charge from low SOC, current is limited and cell voltage will rise rapidly. (At very low SOC (<5%), current may be further reduced until a minimum safe voltage is reached.) At some point, cell voltage rises to a plateau and doesn't change much (15-85% SOC). Voltage starts rising again at the end of the plateau. At some voltage threshold, a timer is started. When that timer runs out, charging is stopped, and the battery is considered to be at 100% SOC. (LEAF DCFC - detecting when battery voltage climbs off the plateau may not be so easy. The plateau may have a slope indistinguishable from a normal voltage rise seen near End Of Charge at normal current. And forget the timer - stop now to be safe.)
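
A minimal sketch of that laptop-style end-of-charge logic (a voltage threshold arms a timer, and the timer's expiry ends the charge). The voltage thresholds, the timer length, and the read/set callbacks are assumptions for illustration only.

[code]
# Sketch of the described end-of-charge logic. Thresholds, timer length, and
# the read_cell_volts/set_charge_current callbacks are assumed/hypothetical.

import time

SAFE_MINIMUM_VOLTS = 3.0     # below this, charge at reduced current (assumed)
END_OF_CHARGE_VOLTS = 4.10   # cell voltage that arms the end-of-charge timer (assumed)
TIMER_SECONDS = 30 * 60      # assumed timer length

def charge_until_done(read_cell_volts, set_charge_current):
    timer_started = None
    while True:
        v = read_cell_volts()
        # very low SOC: limit current until a minimum safe voltage is reached
        set_charge_current(0.1 if v < SAFE_MINIMUM_VOLTS else 1.0)
        if timer_started is None and v >= END_OF_CHARGE_VOLTS:
            timer_started = time.monotonic()   # voltage climbed off the plateau
        if timer_started is not None and time.monotonic() - timer_started > TIMER_SECONDS:
            set_charge_current(0.0)            # timer ran out: call it 100% SOC
            return
        time.sleep(1.0)
[/code]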

On discharge from 100% SOC, cell voltage starts high, but rapidly falls to another plateau, and stays there until voltage begins falling again (about 15% SOC).

Falling off the plateau is where I expect the LEAF to start generating Very Low Battery warnings, and soon, enter Turtle mode to protect the battery. Entry to Turtle mode may be when the pack is considered at 0% SOC, and would be the other reference point for battery capacity.

LEAF: Have the edges of the battery voltage charge and discharge plateaus been seen? Do the plateaus have a slope? Do the plateau voltages vary with temperature?

Laptops: The strongest indicator of SOC has always been integrating battery current (counting Coulombs). A large step towards an SOC indication is finding a way to measure battery current, or finding where an integration result is stored.
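
A bare-bones coulomb counter of the kind described, just to make the idea concrete; the pack capacity and the sign convention are assumed values:

[code]
# Sketch: integrate battery current over time and express it against a known
# capacity. The 66 Ah capacity and the sign convention are assumptions.

class CoulombCounter:
    def __init__(self, capacity_ah, initial_soc=1.0):
        self.capacity_ah = capacity_ah
        self.charge_ah = capacity_ah * initial_soc

    def update(self, current_a, dt_s):
        """current_a > 0 means charging, < 0 means discharging."""
        self.charge_ah += current_a * dt_s / 3600.0
        self.charge_ah = min(max(self.charge_ah, 0.0), self.capacity_ah)
        return self.soc()

    def soc(self):
        return self.charge_ah / self.capacity_ah

counter = CoulombCounter(capacity_ah=66.0)  # assumed pack capacity
print("SOC after 10 min at -60 A: %.1f%%" % (100 * counter.update(-60.0, 600)))
[/code]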

The LEAF energy display apparently indicates some measure of power for traction, climate, and other loads. You would want to find where that data is coming from. Perhaps the display data is already accurate enough if it can be captured. Convert power to current for SOC.

Laptops: Corrections to SOC are required for: calendar age of the battery, equivalent full charge/discharge cycles (i.e. the sum of all energy passing through the battery), time since last charge, temperature history since last charge, and time/energy since last equalization. Most factors work to reduce calculated battery capacity. This data and techniques to implement it are likely considered proprietary.

Equalization is an interesting event. It ensures that all cells are fully charged without overcharging any cells. During a low current charge near 100% SOC, cell voltages are monitored. If any cell voltage reaches a threshold voltage, pack charging stops, and the fully charged cell is discharged slightly. After a short time, pack charging resumes, and the cycle repeats. When all cells are fully charged, the pack is at 100% SOC - this is a critical reference point. As you might guess, equalization can take a long time.

A recalibration cycle usually starts with an equalization charge, followed by a discharge until the first cell reaches its 0% SOC (voltage?) threshold. During discharge, current is integrated. When the discharge is stopped, you know what the battery capacity is. A recalibration cycle is the only event that can increase the calculated battery capacity.
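
A sketch of that recalibration idea: starting from a fully equalized pack, discharge until the first cell hits its empty threshold while integrating current, and the total becomes the newly learned capacity. The empty-voltage threshold and the read callbacks are hypothetical, for illustration only.

[code]
# Sketch of a recalibration cycle as described above. The empty-cell voltage
# and the read_* callbacks are hypothetical.

import time

EMPTY_CELL_VOLTS = 3.0   # assumed per-cell "0% SOC" threshold

def recalibrate_capacity(read_min_cell_volts, read_pack_current_a, dt_s=1.0):
    """Discharge from a fully equalized pack; return the learned capacity in Ah."""
    discharged_ah = 0.0
    while read_min_cell_volts() > EMPTY_CELL_VOLTS:
        discharged_ah += abs(read_pack_current_a()) * dt_s / 3600.0
        time.sleep(dt_s)   # sample current once per interval
    return discharged_ah   # the only result that can *increase* calculated capacity
[/code]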

When does the LEAF perform equalization charges? It's possible that cells (or modules) that reach full charge are slightly discharged while the rest of the pack continues charging. A slight rise of battery pack temperature may also be detected - discharge power has to be dissipated somehow. Do recalibration cycles ever occur? A charge from Turtle mode to 100% using the trickle charger may perform a similar function.

Laptops: Total battery capacity is influenced by age, number of C/D cycles, temperature, and time/cycles since last equalization. These constantly decrease the calculated battery capacity.

(Enough! I have other things to do.... G'nite. :p )
 
The relatively flat dV/dT curve of lithium-ion is why coulomb counting is often used rather than voltage mapping. This is how the Leaf's battery management system (LBC) tracks SoC. The problem is that the low-cost sensor used is subject to drift and thus requires a periodic correction. Since it's not essential that the SoC be perfectly accurate in the middle area of SoC, there is not as much need to issue corrections there. This works out fine, as corrections are difficult to do with such a flat curve. However, near either end, the voltage mapping can be much more effective, so you are more likely to see the corrections.

There is no equalization used, as the active balancing system works to keep the cells in balance (at least up to a point). It's designed to be able to keep cells balanced within a wide tolerance of cell condition, but the pack is only as good as its weakest cell.

-Phil
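
To make the scheme Phil describes concrete, a sketch: trust the coulomb counter across the flat middle of the curve, and nudge it toward a voltage-based estimate only near the ends where the voltage actually moves. The voltage map, thresholds, and blend weights below are invented for illustration and are not the LBC's actual values.

[code]
# Sketch: coulomb counting mid-range, voltage-map corrections near the ends.
# Voltage map, thresholds, and blend weights are invented, not LBC values.

def voltage_soc_estimate(pack_volts):
    """Coarse voltage-to-SoC map, only trusted off the flat plateau."""
    if pack_volts >= 390.0:
        return 0.95    # near full: voltage has climbed off the plateau
    if pack_volts <= 350.0:
        return 0.05    # near empty: voltage has fallen off the plateau
    return None        # mid-plateau: voltage tells us very little

def corrected_soc(coulomb_soc, pack_volts):
    """Nudge the coulomb-counted SoC toward the voltage estimate at the ends."""
    v_soc = voltage_soc_estimate(pack_volts)
    if v_soc is None:
        return coulomb_soc                    # trust the integrator mid-range
    return 0.7 * coulomb_soc + 0.3 * v_soc    # correct toward the voltage map

print(corrected_soc(0.50, 372.0))   # mid-range: left alone
print(corrected_soc(0.88, 392.0))   # near full: pulled toward 0.95
[/code]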
 
Ingineer said:
The relatively flat dV/dT curve of lithium-ion is why coulomb counting is often used rather than voltage mapping. This is how the Leaf's battery management system (LBC) tracks SoC. The problem is that the low-cost sensor used is subject to drift and thus requires a periodic correction. Since it's not essential that the SoC be perfectly accurate in the middle area of SoC, there is not as much need to issue corrections there. This works out fine, as corrections are difficult to do with such a flat curve. However, near either end, the voltage mapping can be much more effective, so you are more likely to see the corrections.

There is no equalization used, as the active balancing system works to keep the cells in balance (at least up to a point). It's designed to be able to keep cells balanced within a wide tolerance of cell condition, but the pack is only as good as its weakest cell.

-Phil
Thanks Phil,

I'd certainly like to know more about how the active battery balancing system works (although I have a few guesses about that).

Two factors I can think of off the top of my head when trying to map battery voltage to SoC are battery current and battery temperature (or even module temperatures). More refined data may be possible with knowledge of the battery's health and/or age (aka battery impedance/resistance), or even the weakest cell voltage. (Do you have access to cell voltages?)

Is the current sensor drift stable (a mostly fixed bias), or seem to vary with temperature, recent self-heating, sensor hysteresis, battery voltage (12V or main battery), coolant temperature (just because), or does it appear to be random? Where is the sensor physically located? What kind of sensor? Is the sensor used by the battery charger (think DCFC or climate control while connected to utility power)?

Too much uncompensated drift could render the sensor somewhat useless, but then, it would have to sense a range of currents from +125 A (DCFC and regeneration) to -225 A (90 kW into the inverter/motor), a span approaching 400 A, with a resolution of (0.25 A?). Fairly demanding. (I've assumed main battery voltage is 400 V, but know it's closer to 385 V. I've also assumed one side of the main battery is grounded to the chassis, but grounding the center might make inverter design easier.)

Lots of questions, but maybe I'll jog something loose.

Brett
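
A quick back-of-the-envelope check of those current-sensor numbers, using the ~385 V pack voltage and 0.25 A resolution from Brett's post above as assumed inputs:

[code]
# Back-of-the-envelope check of the current-sensor requirements above.
# Pack voltage and resolution are the assumed figures from the post.

regen_dcfc_peak_a = 125.0
motor_peak_kw = 90.0
pack_volts = 385.0
resolution_a = 0.25

discharge_peak_a = motor_peak_kw * 1000.0 / pack_volts   # ~234 A into the inverter
span_a = regen_dcfc_peak_a + discharge_peak_a            # ~359 A total span
counts = span_a / resolution_a                           # ~1400+ steps

print("peak discharge ~%.0f A, span ~%.0f A, ~%.0f counts (about %d bits)"
      % (discharge_peak_a, span_a, counts, int(counts).bit_length()))
[/code]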
 
From what I've seen, it appears the drift is corrected. It is a Hall-effect device located in the battery box. It is used for calculation of all energy in/out of the pack. The main issue is error at low currents, which can leave the SoC off by up to a few percent here and there. It is corrected during use if it's way off, and every time the Leaf is made ready.

Yes, the LBC (Battery Controller) tracks internal resistance and voltage of every block, and I do have access to these parameters.

-Phil
 
I now have the ability to measure GIDs.

Last Friday the new Leaf at 640 miles gave me a 278 at 100% even after running the CC for 30 minutes while plugged in.
This morning the old Leaf at over 23,000 miles gave me 281 at 100%. I know from the turtle-to-100% tests done a week ago that the old Leaf needed about 6-7% less kWh from the wall, compared to the new one. It was a bit warmer outside this morning, but the cars still showed around 60°F in the garage.

I will have to do some reading on what the GIDs are, but I just thought I'd put this data here.
 