The battery impedance can vary, but I am confident that my calculation is reasonably close (±20%) because it is based on the DC voltage drop I observed under load with the Gid meter.
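For anyone who wants to repeat the voltage-sag estimate, here is a minimal sketch; the numbers below are illustrative placeholders, not my actual measurements:

```python
# Estimate pack internal resistance from the DC voltage sag under load:
#   R = (V_rest - V_load) / I_load
# The inputs here are placeholder values, not real measurements.

def internal_resistance(v_rest, v_load, i_load):
    """Return pack resistance in ohms from resting vs. loaded voltage."""
    return (v_rest - v_load) / i_load

r = internal_resistance(v_rest=393.0, v_load=385.0, i_load=80.0)
print(f"{r * 1000:.0f} milliohm")  # ~100 milliohm with these placeholder inputs
```

The accuracy hinges on reading the voltage and current at the same instant, which is why a logging meter beats eyeballing the dash.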
Tom, I believe I found your original post about this. While I agree with your findings, I had to ask, since my own test in January indicated a somewhat lower internal resistance. Phil reportedly measured 92 milliohm last month. Someone just reminded me that internal resistance varies as a function of temperature, SOC itself, and the phase of the moon.
I like the 1% of nominal motor power approximation, which should be close enough for our purposes.
Since my post I realized we can calculate the worst-case temperature rise from battery discharge by neglecting all heat transfer out of the battery pack.
For a 650 lb battery pack, this yields (51 BTU/min ÷ 650 lb) ≈ 0.078 deg F per min, or 2.4 F rise at 30 kW for 30 minutes.
At the 50 kW power level, battery loss scales quadratically with power, so we have (50/30)^2 * 2.4 ≈ 6.6 F at 50 kW for 30 min.
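The worst-case arithmetic above can be sketched in a few lines (taking the 51 BTU/min waste-heat figure at 30 kW as given, with water's specific heat of 1 BTU/(lb·F) implicit in the BTU units):

```python
# Worst-case (adiabatic) pack temperature rise: assume all battery waste heat
# stays in the pack, with zero heat transfer out.
# Inputs from the post: 51 BTU/min of heat at 30 kW, a 650 lb pack, and
# water's specific heat of 1 BTU/(lb*F), which is what BTUs implicitly assume.

def adiabatic_rise(heat_btu_per_min, pack_lb, minutes, cp_btu_per_lb_f=1.0):
    """Temperature rise in deg F with no heat leaving the pack."""
    return heat_btu_per_min / (pack_lb * cp_btu_per_lb_f) * minutes

rise_30kw = adiabatic_rise(51.0, 650.0, 30.0)   # ~2.4 F in 30 min
rise_50kw = rise_30kw * (50.0 / 30.0) ** 2      # I^2*R loss scales with power squared
print(f"{rise_30kw:.1f} F at 30 kW, {rise_50kw:.1f} F at 50 kW")
```

Carrying the unrounded 2.35 F through the quadratic scaling gives about 6.5 F rather than 6.6 F, but that difference is well inside the error bars here.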
Yes, that's exactly what I was considering as well. Thank you for outlining it so eloquently. I did some back-of-the-envelope calculations, and my values were higher. It looks like working in BTUs implicitly assumes water, which has a specific heat of 4.18 J/g/K. I wanted to suggest we use steel, with a specific heat of 0.49 J/g/K, instead, but then I found a battery conference report, which pegged the specific heat of lithium-ion batteries at 0.8 J/g/K. There is quite a bit of aluminum in the battery as well (0.9 J/g/K), so I suggest we use 0.8 J/g/K for the entire pack. This means the temperature delta from ambient you calculated would have to be multiplied by 5.22 (= 4.18/0.8):
Adjusting your figures for a 650 lb battery pack, this yields (51/650 * 5.22) ≈ 0.41 deg F per min, or 12.5 F rise at 30 kW for 30 minutes. And using the handy quadratic scaling for the 15 and 50 kW power levels:
(15/30)^2 * 12.5 = 3.1 F at 15 kW for 30 min.
(50/30)^2 * 12.5 = 34.7 F at 50 kW for 30 min.
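The specific-heat adjustment and the quadratic power scaling fit in a short sketch (assuming the 0.8 J/g/K pack-average figure from the conference report):

```python
# Redo the worst-case rise with a lithium-ion pack specific heat of 0.8 J/g/K
# instead of water's 4.18 J/g/K; the ratio 4.18/0.8 = 5.22 scales the earlier
# BTU-based figures directly.
CP_WATER = 4.18   # J/g/K, implicit in BTU-based arithmetic
CP_PACK = 0.8     # J/g/K, pack-average estimate from the conference report

base_rise_30kw = 2.4 * CP_WATER / CP_PACK   # ~12.5 F at 30 kW for 30 min

def rise_at(power_kw, base_kw=30.0, base_rise=base_rise_30kw):
    """Scale the 30-minute rise quadratically with power (I^2*R loss)."""
    return base_rise * (power_kw / base_kw) ** 2

for p in (15.0, 30.0, 50.0):
    print(f"{rise_at(p):4.1f} F at {p:.0f} kW for 30 min")
```

Note the quadratic scaling assumes the extra power comes from higher current into a fixed resistance, which holds as long as pack voltage doesn't sag too much.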
The 3.1 F is what I roughly measured during a 31-minute test run with about 15 kW average power output this week. It would be interesting to get some other power levels for comparison. This would also mean that the majority of the waste heat is contained within the battery pack itself, and that it's not dispersed by the chassis of the vehicle.
Although this might seem odd, I believe that I measured elevated temps at the lower door body panels, and was able to see the consequences of solar loading as well. Both of these effects are measurable, but fairly benign. They yielded about 1 or 2 degrees difference.
The same process that causes the t^0.5 loss of capacity also raises the battery impedance, so this heating will rise with age, but it should still not be a concern under most circumstances.
Yes, I thought so too, and I was hoping we would see the square-root-of-time relationship we saw in the NREL report. However, given the anecdotal evidence of disproportional range loss, and the fact that I'm unable to get the same energy economy as last year when following the same driving protocol, I have a sneaking suspicion that internal resistance might be rising faster than anticipated. I'm really curious whether Phil will still be able to get 92 milliohm six months from now.
The graph I posted in another thread shows the impedance rising rather rapidly at very low SOC, so high power draw there is not advised (and is ultimately limited by the BMS). It also shows the charging process as being slightly endothermic at low SOC levels, and then becoming exothermic as the SOC level rises. This is consistent with the QC tapered charging profile.
I don't disagree with the report, and I find it interesting. However, I did not find a reference to the lithium-ion chemistry they used, and despite trying, I was unable to find any evidence that the charging process is endothermic at any SOC. I used a Fluke 62 IR gun to collect quite a bit of data on my own vehicle, and on others as well. Additionally, I found the quote I've been looking for: according to members of the Leaf design team, the vehicle develops more waste heat during charging than during vehicle operation. I believe this contradicts the report you found, and it would be another reason to read it critically.
The picture of the Tesla Roadsters cooling off after short runs at high power was dramatic. The Model S is supposed to have more robust cooling of both its battery and motor.
Indeed! Both of the Tesla vehicles use cylindrical 18650 cells, which supposedly have much worse thermal properties than the pouch cells used in the Leaf and in the ActiveE. We knew from last year that the Leaf would perform well, but I was baffled by the ActiveE's poor showing. I collected some additional data on my own vehicle during the course of this week, and couldn't help noticing that battery temperature can rise 20 F above ambient when driving the ActiveE hard on the freeway for 10-15 minutes.
This simply isn't the case with the Leaf, as your calculations have shown above, and it must be the result of the ActiveE's higher motor power, battery insulation, and smaller battery mass. I believe they used NMC, which is a bit more energy dense than what the Leaf is using. When you realize that BMW starts limiting motor power around 102 F, which roughly corresponds to seven temp bars on the Leaf, you know why the ActiveE didn't place better.
I just find it surprising that active battery cooling didn't buy BMW anything on the track last Sunday.