Extra Battery, How to Integrate with 24kWh Traction Battery?

My Nissan Leaf Forum

Trying to understand how DCQC works without a firm grasp of basic power electronics and switching power supplies will always leave you with a bunch of incongruities and confusion, as any simplified layman's model of the components required for a DCQC + electric car system will have essential components removed for clarity. Add in more confusing concepts like requesting a voltage vs. a current and there will be no end to the discussion.

Charging any battery as quickly as possible is done through a constant-current-then-constant-voltage process, where the current into the battery is limited to some maximum amount until the per-cell voltage reaches - in the old Leaf - 4.08V, and then the charger will effectively become a constant voltage supply at that voltage, with the battery drawing whatever current it does at that particular voltage.
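To make the CC-CV handoff concrete, here's a minimal sketch in Python. All constants (internal resistance, capacity, OCV curve) are illustrative assumptions, not Leaf measurements; only the 4.08 V/cell figure comes from the description above.

```python
# Minimal CC-CV sketch. The pack is modeled as an open-circuit voltage
# (OCV) behind an internal resistance; the charger caps current until
# the terminal voltage hits the limit, then holds that voltage and
# lets the current taper. All constants except 4.08 V/cell are assumed.

V_MAX_CELL = 4.08   # per-cell CV limit from the post
N_CELLS = 96        # series cell pairs in a 24 kWh Leaf pack
R_INT = 0.08        # assumed pack internal resistance, ohms
I_MAX = 120.0       # assumed current limit, amps
CAP_AH = 66.0       # assumed pack capacity, amp-hours

def charge_step(ocv):
    """Given pack OCV, return (charge_current, terminal_voltage)."""
    v_max = V_MAX_CELL * N_CELLS                  # ~392 V pack ceiling
    i = max(0.0, min(I_MAX, (v_max - ocv) / R_INT))
    return i, ocv + i * R_INT

def simulate(soc=0.2, hours=1.0, dt_s=60):
    """Integrate state of charge with a crude linear OCV model."""
    for _ in range(int(hours * 3600 / dt_s)):
        ocv = 300.0 + 95.0 * soc                  # assumed 300-395 V OCV span
        i, _v = charge_step(ocv)
        soc += i * dt_s / 3600.0 / CAP_AH
    return soc
```

In the CC phase `charge_step` returns the full current limit; once the terminal voltage pins at 4.08 V x 96, the current tapers toward zero, which is the "battery drawing whatever current it does at that voltage" behavior described above.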

There is no such thing as a perfect current source or a perfect voltage source, and as such any power supply, including a DCQC station, will always be both in some way. Likewise, batteries are not just voltage sources - they have internal resistance and they heat up during use, so there are additional requirements for safely charging and discharging. The battery cannot willy-nilly ask for any amount of current and receive it, nor can the DCQC just supply any amount of current without regard for voltage and temperature. These are parameters that must be communicated back and forth, and these are exactly the kinds of things you see running over the CHAdeMO CAN-bus.

Additionally, there are no voltage sense wires in the CHAdeMO protocol, so the charger has no direct way of knowing the actual battery voltage, net of the resistance of all the wires going to it. It's charging blindly on CAN messages, essentially.

Taking all of this into account, pretty much the only way to ever design a DCQC station is to first let the station voltage-match to the battery, and then increase the supply voltage stepwise up to the point where either the maximum battery voltage (4.08V/cell, approx. 390V in the Leaf) is reached, or until the battery says it can't take any more. Anything else on the HV bus will simply be along for the ride at that point, drawing whatever current it draws at that amount of overvoltage. This is how the extender is charged, and how ANY car with CHAdeMO or CCS will be able to add in an extender and have it quick-charge effortlessly.
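A rough sketch of that "voltage-match, then step up" behavior, modeling the station as a voltage source behind the cable resistance. The cable resistance, voltage ceiling, and step size are assumed values for illustration:

```python
# Sketch of "voltage-match, then step up": the station starts at the
# measured battery voltage and nudges its setpoint upward until the
# requested current flows or the pack ceiling is reached. Cable
# resistance, ceiling, and step size are assumed values.

R_CABLE = 0.03      # assumed cable + connector resistance, ohms
V_CEILING = 390.0   # approx. pack maximum from the post

def step_up(v_battery, i_requested, dv=0.5):
    """Return (supply_setpoint, delivered_current) after ramping."""
    v_supply = v_battery                       # initial voltage-match
    i_out = 0.0
    while i_out < i_requested and v_supply + dv <= V_CEILING:
        v_supply += dv                         # one step per loop tick
        i_out = (v_supply - v_battery) / R_CABLE
    return v_supply, min(i_out, i_requested)
```

Anything else hanging on the HV bus sees the same overvoltage and draws whatever current follows from it, which is exactly why the extender charges along for the ride.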

This is exactly what I see happening in my car. Near empty, I charged at a CHAdeMO station. The station supplied approx. 110A to the car, of which 80 was going into the extender and 30 into the main battery.
 
mux said:
Trying to understand how DCQC works without a firm grasp of basic power electronics and switching power supplies will always leave you with a bunch of incongruities and confusion, as any simplified layman's model of the components required for a DCQC + electric car system will have essential components removed for clarity. Add in more confusing concepts like requesting a voltage vs. a current and there will be no end to the discussion.

Charging any battery as quickly as possible is done through a constant-current-then-constant-voltage process, where the current into the battery is limited to some maximum amount until the per-cell voltage reaches - in the old Leaf - 4.08V, and then the charger will effectively become a constant voltage supply at that voltage, with the battery drawing whatever current it does at that particular voltage.

There is no such thing as a perfect current source or a perfect voltage source, and as such any power supply, including a DCQC station, will always be both in some way. Likewise, batteries are not just voltage sources - they have internal resistance and they heat up during use, so there are additional requirements for safely charging and discharging. The battery cannot willy-nilly ask for any amount of current and receive it, nor can the DCQC just supply any amount of current without regard for voltage and temperature. These are parameters that must be communicated back and forth, and these are exactly the kinds of things you see running over the CHAdeMO CAN-bus.

Additionally, there are no voltage sense wires in the CHAdeMO protocol, so the charger has no direct way of knowing the actual battery voltage, net of the resistance of all the wires going to it. It's charging blindly on CAN messages, essentially.

Taking all of this into account, pretty much the only way to ever design a DCQC station is to first let the station voltage-match to the battery, and then increase the supply voltage stepwise up to the point where either the maximum battery voltage (4.08V/cell, approx. 390V in the Leaf) is reached, or until the battery says it can't take any more. Anything else on the HV bus will simply be along for the ride at that point, drawing whatever current it draws at that amount of overvoltage. This is how the extender is charged, and how ANY car with CHAdeMO or CCS will be able to add in an extender and have it quick-charge effortlessly.

This is exactly what I see happening in my car. Near empty, I charged at a CHAdeMO station. The station supplied approx. 110A to the car, of which 80 was going into the extender and 30 into the main battery.

Thanks for your primer on DCQC. Sorry to divert the thread. Please continue providing updates on your progress.
 
Many of you know that we (Quick Charge Power) build JdeMO, which is an aftermarket CHAdeMO fast charge kit for the 2012-2014 Toyota RAV4 EV, as well as the 2008-2011 Tesla Roadster. The kit for the 2014 - 2018 Mercedes B-Class ED / B250e is underway for the fall of 2018. All of these cars use a Tesla battery.

I’m not sure why the above poster doesn’t believe that there is a battery voltage measuring capability for the charger. It’s certainly in the spec. In addition, the max battery voltage is announced via CAN, as well as max amps.

Fundamentally, this battery voltage measuring happens at the moment that the contactors close on both the charger and vehicle (after the plug has locked, the only time it is safe to do this measurement).

The charger must physically measure a CHAdeMO-compatible voltage of 50 to 500 volts DC before it will allow any power to be sent to the battery. The future spec will be 50-1000 volts.

The vehicle is sending one simple control message via CAN... how many amps, in 100 ms increments. The zero-amp message is a shutdown. At the moment both charger and vehicle high-voltage contactors close, the charger physically measures a CHAdeMO-compatible 50-500 volts, and the vehicle begins sending its amp request at a ramp-up rate of up to 20 amps per second.
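The ramp described here is easy to sketch: one amp value per 100 ms message, climbing no faster than 20 A per second toward the target. Message framing and CAN IDs are omitted on purpose; this is just the ramp arithmetic:

```python
# The amp-request ramp in code: one value per 100 ms CHAdeMO current
# message, rising no faster than 20 A per second. Framing and CAN IDs
# are deliberately omitted.

RAMP_A_PER_S = 20.0
PERIOD_S = 0.1                  # current-request message period

def ramp_requests(target_amps, n_messages):
    """Return the amp value carried by each successive message."""
    amps = 0.0
    out = []
    for _ in range(n_messages):
        amps = min(target_amps, amps + RAMP_A_PER_S * PERIOD_S)
        out.append(amps)
    return out
```

At the full 20 A/s rate, a 120 A request takes 60 messages (about 6 seconds) to reach full current; a zero-amp message at any point is the shutdown signal.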

The current maximum amp request from the vehicle is 400 amps. The 2011-2018 LEAF is 125 amps maximum, and the 2019 LEAF e+ with liquid cooled LG-Chem cells will be 200 amps maximum.

Prior to the contactors closing, a series of analog enable signals are exchanged (0-12 volts DC), as well as basic digital CAN messages (max amps, max voltage).

The plug is locked to the vehicle when this communication begins. The charger performs a 0-500 volt insulation/isolation test while still disconnected from the vehicle battery, looking for any reason not to start a charge with deadly high-voltage DC.

t0 --------- contactors close
t0 + 500ms - charger confirms 50-500 volts DC from vehicle
t0 + 1sec -- up to 20 amps requested by vehicle
t0 + 5sec -- 120 amps requested (assuming this is max for vehicle)
t0 + XX min - maximum battery voltage reached as determined by vehicle (395-400V in LEAF)
t0 + XX min - vehicle amp request lowered to not exceed max voltage
t0 + end --- vehicle determines charge complete, sends zero amp message, both contactors open
t0 + end+ -- charger measures zero voltage, unlocks plug
 
lorenfb said:
3. We are both attempting to determine how the DCQC might function with two batteries connected in parallel and where one battery is controlling the total charging current. Unless you can document that the DCQC is a current source, the charging method, whether using L1/L2 or DCQC, is the same - using a voltage source input and pulse-width modulating the input voltage to achieve a desired charging current. Obviously the DCQC device becomes much more costly for the electric company providing the charging station, and they're no longer providing a simple service as usual, which is just supplying a simple voltage to its users.

It’s great to see that you are consistent in your cluelessness with electronics as you are about Tesla.

You quite literally don’t bother to know or understand anything; you just spew.
 
TonyWilliams said:
(...)

I’m not sure why the above poster doesn’t believe that there is a battery voltage measuring capability for the charger. It’s certainly in the spec. In addition, the max battery voltage is announced via CAN, as well as max amps.(...)

My comment is specifically about there not being a continuous battery voltage measurement. There are no sense wires, the only thing CHAdeMO does is measure the no-load voltage at the beginning of charge. It cannot do a continuous voltage measurement on the actual battery poles to do a proper CCCV. Any measurement under load will necessarily include the parasitic resistance across the wires, which is very significant at decent charging rates. This also limits the usefulness of DCQC at high SoC, as the charger is very conservative with its termination voltage. Most CHAdeMO stations don't even have sense wires in the end of the charging cable - they just limit the maximum voltage at the charger output in the 'box', not at the connector end and certainly not at the battery poles. This is at least true for the ABB CHAdeMO chargers that are ubiquitous in the Netherlands.
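The size of the problem is easy to estimate: whatever the station measures at its own output includes the IR drop across the cable, so under load it over-reads the true battery voltage. The 20 milliohm cable figure below is an assumption for illustration:

```python
# Back-of-the-envelope for the IR-drop problem: the voltage the station
# sees at its own terminals includes the drop across the cable, so
# under load it over-reads the true battery voltage. The 20 milliohm
# cable resistance is an assumed, illustrative figure.

def apparent_battery_voltage(v_battery, current, r_cable):
    """Voltage at the charger output while sourcing `current` amps."""
    return v_battery + current * r_cable

# 120 A through 20 mOhm of cable and connectors:
error_v = apparent_battery_voltage(380.0, 120.0, 0.020) - 380.0
# ~2.4 V of apparent overshoot, i.e. roughly 25 mV per cell on a 96s
# pack -- enough that a station terminating on its own measurement has
# to stay conservative near full charge.
```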

That's all. I'm currently reading up on the spec, as I'm designing a CHAdeMO charger to install at home, using a bunch of spare batteries as a high-power buffer.
 
mux said:
TonyWilliams said:
(...)

I’m not sure why the above poster doesn’t believe that there is a battery voltage measuring capability for the charger. It’s certainly in the spec. In addition, the max battery voltage is announced via CAN, as well as max amps.(...)

My comment is specifically about there not being a continuous battery voltage measurement. There are no sense wires, the only thing CHAdeMO does is measure the no-load voltage at the beginning of charge. It cannot do a continuous voltage measurement on the actual battery poles to do a proper CCCV. Any measurement under load will necessarily include the parasitic resistance across the wires, which is very significant at decent charging rates. This also limits the usefulness of DCQC at high SoC, as the charger is very conservative with its termination voltage. Most CHAdeMO stations don't even have sense wires in the end of the charging cable - they just limit the maximum voltage at the charger output in the 'box', not at the connector end and certainly not at the battery poles. This is at least true for the ABB CHAdeMO chargers that are ubiquitous in the Netherlands.

That's all. I'm currently reading up on the spec, as I'm designing a CHAdeMO charger to install at home, using a bunch of spare batteries as a high-power buffer.

That poster always finds a way to promote his products by attacking others on MNL, which has resulted in his being banned multiple times. As a result of his being so personally obnoxious, I attended the Leaf/EV meetings here in SoCal only twice when I first acquired my Leaf in 2013. Just ignore him. He likes to take things out of context to imply a "superior knowledge" and to promote his products. How coincidental was it that he presented a description of how the DCQC functions right after a link (http://mynissanleaf.com/viewtopic.php?f=8&t=6522&start=210#p523066) to a PDF was posted? As you point out, there are many very detailed aspects of the DCQC spec that many, even those with "expert" knowledge, lack and are learning over time.
 
Ah, right, I thought it was a bit out of left field to start calling people names out of nowhere and completely out of context. If there is any advice I can give Mr. Williams, it's: if you want to sell your products, don't be a dick.

Regardless, I don't have a beef with any of you and all I'm after is to either educate, be educated or contribute meaningfully to the highest possible utility of old Nissan Leafs. If you disregard the ad homs, this page has been a very good source of useful DCQC information and discussion. I'm not invested enough in this community to care much about the interpersonal stuff that comes with any community.
 
mux said:
Ah, right, I thought it was a bit out of left field to start calling people names out of nowhere and completely out of context. If there is any advice I can give Mr. Williams, it's: if you want to sell your products, don't be a dick.

Regardless, I don't have a beef with any of you and all I'm after is to either educate, be educated or contribute meaningfully to the highest possible utility of old Nissan Leafs. If you disregard the ad homs, this page has been a very good source of useful DCQC information and discussion. I'm not invested enough in this community to care much about the interpersonal stuff that comes with any community.

I’m not selling products here, as clearly NONE of those listed fit any Nissan vehicle. I’m also not sure why you are responding this way. Truly bizarre.

The poster who I did address is a complete buffoon, and is on my block list. Some others blocked are classics like Ed?? and OrientExpress, too. Bloviating wastes of time.

While I agree that this thread is great, I’ll also point out that I started it. Do I get a brownie point?

6.5 years ago... Extra Battery, How to Integrate with 24kWh Traction Battery?

Tue Oct 25, 2011 9:56 pm
 
lorenfb said:
That poster always finds a way to promote his products by attacking others on MNL, which has resulted in his being banned multiple times. As a result of his being so personally obnoxious, I attended the Leaf/EV meetings here in SoCal only twice when I first acquired my Leaf in 2013. Just ignore him. He likes to take things out of context to imply a "superior knowledge" and to promote his products. How coincidental was it that he presented a description of how the DCQC functions right after a link (http://mynissanleaf.com/viewtopic.php?f=8&t=6522&start=210#p523066) to a PDF was posted? As you point out, there are many very detailed aspects of the DCQC spec that many, even those with "expert" knowledge, lack and are learning over time.

Yes, I quickly read that link, and became an “expert” in 5 minutes. How on God’s earth did we get that CHAdeMO stuff working and sold to hundreds of customers worldwide for the past several years? Maybe you’ll claim next that you were the brains behind it all? Must be “lorenb logic”, the same logic applied in your many flawed posts.

Listen, I doubt that I could pick you out of a line up of “angry old folks”, so even if you ever were at our group breakfast meetings, who would know? Not me, and not anybody that I know or respect. Maybe you’re paranoid?

I hope you are still losing your butt shorting TSLA, too. I’m smiling big knowing there are people like you out there.
 
mux said:
TonyWilliams said:
(...)

I’m not sure why the above poster doesn’t believe that there is a battery voltage measuring capability for the charger. It’s certainly in the spec. In addition, the max battery voltage is announced via CAN, as well as max amps.(...)

My comment is specifically about there not being a continuous battery voltage measurement. There are no sense wires, the only thing CHAdeMO does is measure the no-load voltage at the beginning of charge. It cannot do a continuous voltage measurement on the actual battery poles to do a proper CCCV. Any measurement under load will necessarily include the parasitic resistance across the wires, which is very significant at decent charging rates. This also limits the usefulness of DCQC at high SoC, as the charger is very conservative with its termination voltage. Most CHAdeMO stations don't even have sense wires at the end of the charging cable - they just limit the maximum voltage at the charger output in the 'box', not at the connector end and certainly not at the battery poles. This is at least true for the ABB CHAdeMO chargers that are ubiquitous in the Netherlands.

That's all. I'm currently reading up on the spec, as I'm designing a CHAdeMO charger to install at home, using a bunch of spare batteries as a high-power buffer.

But, there is a continuous measuring of both pack and cell pair voltages. It’s just done in the vehicle. The BMS is monitoring that, and sending that amp request every 100ms.

Sounds like a fun project that you have started. One thing to understand in any of the various DC fast charge standards is that the vehicle ALWAYS controls the charge rate. The charger is a dumb box, doing what it is told, within the programmed limits.
 
TonyWilliams said:
But, there is a continuous measuring of both pack and cell pair voltages. It’s just done in the vehicle. The BMS is monitoring that, and sending that amp request every 100ms.

Sounds like a fun project that you have started. One thing to understand in any of the various DC fast charge standards is that the vehicle ALWAYS controls the charge rate. The charger is a dumb box, doing what it is told, within the programmed limits.

So my takeaway from this is that in this parallel battery scenario, the BMS for the main batt is going to request a certain amperage based on the state of the main battery, and the add-on is going to take some of that, slowing the charge rate. The upshot being that the whole charging process will go slower than it theoretically could if additional amperage for the add-on were requested. This will be tempered by the amount of time the charger spends at its maximum output, where we couldn't charge any faster anyway.
 
TonyWilliams said:
mux said:
That's all. I'm currently reading up on the spec, as I'm designing a CHAdeMO charger to install at home, using a bunch of spare batteries as a high-power buffer.

Sounds like a fun project that you have started. One thing to understand in any of the various DC fast charge standards is that the vehicle ALWAYS controls the charge rate. The charger is a dumb box, doing what it is told, within the programmed limits.

Yes, this is very much what attracted me to this idea for a project. It's a lab power supply! It's a welding inverter with CAN! It's just such a useful thing to have and right up my alley (I'm a power electronics design engineer by trade). I've got some even better ideas that can probably enable a CHAdeMO charger for specific vehicle types with no dc/dc converter on a shoestring budget. Anyway, that's a story for another time.

More on-topic: I expect my CAN spoofer hardware to arrive in about two weeks. I've finished reverse engineering my extender battery's CAN protocol, which enables me to read its cell voltages, balance the pack and gauge remaining capacity. I'm also hoping to procure a few CAN current sensors. All of this should combine to form a system that shows the correct remaining range in the GoM, reports the correct remaining charge time, etc. I should also be able to precisely calculate how much of the juice is coming from the main pack and from the extender. If and when I get all of that done, I think this will be pretty much exactly what people would want from a battery extender. This will all get tutorialized.
 
davewill said:
So my takeaway from this is that in this parallel battery scenario, the BMS for the main batt is going to request a certain amperage based on the state of the main battery, and the add-on is going to take some of that, slowing the charge rate. The upshot being that the whole charging process will go slower than it theoretically could if additional amperage for the add-on were requested. This will be tempered by the amount of time the charger spends at its maximum output, where we couldn't charge any faster anyway.

To the (original) BMS, the additional battery should just look like a power transmission or efficiency loss. In other words, can the BMS tell the difference between a high resistance cord and the additional battery? Eg, the BMS requests X amps, monitors the cell voltages, sees they are at a certain state, requests X+Y amps, repeats until the cell voltage feedback reaches the target. The BMS can't tell whether power is being lost in the cable or to the add-on battery, it only sees what effect the amperage request is having on the cells it is measuring and adjusts the amperage request accordingly. Of course, there may be some sanity checking limits on the expected behavior that the BMS will not exceed (I hope so!) but it's possible that if the current draw from the additional battery is small enough relative to the normal range of operation that the BMS will charge at full speed (assuming no other limits).
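This view can be put into a toy model: two packs tied to one HV bus split the incoming current according to their open-circuit voltages and internal resistances. The numbers below are picked purely to reproduce the ~30 A / ~80 A split reported earlier in the thread, not measured values:

```python
# Toy model of two packs in parallel on one HV bus: solve the bus
# voltage from Kirchhoff's current law, then read off each pack's
# share of the charge current. OCVs and resistances below are chosen
# purely to reproduce the ~30 A / ~80 A split from the thread.

def split_current(i_total, ocv_a, r_a, ocv_b, r_b):
    """Return (current into pack A, current into pack B)."""
    # KCL at the bus: (v - ocv_a)/r_a + (v - ocv_b)/r_b = i_total
    v_bus = (i_total + ocv_a / r_a + ocv_b / r_b) / (1 / r_a + 1 / r_b)
    return (v_bus - ocv_a) / r_a, (v_bus - ocv_b) / r_b

# Main pack sitting at a slightly higher OCV than the extender, so the
# extender soaks up most of the current:
i_main, i_ext = split_current(110.0, ocv_a=362.0, r_a=0.1,
                              ocv_b=357.0, r_b=0.1)
```

With both packs at equal resistance, a 5 V OCV gap is enough to send 80 of the 110 A into the lower pack; the BMS only sees its own cells, so from its side the extender's share just looks like extra load, exactly as described above.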
 
goldbrick said:
davewill said:
So my takeaway from this is that in this parallel battery scenario, the BMS for the main batt is going to request a certain amperage based on the state of the main battery, and the add-on is going to take some of that, slowing the charge rate. The upshot being that the whole charging process will go slower that it theoretically could if additional amperage for the add-on were requested. This will be tempered by the amount of time the charger spends at it's maximum output where we couldn't charge any faster anyway.

To the (original) BMS, the additional battery should just look like a power transmission or efficiency loss. In other words, can the BMS tell the difference between a high resistance cord and the additional battery? Eg, the BMS requests X amps, monitors the cell voltages, sees they are at a certain state, requests X+Y amps, repeats until the cell voltage feedback reaches the target. The BMS can't tell whether power is being lost in the cable or to the add-on battery, it only sees what effect the amperage request is having on the cells it is measuring and adjusts the amperage request accordingly. Of course, there may be some sanity checking limits on the expected behavior that the BMS will not exceed (I hope so!) but it's possible that if the current draw from the additional battery is small enough relative to the normal range of operation that the BMS will charge at full speed (assuming no other limits).
Or look like an accessory.

You can run your lights, A/C, heater, etc. while charging off of CHAdeMO or EVSE. That too is current diverted from the battery.
 
IssacZachary said:
You can run your lights, A/C, heater, etc. while charging off of CHAdeMO or EVSE. That too is current diverted from the battery.

In principle, yes. The difference is that the car and BMS can be aware that the heater is on, while the additional battery is invisible to the BMS except as an (unaccounted-for) power sink.
 
goldbrick said:
IssacZachary said:
You can run your lights, A/C, heater, etc. while charging off of CHAdeMO or EVSE. That too is current diverted from the battery.

In principle, yes. The difference is that the car and BMS can be aware that the heater is on, while the additional battery is invisible to the BMS except as an (unaccounted-for) power sink.
True. But it would be hard to make a system that can monitor where every last milliamp goes. The car has an idea of what power the accessories are using, but doesn't say "5 amps to A/C, 2 to inverter and 51 to battery, please send 58 amps through CHAdeMO." It's more like, "Some power might be used by accessories. But what voltage is the battery at? Send more or less current according to what the battery is doing."
 
IssacZachary said:
True. But it would be hard to make a system that can monitor where every last milliamp goes. The car has an idea of what power the accessories are using, but doesn't say "5 amps to A/C, 2 to inverter and 51 to battery, please send 58 amps through CHAdeMO." It's more like, "Some power might be used by accessories. But what voltage is the battery at? Send more or less current according to what the battery is doing."

It does make guesses for the energy usage screen on the Nav. Presumably it could do the same when charging. Next time, I'll have to check whether it actually requests more power from the DCQC station when the AC/heater is on.

Honestly, I find it quite surprising that it doesn't see such a large difference between requested power and power into the battery and stop the charge. Fortunately for us, but it's a strange omission: it can sense milliamps flowing through a ground short via GFCI, yet 25 kW disappearing into another battery it doesn't know about is no issue!
 
Yes. If there is more power available from the DC station, the vehicle will consume it. When consumers are activated (AC, PTC, etc.) they all act as one big consumer. Another battery will act the same. Power is measured by the DC station (blue) but not by the battery (green).

Notice me turning on the heater in the middle of the DC charge cycle: power that goes into the battery continues to drop, but current supplied by the DC station spikes and then, later on when the cabin heats up, is hardly noticeable (less than 1 kW). At the moment I took the screenshot I opened the door and the HVAC cranked up again; notice 8.4 kW battery input and 11.8 kW DC station output.

Half of the graph is filled with home charging (last drops) before my departure. Note that maximum current (120 A) was available for a few minutes, and then it started dropping rapidly (the first cell reached 4.1 V early) - chilly battery (+15 °C).

Screenshot_20180330-182327.png
 
Re this entire scenario: yes, it is technically possible for the car to accurately account for where all the current is going. But there is well over 10kW worth of non-charging consumers on the car that do not contain any kind of power measurement devices. Power measurement is hella expensive - CAN current sensors that are accurate enough to do this stuff are $300-400 a pop. Imagine having to put one on the AC, heater, DC/DC, battery PTC heater... that's a massive expense on a car that's already being squeezed as far as budget goes. It's simply not practical, so the manufacturer tries to get away with the absolute minimum possible. The only current measurement in the entire car is one shunt in the battery pack, and that's it. The rest of the stuff you see on the energy info screen is just guesswork by the infotainment computer.

Combine this with the inability of the charger itself to accurately account for where the current goes and how much voltage drop there is everywhere, and any kind of closed-loop system that tries to spot issues with excess current draw is going to throw false positives all the time. This is not robust nor, honestly, necessary for safety. The car doesn't explode if it draws excess current, the car explodes if battery temps go haywire. So that's directly measured, and the excess current is just... whatever. The car doesn't care.
 
lorenfb said:
And this guy...
Did you really just cherry-pick a 7-year-old post? I suppose nobody could learn anything in 7 years? And of course he couldn't hire smart people to help with the development!

mod note: I deleted the post in question. I don't have time to search every thread for such occurrences. Please report posts if necessary.
 