Ability of EV Charger to withstand frequency deviations


Reddy

OK, all of you power electronics engineers, here is a query from a friend of mine studying in the smart grid area. I will PM Ingineer and TonyWilliams. Anyone else who might know more? As a preface, I don't really understand this stuff, so please feel free to reply with technical or dumbed-down information. I will pass it along verbatim. Thanks in advance.
I was looking for some information about the ability of EV chargers to withstand frequency deviations from their supply. I thought this would be a straightforward Google query, but I can't seem to turn up anything that answers what I'm looking for. The closest sources discuss the ability of smart chargers to use frequency as a signal to determine when to charge (e.g. smart grid applications). That's actually the reverse of what I want. I just want to confirm my assumption that EV chargers are relatively insensitive to frequency variations in the AC source, much like other power electronic interfaces.
 
Under-frequency load shedding setpoints range from 59.1 down to 57.8 Hz. Normal operation is 59.95-60.05 Hz, which is pretty tight. Yes, you could accurately measure grid frequency and vary charging power based on a droop curve. I think intelligent dispatch over a communications network is more effective, since knowing the location of the load lets you manage grid congestion and power flows (as well as do other fun stuff), but I digress...
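To make that concrete, here's a toy droop curve in Python. Every number in it (deadband, droop slope, rated power) is made up for illustration and doesn't come from any grid code:

```python
# Toy frequency-droop charge control. All constants are invented for
# illustration; real deadbands and droop slopes would come from a grid
# code or a utility program.

NOMINAL_HZ = 60.0
DEADBAND_HZ = 0.05   # no response inside the normal 59.95-60.05 Hz band
DROOP_HZ = 0.5       # full power swing over 0.5 Hz beyond the deadband

def droop_charge_power(freq_hz, rated_kw):
    """Scale charging power down as grid frequency sags."""
    dev = freq_hz - NOMINAL_HZ
    if abs(dev) <= DEADBAND_HZ:
        return rated_kw                  # normal band: charge at full rate
    if dev < 0:                          # under-frequency: shed load
        shortfall = min(-dev - DEADBAND_HZ, DROOP_HZ)
        return rated_kw * (1.0 - shortfall / DROOP_HZ)
    return rated_kw                      # over-frequency: just stay at rated

for f in (60.00, 59.93, 59.70, 59.50, 59.30):
    print(f"{f:.2f} Hz -> charge at {droop_charge_power(f, 6.6):.2f} kW")
```

The logic is trivial; the hard parts are measuring frequency accurately and deciding who gets to set the curve.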

For the Leaf and most other EVs, grid frequency is indeed irrelevant. There is power factor correction, sure, but everything is getting rectified to DC, then converted to high-frequency AC, then back to DC again (for isolation).

The exception is Tesla, which has given some owners trouble when charging on a non-inverter generator. I thought I read somewhere that they took grid frequency into account, but I can't seem to find it right now. It may just be a voltage drop during startup, as the cars go from no load to full load nearly instantly. Tesla also does some funny stuff measuring voltage on the line to detect bad connections upstream of the car (I have no idea how it does this; perhaps it makes minute changes in current and measures the resulting voltage drop on the line).
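If I had to guess at the mechanism, it would be estimating the source impedance from how much the voltage sags as the current ramps. A toy version, with all numbers and the threshold invented (this is pure speculation on my part, not anything Tesla has published):

```python
# Toy illustration of spotting a weak/bad connection by estimating the
# upstream source impedance from voltage readings at two current levels.
# Entirely a guess at the mechanism; the threshold below is made up.

def source_impedance_ohms(v_light, i_light, v_heavy, i_heavy):
    """Estimate upstream resistance from the sag between two load points."""
    return (v_light - v_heavy) / (i_heavy - i_light)

# Say the car sees 242 V while drawing 2 A, and 236 V once it ramps to 30 A:
z = source_impedance_ohms(242.0, 2.0, 236.0, 30.0)
print(f"Estimated source impedance: {z:.3f} ohms")   # ~0.214 ohms

MAX_OK_OHMS = 0.25   # hypothetical limit
if z > MAX_OK_OHMS:
    print("Line looks weak -- back off the charging current")
```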

Hope that helps!
 
Thanks for the info. Here is more info from my friend that may illuminate the question further:
My question is not so much about the potential for frequency-dependent charging, but rather how sensitive the equipment is to grid frequency conditions. For example, during the system collapse in the 2003 blackout, there were large frequency deviations (+/- 3 Hz) before the over- and under-frequency protection functions tripped the generators. Is there such a thing as over- or under-frequency protection for EV chargers? I know that variations in frequency will affect the performance of the converter and/or the inverter (if a system is allowed to provide energy to the power system). But I'm trying to see if there is any literature addressing when frequency deviation becomes a problem, if there is such a point for these types of power electronics. For generators, it's usually about 3 Hz from nominal. I'd like to know approximately what the value is for these types of power electronic technologies. It doesn't need to be exact, but I would like a justifiable assumption for it in my simulations.
Again, I don't really understand this stuff, but I think the research relates to future grid stabilization with V2G. As I read the quote above, if EV chargers are able to handle "extreme" frequency changes, then they should be able to help stabilize the grid. If EV chargers cannot handle "extreme" frequency changes, then they drop out and the grid loses another buffer against failure. I have no idea at what level such frequency changes would be considered "extreme" (e.g., 0.01, 0.1, or 1 Hz), but that may be another part of the research project. It sounds like this is some type of computerized grid simulation. I don't know if this type of information would be included in standard technical specifications, but that would be ideal. Hmmm, maybe somebody at http://www.manzanitamicro.com/ would know. I'll let my friend know about them.
 
Doesn't sound like your friend is asking about frequency-dependent charging... just whether things are OK when major events happen on the system. There are MW/Hz values for each of the three interconnections in the continental US (East/West/Texas) that, while I probably can't show here, are pretty large. Like losing-multiple-very-large-power-plants large. Usually it's expressed as MW per tenth of a hertz. In other words, if we're down to 58 Hz, unless Texas is having serious issues, chances are thousands of grid operators are not having a good day. ;) Again, the normal frequency band is 59.95 to 60.05 Hz. Tripping of generators at, say, 57 Hz is due to the vibrations/resonances found at those speeds in steam turbines. (Not sure about combustion turbines/jets, but I would think not.) Water turbines have essentially no reason to trip at low frequency, and in one case a certain utility islanded in the 1950s and "survived" with some load at something ridiculously low, like 53 Hz.
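For a back-of-envelope feel for the scale (the response value below is invented, since I'm not quoting the real interconnection numbers):

```python
# Back-of-envelope feel for interconnection frequency response. The beta
# below is a made-up placeholder, NOT a real interconnection value.

BETA_MW_PER_0P1HZ = 2500.0     # hypothetical: MW of response per 0.1 Hz

lost_generation_mw = 1300.0    # roughly one large nuclear unit tripping
freq_dip_hz = 0.1 * lost_generation_mw / BETA_MW_PER_0P1HZ
print(f"Expected dip: about {freq_dip_hz:.3f} Hz")          # ~0.05 Hz

# Conversely, sagging all the way to 58 Hz (a 2 Hz deviation) would imply
# losing an enormous amount of generation:
implied_loss_mw = (2.0 / 0.1) * BETA_MW_PER_0P1HZ
print(f"A 2 Hz sag implies losing ~{implied_loss_mw:,.0f} MW")  # ~50,000 MW
```

Even with a placeholder beta, you can see why being at 58 Hz means a very bad day.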

For the stations, I have not heard anything about incorporating under-frequency load shedding directly into them. I have heard about cold load pickup randomization, where the station randomly adds anywhere from a few seconds to a minute or two before it allows the car to resume charging after power is restored. Honestly, reducing cold load pickup is probably the best we could ask for. Incorporating under-frequency load shedding would be expensive in a station's design, and since the utility is going to have more accurate and regularly tested equipment, the utility's relays are what should be relied upon for that purpose.
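The randomization itself is about as cheap as a feature gets. Something like this sketch (the 120 s window is my guess at "a few seconds to a minute or two"):

```python
# Minimal sketch of cold-load-pickup randomization. The delay window is a
# guess; I haven't seen a published spec for it.

import random
import time

def resume_after_outage(start_charging, max_delay_s=120):
    """Stagger restarts so a feeder full of EVs doesn't slam back on at once."""
    delay = random.uniform(0, max_delay_s)
    print(f"Power restored; waiting {delay:.0f} s before resuming charge")
    time.sleep(delay)
    start_charging()

# resume_after_outage(lambda: print("charging resumed"))
```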

For the cars: the lower the grid frequency, the higher the ripple current that needs filtering ahead of the DC output of the charger. However, since we're in a global market, I believe the chargers are built to handle the worst case on a 50 Hz grid, probably down to 47 Hz. Thus, when we talk about 57-63 Hz, the ripple current is still below spec, and the car probably doesn't care. I only mentioned Tesla previously because I believe they measured and acted upon the grid frequency, not because their chargers couldn't handle the variation.
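Here's the back-of-envelope version of why (a simplified single-phase PFC energy balance; the DC-link voltage and capacitance are my guesses at a ~3.3 kW onboard charger, not Nissan's actual design):

```python
# Why a 47 Hz-capable design shrugs at 57-63 Hz. A single-phase PFC front
# end draws power pulsing at 2x line frequency, and the DC-link capacitor
# buffers it; peak-to-peak ripple is roughly dV = P / (2*pi*f*C*Vdc).
# Vdc and C below are assumed values for illustration only.

import math

P_W  = 3300.0     # charging power
V_DC = 390.0      # assumed DC-link voltage
C_F  = 1.0e-3     # assumed DC-link capacitance (1000 uF)

for f in (47.0, 50.0, 57.0, 60.0, 63.0):
    dv = P_W / (2 * math.pi * f * C_F * V_DC)
    print(f"{f:4.0f} Hz line -> ~{dv:4.1f} V pk-pk ripple on the DC link")
```

47 Hz is the worst case here, so a design that's comfortable there has margin to spare anywhere in 57-63 Hz.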

I talked a lot about grid stabilization, storage, and where I think V2G plays a role in a presentation I gave to the Silicon Valley EAA in March this year. You can watch it here: https://youtu.be/3K8bQ6krsSE?t=17m30s If you send me a PM with your or your friend's email, I can send you the slides. To sum it up, though: I think managing power flows and load demand is far more financially attractive than frequency regulation.
 
JeremyW said:
For the Leaf and most other EVs, grid frequency is indeed irrelevant. There is power factor correction, sure, but everything is getting rectified to DC, then converted to high-frequency AC, then back to DC again (for isolation).

Jeremy:
First, I don't have an electronics or power-supply background, but I have always taken a keen interest in such things. So, the dumb question: why did Nissan design the (3.3 kW, at least) charger with so many sections? Could the design have been simplified to nothing more than AC filtering, PFC/rectification, and DC/DC converter sections (a massive boost??)? This is simply out of curiosity; I'm wondering what I'm missing.
 