WetEV wrote:
GRA wrote:
Note that utilities often list sites as so many (k/M/G/) Watts of storage, because they are concerned with meeting power demand, so it's not clear to me if the "1,000 MW" (i.e. 1 GW) number should be MWh or not.
There is both energy storage in MWh, and a rate you can take it out at in MW. Both matter.
If the primary purpose is network stabilization, then the MW rating is more important. If the primary purpose is energy storage, then the MWh rating is more important.
Of course, but I was trying to prevent the almost inevitable posts decrying the use of MW instead of MWh. Utilities list Watts instead of Watt-Hours for a reason, not because they don't know the difference. It could just be a typo, but the source says "Initially developing enough energy storage to completely serve the needs of 150,000 households for an entire year". Dividing 1 GW by 150,000 households gives an average draw of 6.67 kW per household; over an entire year (8,766 hours) and across all 150,000 households, that works out to roughly 8.77 TWh, if I haven't misplaced a decimal somewhere. That assumes a constant 1 GW draw and no replenishment over that period, so actual storage could be considerably less, and of course usage would vary widely during the day. The important thing is that they're talking about storage that can supply up to 1 GW of power for a prolonged period of time, and do so economically (we'll have to see about that).
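For anyone who wants to check that arithmetic, here's a quick back-of-envelope sketch in Python. The 1 GW rating and 150,000-household figure are taken from the article; everything else is just unit conversion, and the battery example at the end is purely illustrative.

# Back-of-envelope check of the storage figures above.
POWER_GW = 1.0            # plant rating from the article, in GW
HOUSEHOLDS = 150_000      # households served, from the article
HOURS_PER_YEAR = 8_766    # average year length, including leap years

power_kw = POWER_GW * 1_000_000                    # 1 GW = 1,000,000 kW
draw_per_household_kw = power_kw / HOUSEHOLDS
print(f"Average draw per household: {draw_per_household_kw:.2f} kW")   # ~6.67 kW

# Energy needed to sustain 1 GW for a full year with no replenishment
energy_kwh = power_kw * HOURS_PER_YEAR
energy_twh = energy_kwh / 1e9                      # 1 TWh = 1e9 kWh
print(f"Constant 1 GW draw over a year: {energy_twh:.2f} TWh")         # ~8.77 TWh

# MW vs. MWh in one line: hours of full-power output = MWh / MW,
# so a hypothetical 1,000 MW / 4,000 MWh battery runs 4 hours flat out.
print(f"Example duration: {4_000 / 1_000:.1f} hours at full power")

Running it gives ~6.67 kW per household and ~8.77 TWh, matching the numbers above.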
California has a requirement for utilities to have 1.3 GW of storage by 2020, although that is for peaking/reliability rather than long-term storage:
https://www.energy.ca.gov/renewables/tr ... torage.pdf
Here's part of that, using batteries:
Storage will replace 3 California gas plants as PG&E nabs approval for world's largest batteries
https://www.utilitydive.com/news/storag ... rl/541870/