
Sounds...about right. I'd caution you against ever including "averages" in your calculations - every utility builds its system to Max Demand + 20%. That margin is what it takes to ensure 99.99% reliability.

It is worth discussing - why 99.99%? Why not just 99%? Well, 99% reliability = about 3.5 days (87.6 hours) of no power per year, likely all at once. 99.9% = about 8.8 hours without power per year. Again, those hours tend to happen all at once, so over 10 years you'll see one period of no power lasting close to four days. 99.99% reliability = about an hour a year, or roughly 8-9 hours over 10 years. That is marginally acceptable. Most utilities try to operate at 99.999% reliability. That means your system is down for about 1 hour every 10 years.
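The arithmetic is simple enough to check yourself - downtime is just (1 − reliability) × 8,760 hours in a year. A quick sketch (the 8,760-hour year is the only input):

```python
# Downtime implied by each reliability level, per year and per decade.
HOURS_PER_YEAR = 8760  # 365 days * 24 hours

for reliability in (0.99, 0.999, 0.9999, 0.99999):
    down_hours = (1 - reliability) * HOURS_PER_YEAR
    print(f"{reliability:.3%} reliable -> {down_hours:7.2f} h/yr "
          f"({down_hours * 10:7.2f} h per decade)")
```

Running it gives 87.6 h/yr at 99%, 8.76 h/yr at 99.9%, 0.88 h/yr at 99.99%, and about 0.88 h per decade at 99.999% - the numbers above, without the rounding.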

Power outages are no joke - they kill people. We try to limit them for that reason. They also cost billions in lost productivity every time they happen. The GDP of Colorado is about $420 billion a year; four days without power would cost on the order of $5 billion. The Northeast blackout of August 14–15, 2003 left New York, NY without power for roughly two days and was linked to an estimated 90 deaths.
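The $5 billion figure is just proportional-GDP arithmetic - and a conservative one, since it assumes output scales linearly with powered time and ignores spoilage and restart costs:

```python
# Rough cost of a multi-day statewide outage, assuming economic output
# is proportional to powered time (a deliberately conservative assumption).
annual_gdp = 420e9   # Colorado GDP, USD/year
outage_days = 4

cost = annual_gdp * (outage_days / 365)
print(f"~${cost / 1e9:.1f} billion")  # ~$4.6 billion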

Niwot Market in Colorado lost power for a single day, and the outage cost the grocery store $35,000. Dividing the store's monetary losses by the kilowatt-hours (kWh) of electricity that went unconsumed puts the cost per lost kWh at an astounding $12.21 - against a normal retail price of about $0.12/kWh.
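That ratio is the point worth sitting with. A sketch of the same "value of lost load" arithmetic, with the unserved energy back-calculated from the figures above (roughly 2,900 kWh - my inference, not a reported number):

```python
# Back-of-the-envelope "value of lost load" from the Niwot Market example.
loss_usd = 35_000          # reported loss for one day without power
cost_per_lost_kwh = 12.21  # reported cost per unserved kWh
retail_price = 0.12        # normal retail price, USD/kWh

unserved_kwh = loss_usd / cost_per_lost_kwh   # ~2,866 kWh never delivered
print(f"unserved energy: {unserved_kwh:,.0f} kWh")
print(f"lost-load premium: {cost_per_lost_kwh / retail_price:.0f}x retail")
```

A lost kWh costs about 100 times what a delivered kWh sells for - which is why utilities will spend heavily on the margin to avoid being the ones who didn't deliver it.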

It's not a joke, and not something people should take casually. Even a few hours of downtime adds up to billions lost. You can't save money by reducing reliability. It's one of the reasons why wind and solar are never actually "cheap" - anything that reduces reliability must be compensated for, often at very high cost, to avoid the even higher cost of a power outage.
