How Hard is it to Integrate Renewable Energy into the Electric Grid?
- Jul 4, 2013 2:00 am GMT
- Not all megawatt-hours (MWh) are created equal. The value of a MWh depends on location, dispatchability, and generator technology.
- While not all MWhs are equal, we typically reward renewable and distributed generators with flat $/MWh payments that ignore differences in location, dispatchability, and technology.
- It’s time to change the structure by which non-utility generators are paid to properly value the attributes of different MWhs.
- Getting the incentives right will make integrating large amounts of intermittent wind and solar into the grid much easier.
In January, EEI said [PDF] that the incentives paid to renewable energy are going to jeopardize grid reliability and electricity costs. Many enviros have responded that this represents nothing more than the utility industry’s naked self-interest. Who’s right?
I’d suggest neither. The true problem with the EEI report, as I pointed out here, is that it confuses (perhaps intentionally) a pricing-structure problem with a technology problem. There are technical issues, but they result from the structure of clean-energy incentives, not the technologies or the total volume of the incentive.
So there’s an easy solution: change the structure by which non-utility generators are paid. This is as logical as it is politically impossible. Utilities (like all businesses) are loath to set prices for the benefit of their competitors, and enviros are loath to concede that tax credits and renewable portfolio standards, won after hard-fought battles, might have structural flaws.
But it’s gotta happen. We have to decarbonize the grid as fast as possible without compromising cost or reliability — and the status quo won’t get us there.
Here, then, is my effort to outline the key technical issues and how better economic signals might work. First, a few electricity basics you may remember from high-school physics:
- Power = energy divided by time. A watt-hour is a unit of energy. A watt-hour per hour is a watt, which is a unit of power.
- Electric devices need power and energy in precise volumes. Lighting a 60 watt bulb for 8 hours requires 60 x 8 = 480 watt-hours — but if you try to zap the bulb with 480 watts for one hour, or dribble in one watt for 480 hours, it’s going to be dark.
- Power (watts) = voltage x current.
- Electric devices also need voltage and current at precise levels.
- Voltage = current x resistance.
- Electric power systems can be designed with “direct” or “alternating” current (this refers to whether current moves in one direction or oscillates). The modern electric grid uses alternating current (“AC”).
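The basic relationships above can be sketched as a few toy Python functions (the 60 watt bulb and the 120 V / 60 A circuit are the examples from the list; everything else is just unit bookkeeping):

```python
# Toy illustrations of the basic electricity relationships
# (units: watts, volts, amps, ohms, hours).

def energy_wh(power_watts, hours):
    """Energy = power x time (watt-hours)."""
    return power_watts * hours

def power_w(volts, amps):
    """Power = voltage x current (watts)."""
    return volts * amps

def voltage_v(amps, ohms):
    """Voltage = current x resistance (Ohm's law)."""
    return amps * ohms

# The 60 watt bulb lit for 8 hours:
print(energy_wh(60, 8))    # 480 (watt-hours)

# A 120 V, 60 A circuit at peak:
print(power_w(120, 60))    # 7200 (watts)
```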
Now, let’s look at what follows from these six points.
First, energy storage and generation are complementary, not competitive, technologies. They can work well together, but neither is particularly good at doing the other’s job.
As a general rule, energy storage is a cheap source of power and an expensive source of energy, while energy generation is a cheap source of energy and an expensive source of power. This makes storage very good at providing short bursts and generation good at keeping the lights on. That’s why you use a battery to start your car but a gas engine to keep it going.
The time-dependent nature of energy and power makes this true even for technologies that haven’t been invented yet. The physical size of an energy storage technology is a function of how much energy it stores, while the physical size of a generator is a function of its peak power output. Increasing physical size means investing more capital, which is always expensive in both time and dollars. By contrast, incremental decisions to open a throttle a little wider are cheap, operating-level decisions. For energy storage, opening the throttle increases the energy extracted per unit time, which is power. For energy generation, opening the throttle brings in a little more fuel and generates more energy.
As a result, the lowest cost way to run a power grid will always be to maximally use energy storage technologies to meet peak power needs and maximally use generation technologies to meet energy needs. Energy storage can play a bigger role on the grid, but it won’t eliminate the need for a “just in time” linkage between generation and load.
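The trade-off can be made concrete with a toy cost model. All dollar figures below are hypothetical placeholders, not real market prices: the point is only the shape of the costs, with storage scaling mostly with energy stored and generation scaling mostly with peak power plus fuel per kWh.

```python
# Hypothetical cost shapes (placeholder numbers, not real prices):
# storage pays mostly per kWh stored; generation pays mostly per kW
# of capacity, plus a small fuel cost per kWh produced.

def storage_cost(peak_kw, kwh):
    # Hypothetical: $100/kW of power electronics + $300/kWh of cells.
    return 100 * peak_kw + 300 * kwh

def generator_cost(peak_kw, kwh):
    # Hypothetical: $800/kW of turbine + $0.05/kWh of fuel.
    return 800 * peak_kw + 0.05 * kwh

# A short 1 MW burst for 15 minutes (250 kWh): storage wins.
print(storage_cost(1000, 250) < generator_cost(1000, 250))            # True

# The same 1 MW sustained for a year (8,760,000 kWh): generation wins.
print(storage_cost(1000, 8_760_000) > generator_cost(1000, 8_760_000))  # True
```

Whatever the real numbers are in any given year, this shape is why storage serves peaks and generation serves energy.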
This leads to a second issue: the grid must maintain and control generation that can instantaneously ramp up and down in response to changes in load. This constraint shapes both generator technology selection and contract structure; each is necessary, but neither alone is sufficient.
Historically, this led grid managers to monitor load and direct generators to ramp up or down in response to load variation. As wind and solar penetration has risen, grid managers must now also account for sudden increases or decreases in power output from these weather-dependent generators. The result is operational complexity beyond what current control and contracting schemes were designed to handle. With its high penetration of wind and hydro, the Pacific Northwest is at the cutting edge of these challenges. So far, the region has developed lots of workarounds, from dumping “excess” power into resistor banks to filing lawsuits that challenge existing contracts, but it has not yet found a viable long-term solution.
Third, maintaining grid reliability requires precise synchronization of voltage and current. Since power = volts x amps, and since current (amps) oscillates in an AC system, voltage has to oscillate in precisely the same way. To see why, first consider a 60 amp, 120 volt circuit operating in perfect synchrony.
Watts = volts x amps, so power peaks at (120 V x 60 A) = 7,200 W (7.2 kW).
Now consider the same circuit, but with the current slightly out of phase with the voltage.
The “real” power that comes out of this circuit is still the product of volts and amps, but since the volts and amps now peak at different times, we get less power than before (6.7 kW in this example). The ratio of actual power to theoretical power is the “power factor,” which typically runs between 85% and 95%. As power factor falls, generators still make the same amount of power and burn the same amount of fuel, but less of it reaches the load, so the effect is to lower system-wide fuel efficiency.
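The effect of a phase shift can be checked numerically. This sketch averages instantaneous power v(t) × i(t) over one AC cycle and compares it to the in-phase case; the ratio that falls out is the power factor, cos(φ):

```python
import math

def real_power_ratio(phase_deg, samples=10_000):
    """Average v(t)*i(t) over one AC cycle, relative to the in-phase
    case. The ratio is the power factor, cos(phase)."""
    phi = math.radians(phase_deg)
    shifted = 0.0
    in_phase = 0.0
    for k in range(samples):
        wt = 2 * math.pi * k / samples          # one full cycle of omega*t
        in_phase += math.sin(wt) * math.sin(wt)
        shifted += math.sin(wt) * math.sin(wt - phi)
    return shifted / in_phase

print(round(real_power_ratio(0.0), 3))    # 1.0  (perfect synchrony)
print(round(real_power_ratio(20.0), 3))   # 0.94 (cos 20 degrees)
```

A 20-degree shift already shaves about 6% off delivered power, even though the generator is working just as hard.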
Motors, capacitors, and other electrical devices cause current to shift out of phase with voltage, so power factor degradation is unavoidable and grid managers must take corrective action. The most effective correction comes from power plants sited near the load that use spinning generators able to maintain constant frequency while independently shifting current and voltage to offset grid degradation. This is pretty easy with any power plant that naturally spins at 2,000–7,000 rpm. As it turns out, that is exactly the speed at which steam turbines, reciprocating engines, and gas turbines normally operate.
Unfortunately, lots of emerging generation technologies would prefer to run at lower speeds (windmills), higher speeds (microturbines), or don’t spin at all (solar panels, fuel cells). That’s not a particularly big deal, except that as these new technologies serve an ever greater portion of the load, it gets relatively harder to maintain high system power factors.
Finally, location matters. The resistance of a wire is a direct function of its length. Since voltage drop is the product of current and resistance, the more wire that separates a generator from the load, the greater the voltage drop along that wire, and the greater the resistive energy loss (which scales with the square of the current). These line losses typically run 3-5% on average, but increase dramatically during peak periods when wires are congested, often exceeding 20%. This means not only that we have to burn more fuel to generate the same amount of useful energy, but also that we must over-invest in power generation capacity in any system with a preponderance of remote generators.
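A toy single-conductor model makes the point (the line parameters below are hypothetical, and real transmission is three-phase AC with reactance, so treat this as directional only). Because resistive loss scales with the square of the current, doubling the delivered load roughly doubles the loss *fraction*, not just the absolute loss:

```python
def line_loss_fraction(load_kw, line_volts, ohms_per_km, km):
    """Fraction of generated power dissipated as I^2*R heat in the wire
    (simple series-resistance model; illustrative only)."""
    resistance = ohms_per_km * km
    current = load_kw * 1e3 / line_volts       # I = P / V at the load
    loss_w = current ** 2 * resistance         # I^2 * R
    return loss_w / (load_kw * 1e3 + loss_w)

# Hypothetical 115 kV line at 0.1 ohm/km, 100 km long:
off_peak = line_loss_fraction(10_000, 115_000, 0.1, 100)   # 10 MW load
peak = line_loss_fraction(20_000, 115_000, 0.1, 100)       # 20 MW load
print(f"{off_peak:.2%} vs {peak:.2%}")   # peak fraction is roughly 2x off-peak
```

Congestion makes real peak-period losses far worse than this simple quadratic suggests, but the direction is the same: remote generation gets progressively more expensive exactly when power is most needed.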
The bottom line is that not all MWh are equal:
- It takes fewer MWh of generation to serve a MWh of load if that MWh is generated near the load.
- The ability to produce (or curtail) peak power output at a moment’s notice is valuable regardless of actual MWh generated.
- 1 MWh from a generator that can boost system power factor is worth more than 1 MWh to the system. The reverse is also true.
A key point is that none of these values depends on MWh output, on the fuel used, or on power plant ownership. They depend solely on location, dispatchability, and generator technology. And yet most of the rules reward generators with $/MWh payments (PTCs, RECs, etc.) that are a function of fuel type and of whether your ownership structure allows you to monetize tax attributes.
Which means that EEI is right when they say that current incentives for renewable energy are leading to sub-optimal capital allocation. But that’s not because renewable energy is a dangerous thing — it’s because it’s compensated in the wrong way.
In the course of putting a premium on clean energy to try and monetize externalities, we’ve created a set of economic incentives that don’t map very well against the economics of grid operation. That’s a fixable problem — but only if we first admit that there are legitimate technical issues that can be addressed with better economic signals.
Price it right, and they will come.
This would be an ideal first initiative for new Federal Energy Regulatory Commission Chairman Binz…