A newly released study of Initiative 1631 by the NERA economic consulting group, funded by the No on 1631 coalition, should not be relied on by lawmakers or voters evaluating the initiative. Below, we explore a handful of essential problems with the study. Additional technical flaws exist as well, but are not addressed in this analysis.

Betting Against Renewable Energy

A significant unknown facing climate policy advocates and detractors alike is the rate at which cleaner, greener technologies will become affordable and reliable enough for widespread use. Most studies address this by modeling multiple scenarios that try to capture the uncertainty.

The NERA study is underpinned by a markedly conservative forecast of future clean energy prices and deployment, produced by the EIA. The EIA is generally regarded as a respectable and transparent source of energy-related data. However, its forecasts have come under severe scrutiny, as most of its renewable projections in recent years have failed to track reality. As one peer-reviewed examination of EIA forecasts put it, “most of EIA’s projections for renewables sharply under-projected generation or capacity.” You can dig into this more here, but a sharply conservative forecast for future renewables growth will skew every result built on it, and the NERA study’s failure to also present an ‘optimistic’ case produces an imbalanced result. Which leads us to our second concern:

Where is the Sensitivity Analysis?

Another legitimate challenge facing anyone examining a carbon pricing policy like 1631 is the number of assumptions required. What will the vehicle fleet look like in 10 years? Will the price reduce demand more than one would expect because of ‘tax aversion’ (as was the case with B.C.’s carbon tax)? Will the board overseeing the investments do a great, okay, or poor job making efficient carbon-lowering investments? For most of these questions, reasonable people disagree. A robust study would test whether its results hold up when key assumptions change. Without a sensitivity analysis, we have no idea whether any of the predictions made by the NERA economists will come true unless every single assumption they made also comes true.

Carbon cuts should increase with time, not decrease

A basic tenet of economics is that businesses and consumers respond more to price changes in the long run than in the short run. It’s easy to imagine this in a carbon pricing context: a household probably will not buy a new vehicle in year 1 of a carbon pricing regime, but it may factor the change in the price of carbon-based energy into a purchase it was already planning in year 4 or year 6. A simple explainer of this trend can be found here. The NERA analysis doesn’t follow this pattern, instead projecting emission reductions that peak in 2029 and then weaken over time (see page 27). Perhaps there is a good reason for this trend, but it isn’t explained in the study.

NERA assumes it costs over $1000 to reduce a ton of carbon emissions

Most of our questions about the NERA study are the ones economists and policymakers normally face: how to make the best assumption among imperfect options. However, the NERA study seriously erodes its own credibility in its projection of the impact of the carbon pollution investment board.

Table 7, on page 23 of the report, lays out a revenue projection for the carbon pollution account. One might quibble that the revenue projection is quite high, but let’s set that aside here. In any scenario, a significant sum of money flows to the carbon pollution account. Let’s take NERA’s very high assumption of approximately $2.5 billion in 2035 alone.

NERA assumes (see page 27) that the investment program itself will generate carbon reductions of 2.6 million metric tons of CO2 between 2020 and 2035 (the fee incentive will reduce more on top of the investments). It’s not clear where this figure comes from, but our take is that it fails the sniff test. Even if the investment board achieved all of that carbon reduction in 2035 alone, each ton of carbon reduced would cost nearly $1,000. But this is the assumed reduction over 15 years of spending, meaning each ton reduced would cost many, many thousands of dollars in this scenario.
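The back-of-the-envelope arithmetic above can be checked directly. The two inputs come from the NERA report as cited (Table 7 revenue, page 27 reductions); the calculation itself is ours:

```python
# Implied cost per ton of CO2 reduced under NERA's own assumptions.
revenue_2035 = 2.5e9     # dollars to the carbon pollution account in 2035 alone (Table 7)
reduction_tons = 2.6e6   # metric tons NERA attributes to investments, 2020-2035 (p. 27)

# Generous case: credit only the single year 2035 of revenue against
# the entire 15-year reduction total.
cost_per_ton = revenue_2035 / reduction_tons
print(f"${cost_per_ton:,.0f} per ton")  # ≈ $962 per ton
```

Counting the full 15 years of revenue against the same 2.6 million tons only pushes the implied cost per ton higher.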

We’ve had concerns about the board too, and would want to see the legislature provide robust oversight of the program. However, California’s experience shows that carbon reduction investments can be made at a much more reasonable price. To date, California’s investment program has funded projects credited with 23 million tons of reduced carbon emissions over 4 years. NERA assumes Washington would reduce about one-tenth of that, over 15 years. The NERA projection is untethered from reality. Wonky questions linger about how to count a project that might sequester carbon over 40 years (i.e., do you get credit for 40 years of reductions upfront?), or whether a project would have happened anyway. But California’s real-life experience gives us a data point that discredits NERA’s figures.
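The comparison works out as follows, using the figures cited above (California’s 23 million tons over 4 years versus NERA’s 2.6 million tons for Washington over 15 years):

```python
# Compare NERA's projected Washington reductions to California's record.
california_tons = 23e6    # tons of reductions funded in California over 4 years
nera_wash_tons = 2.6e6    # tons NERA projects for Washington over 15 years

ratio = nera_wash_tons / california_tons
print(f"{ratio:.2f}")  # ≈ 0.11, i.e. about one-tenth, despite nearly 4x the time span
```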

Given the myriad issues with the NERA study, we encourage voters and lawmakers to set it aside, and instead look to the real-life experience of British Columbia and California to gauge the impact of a carbon pricing program. There are elements of the study that NERA gets right: the coal exemptions are concerning, and carbon pricing *does* increase gas prices, usually to the tune of about 1 cent per gallon for every $1 per ton. But the overall conclusions aren’t valuable to the public dialogue about Initiative 1631 and climate action. The study should be discarded by voters and policymakers alike.
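A note on the gas-price rule of thumb mentioned above: it can be derived from the carbon content of gasoline. The figure of roughly 8.9 kg of CO2 emitted per gallon burned is a commonly cited EPA estimate, not a number from the NERA study:

```python
# Why $1 per metric ton of CO2 ≈ 1 cent per gallon of gasoline.
kg_co2_per_gallon = 8.9        # approximate CO2 emitted per gallon of gasoline burned
price_per_metric_ton = 1.0     # dollars per metric ton of CO2

cents_per_gallon = price_per_metric_ton * (kg_co2_per_gallon / 1000) * 100
print(f"{cents_per_gallon:.2f} cents per gallon")  # ≈ 0.89 cents per gallon
```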