In North Carolina, Solar Energy Becomes Cheaper than Nuclear

Almost everyone who works in the energy world has heard the conventional wisdom that renewable energy simply can’t compete on cost with coal and other fossil fuels. Sure, renewables are nice, this thinking goes, but if you really want to cut carbon emissions you’ve got to invest in something else—like nuclear power. Indeed, one of the nuclear industry’s favorite refrains is that nuclear power plants are the only source of energy capable of replacing fossil fuels cost-effectively and on a truly large scale.

It sounds convincing, except that the industry taglines may simply turn out not to be true. A case study from North Carolina now suggests that renewable energy sources like solar power need not be consigned to the sidelines of US energy policy. The authors of a new report from Duke University claim that North Carolina electricity generated from photovoltaic arrays has already become less expensive than that produced by nuclear power plants. Solar power’s newfound competitiveness is at least partly due to subsidies intended to encourage its deployment. Yet this shouldn’t be surprising: the fossil fuel and nuclear industries, which matured long before solar, both benefit from extensive subsidies as well.

North Carolina is not exactly a state famous for renewable energy and sustainable business; if solar can out-compete nuclear power there, it bodes well for other states with even more potential to develop solar technology. Places like Texas and the American Southwest receive more sunshine than North Carolina, and may be able to bring down the cost of solar power even more easily if state governments commit to investing in renewables and cutting carbon emissions.

Meanwhile, the other major argument against renewables as a replacement for fossil fuels is also being eroded. For years utilities have claimed that wind and solar power are permanently handicapped by their “intermittency”—the fact that the sun doesn’t shine all the time, nor does the wind blow constantly. But new developments in technology are challenging this argument as well. The key is to incorporate wind and solar energy into grids that also include non-intermittent renewables like sustainable biomass, as well as a certain amount of natural gas. As advances in technology make it easier to store energy from wind turbines and solar cells, intermittency will become even less of an obstacle.

As renewable energy sources become more and more cost-competitive, the nuclear industry is facing mounting problems. While wind and solar projects have taken off across the country in the last few years, nuclear power has begun to look increasingly expensive, with estimates for the cost of building a new plant pegged as high as $12 billion. If a company begins construction on a nuclear plant and is unable to finish—something that has certainly occurred in the past—all the money and resources invested in construction up to that point are wasted. In contrast, a half-finished wind or solar farm is a functioning energy facility half the size of the one originally projected. In that respect, renewable power makes a much safer investment than nuclear plants.

To the extent that the nuclear industry has attracted a following in the United States, it’s done so largely by convincing voters and policymakers that nuclear power is the only energy source capable of truly replacing fossil fuels on a large scale. Now it looks like renewables may be able to hold their own after all, making risky nuclear power unnecessary. What do you think? Does the United States need to invest in nuclear plants when solar and other renewable energy sources are becoming cost-competitive?

Photo credit: Wayne National Forest

Nick Engelfried is a freelance writer on climate and energy issues, and works with campuses and communities in the Pacific Northwest to reduce the causes of climate change.