Figure 1. Boundary Dam CCS CO2 reductions versus the 90% spec (dashed line). The 90% is really 77% because of the energy the scrubber consumes.
Nukies spend far too much time bashing wind/solar when they should be bashing tragically over-priced nuclear. If nuclear were anywhere near its should-cost, nobody but a few misanthropic Malthusians would be pushing wind/solar outside a few niche markets. But given where we are, wind/sun is sopping up obscene amounts of ratepayer and taxpayer money, much of which flows from the less well off to the more well off. Is there a more blatant, more in-your-face transfer of wealth from the poor to the rich than a tax credit? It is hard not to rise in opposition to such regressive policies by our defenders of the little man in Washington.
This ripoff is being supported by a whole series of deceptive modeling efforts. Jacobson's transparent fantasies may be behind us, but they have been replaced by far more sophisticated versions of the same story. The end result is always the same: we can have a very high penetration wind/sun grid at little or no additional cost, and it will get us to the Net Zero nirvana.
The first question I have, and anyone should have, for any grid that is heavily dependent on wind, sun, and storage is: what is the longest dunkelflaute it can survive? Dunkelflaute is German for an extended stretch of low wind and low sun. It's a great word.
In May 2022, Science published a metastudy in which six major models were asked to come up with pathways to the Biden Administration's pledge of reducing US CO2 emissions in 2035 by 50% from 2005 levels.\cite{bistline-2022}[1] Table 1 shows the six models that were exercised. What do these six very large, very different models have in common? None of them model dunkelflauten.
None of these models actually marches through time, hour by hour, as Jacobson did. Therefore they cannot capture the wind/sun/load correlations that lead to dunkelflauten. Instead, they do some form of offline analysis and come up with a ``representative'' set of time slices. For example, we might represent the real world with 16 time slices: an hour in each of four seasons at four different times of day. Table 1 shows the number of time slices per year each model uses. The models then somehow assign a capacity factor to each of these time slices, usually by studying the load and weather in a single, arbitrarily chosen year.
The optimization program itself knows next to nothing about the real weather. It simply runs through the time slices and comes up with the minimum-cost capacities that meet the load in every time slice while not violating the user-imposed CO2 constraint. The hope is that the slice capacity factors are low enough that the resulting grid will handle the worst-case dunkelflaute.
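To make the mechanics concrete, here is a minimal sketch of a slice-based capacity optimization of this general kind. Everything in it, the 16 slices, the cost figures, the capacity factors, the CO2 cap, is an illustrative assumption; none of it is taken from any of the six models.
\begin{verbatim}
# Minimal sketch of a slice-based capacity-expansion LP.  All numbers
# (slices, costs, capacity factors, CO2 cap) are illustrative assumptions.
import numpy as np
from scipy.optimize import linprog

S = 16                                   # 4 seasons x 4 times of day
hours = np.full(S, 8760 / S)             # hours each slice "represents"
load = np.full(S, 100.0)                 # GW of demand in each slice
cf_wind = np.random.default_rng(0).uniform(0.1, 0.5, S)  # slice capacity factors
cf_sun  = np.random.default_rng(1).uniform(0.0, 0.6, S)

# Decision variables: [wind GW, solar GW, gas GW, gas dispatch per slice (GW)]
n = 3 + S
cost = np.concatenate(([120e6, 80e6, 60e6],   # $/GW-yr of capacity
                       50e3 * hours))         # $ per GW dispatched over a slice

A_ub, b_ub = [], []
for s in range(S):
    # Meet load in every slice: wind + solar + gas dispatch >= load[s]
    row = np.zeros(n)
    row[0], row[1], row[3 + s] = -cf_wind[s], -cf_sun[s], -1.0
    A_ub.append(row); b_ub.append(-load[s])
    # Gas dispatch cannot exceed gas capacity
    row = np.zeros(n)
    row[2], row[3 + s] = -1.0, 1.0
    A_ub.append(row); b_ub.append(0.0)

# User-imposed CO2 constraint: ~400 t/GWh of gas, 50 Mt/yr cap (both assumed)
row = np.zeros(n)
row[3:] = hours * 400.0
A_ub.append(row); b_ub.append(50e6)

res = linprog(cost, A_ub=np.array(A_ub), b_ub=b_ub,
              bounds=[(0, None)] * n, method="highs")
print("wind, solar, gas capacity (GW):", res.x[:3].round(1))
\end{verbatim}
The point of the sketch is that the program never sees an hour of real weather; it only sees the 16 capacity factors it was handed.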
The motivation behind this kludge is to reduce the number of time slices from 8760 per year to a far smaller number. That allows the model to handle a much larger number of decision variables; we can have a sizable number of regions with flows between those regions. OK, but at a minimum, as part of the postprocessing, you must check your grid against a long sample of actual hourly data. As far as I can tell, only LBNL did this, running their grid against 7 years of actual data. They were happy with the results; but we still don't know the longest dunkelflaute their grid can handle.
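The check itself is easy to state. Here is a minimal sketch, assuming a file of hourly capacity factors and load; the file name, column names, and the 25% threshold are assumptions for illustration, not taken from the LBNL run.
\begin{verbatim}
# Postprocessing check: run the chosen capacities against a long hourly
# record and report the longest stretch in which wind + solar deliver
# less than some fraction of load.  File name, column names, and the
# 25% threshold are illustrative assumptions.
import pandas as pd

def longest_dunkelflaute(hourly, wind_gw, sun_gw, threshold=0.25):
    """Longest run of hours with wind+solar output below `threshold` of load."""
    supply = wind_gw * hourly["wind_cf"] + sun_gw * hourly["solar_cf"]
    low = supply < threshold * hourly["load_gw"]
    longest = run = 0
    for flag in low:
        run = run + 1 if flag else 0
        longest = max(longest, run)
    return longest

# hourly = pd.read_csv("hourly_weather_and_load.csv")   # 7+ years of data
# print(longest_dunkelflaute(hourly, wind_gw=300, sun_gw=400), "hours")
\end{verbatim}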
All these studies have two other things in common.
First, no fixed (aka embedded) CO2 emissions. So while they install massive amounts of resource-intensive wind and solar, considerable batteries, lots of electrolysers, an immense amount of carbon capture equipment, and CO2 pipelines, none of the CO2 emitted in producing or maintaining this vast infrastructure is acknowledged. Net Zero is not net zero. Thank God. Will Rogers once said ``Well, at least Prohibition is better than no liquor at all.'' Something similar could be said about Net Zero.
Second, unlike Jacobson, all these models use a lot of fossil fuel, mostly gas, and all of them take a Panglossian view of CO2 Capture and Storage (CCS). In 2020, Princeton used the RIO model to come up with five pathways to Net Zero by 2050. The Princeton pathways require the capture and geologic storage of 0.9 to 1.7 gigatons of CO2 per year. This will require 110,000 km of high-pressure (2250 psi) pipelines.
The density of CO2 at pipeline pressure is 0.8 tons/m3. We will need to capture, compress, transport, and inject 1.1 to 2.1 billion cubic meters of CO2 per year. The US currently consumes about 20 million barrels of oil per day. That is 20e6 * 365 / 6.29 = 1.16 billion cubic meters per year. We will need an industry and infrastructure the same size as the current oil production and transmission system, not to produce and distribute something valuable, something people are willing to pay $80 a barrel for, but to dispose of a waste product in a dangerous manner.
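The comparison can be checked in a couple of lines; the inputs are just the figures quoted above.
\begin{verbatim}
# Back-of-the-envelope check of the volumes quoted above.
co2_tons_per_year = (0.9e9, 1.7e9)        # Princeton pathways
rho = 0.8                                 # tons per m3 at pipeline pressure
co2_m3_per_year = [m / rho for m in co2_tons_per_year]
# -> about 1.1e9 to 2.1e9 m3/yr

oil_bbl_per_day = 20e6                    # current US oil consumption
bbl_per_m3 = 6.29
oil_m3_per_year = oil_bbl_per_day * 365 / bbl_per_m3
# -> about 1.16e9 m3/yr: the same order of magnitude as the CO2
print(co2_m3_per_year, oil_m3_per_year)
\end{verbatim}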
Of course, that assumes we can actually capture the CO2 economically. What is the current state of carbon capture technology? There are exactly two commercial-scale carbon capture power plants operating on the planet:[2]
1) Boundary Dam 3 at Estevan, Saskatchewan.
2) Petra Nova at Thompsons, Texas.
Both are scrubbers at coal plants. Both have a target of 90% CO2 recovery. The 90% is misleading. Actually, it's a lie. The amine scrubbing process requires heating the solvent to release the CO2. That takes energy. To support the scrubber, Boundary Dam 3 went from a 140 MW net plant to a 120 MW net plant. In terms of reduction per unit of grid electricity, 90% really means (120 / 140) * 0.9 = 0.77.
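In code form, this adjustment is just the nameplate capture fraction scaled by the net-output derating.
\begin{verbatim}
# Effective CO2 reduction per unit of grid electricity, after the
# scrubber's parasitic load (the adjustment described above).
def effective_capture(capture_frac, net_mw_without_ccs, net_mw_with_ccs):
    return capture_frac * net_mw_with_ccs / net_mw_without_ccs

print(effective_capture(0.90, 140, 120))   # Boundary Dam 3 -> ~0.77
\end{verbatim}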
Neither plant has achieved its 90% target, mainly due to outages of one form or another. Figure 1 shows the performance of the Boundary Dam scrubber. We don't yet have an example of a carbon capture plant that has performed to spec. Getting to 90% (really 77%) on a gas plant will be more difficult due to the more dilute stack gas. We don't yet have an example of any carbon capture at a gas plant, except possibly the abject failure at Kemper. This is the foundation on which, in 25 years, we will build an industry as large as the current oil industry.
Jacobson's Net Zero fantasies may be behind us; but the current replacements are nearly as unrealistic.
[1] This pledge is more than a little misleading. 2005 was the peak in US CO2 emissions. Since then, emissions have dropped about 19%, mainly due to coal-to-gas switching. This was due to the totally unexpected arrival of fracking, a development that public policy had almost nothing to do with and which the Net Zero'ers have vehemently opposed. The 2021 pledge therefore really calls for only about another 30 percentage points of reduction relative to the 2005 baseline (50 - 19 = 31).
[2] The Kemper coal gasification and precombustion capture plant never really worked. It was shut down in 2017 and later demolished, with nothing to show for 7.5 billion dollars. If Kemper had performed to spec, it would have captured 65% of the CO2 created by the plant.
I was just talking about Boundary Dam yesterday. The fishing there is excellent, as the 'waste heat' from the power plant prevents a hard freeze of the lake in winter.
It's a very good point that arguing for affordable nuclear is a much bigger deal than arguing against wind and solar. If a high-renewables grid is 50% cheaper or 50% lower carbon with some expensive nuclear icing on it, that's not much of an accomplishment.
The Sask Energy utility that owns the same Boundary Dam site projects net-zero electricity alone to be 170% more expensive, including BWRX-300 nuclear SMRs. Should-cost nuclear is the only serious option.
I agree with most of what you have said.
However, I do have some experience in the design of amine systems for SO2, and I believe the CO2 systems are very similar. The systems should be easier to operate on a gas plant; the lower CO2 concentration in the off-gas is not a major issue, just a matter of sizing the equipment correctly.
The off-gas from a gas plant is cleaner than that from a coal plant (fewer particulates). That means less fouling of equipment, which has been a problem at Boundary Dam, and less deterioration of the amine. It should work better with gas than with coal.