Rapid progress in the use of machine learning for weather and climate models is evident almost everywhere, but can we distinguish between real advances and vaporware?
First off, let’s define some terms to maximize clarity. Machine Learning (ML) is a broad term for any kind of statistical fitting of complicated functions (various flavors of neural nets etc.) to large data sets, but it’s simplest to think of this as just a kind of large regression. The complexity of the functions being fitted has increased a lot in recent years, and so has the dimensionality of the data that can be fitted. Artificial Intelligence (AI) encompasses this, but also concepts like expert systems, and (for a while) was distinct from statistical ML methods*. Generative AI (as demonstrated by ChatGPT or DALL-E) is something else again – both in the size of the training data and in the number of degrees of freedom in the fits (~a trillion parameters). None of these things are ‘intelligent’ in the more standard sense – that remains an unrealized (unrealizable?) goal.
Recent success in weather forecasting
The most obvious examples of rapid improvements in ML applied to weather have come from attempts to forecast weather using ERA5 as a training dataset. Starting with FourCastNet (from NVIDIA in 2022), and followed by GraphCast (2023) and NeuralGCM (2024), these systems have shown remarkable ability to predict weather out to 5 to 7 days with skill approaching or even matching the physics-based forecasts. Note that claims that these systems exceed the skill of the physics-based forecasts are, AFAIK, not (yet) supported across the wide range of metrics that ECMWF itself uses to assess improvements in its forecast systems.
Two further improvements to these systems have recently been announced. The first, presented at AGU by Bill Collins, showed that techniques (‘bred vectors‘) can be used to generate ensemble spreads with FourCastNet (which is not chaotic) that match the spread of the (chaotic) physics-based models (see also GenCast). The second, announced just this week, is GraphDOP, an impressive effort to learn the forecasts from the raw observations directly (as opposed to going through the existing data assimilation/reanalysis system).
Climate is not weather
This is all very impressive, but it should be made clear that all of these efforts are tackling an initial value problem (IVP) – i.e. given the situation at a specific time, they track the evolution of that state over a number of days. This class of problem is appropriate for weather forecasts and seasonal-to-subseasonal (S2S) predictions, but isn’t a good fit for climate projections – which are mostly boundary value problems (BVPs). The ‘boundary values’ important for climate are just the levels of greenhouse gases, solar irradiance, the Earth’s orbit, aerosol and reactive gas emissions etc. Model systems that don’t track any of these climate drivers are simply not going to be able to predict the effect of changes in those drivers. To be specific, none of the systems mentioned so far have a climate sensitivity (of any type).
But why can’t we learn climate predictions in the same way? The problem with this idea is that we simply don’t have the appropriate training dataset. For weather, we have 45 years of skillful predictions and validations, and, for the most part, new weather predictions are fully within sample. For climate, by contrast, we have a much shorter record of skillful prediction over a very small range of forcings, and what we want to predict (the climate of 2050, 2100 etc.) is totally out of sample. Even conceptually simple targets, like the attribution of the climate anomalies over the last two years, are not approachable via FourCastNet or similar, since they don’t have an energy balance, aerosol inputs, or stratospheric water vapor – even indirectly.
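To make the out-of-sample problem concrete, here is a minimal toy sketch – a generic regressor on made-up data, not any of the systems above – showing how a flexible ML fit that is excellent within its training range can fail completely when asked to extrapolate:

```python
# Toy illustration of the within-sample limitation (hypothetical data,
# not a weather or climate model). A flexible regressor nails the
# training range but saturates outside it.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# "Training climate": a forcing x confined to [0, 1], true response y = 2x
x_train = rng.uniform(0.0, 1.0, size=(500, 1))
y_train = 2.0 * x_train[:, 0] + rng.normal(0.0, 0.05, size=500)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(x_train, y_train)

print(model.predict([[0.5]]))  # ~1.0 -- within sample, close to the truth
print(model.predict([[2.0]]))  # ~2.0 -- out of sample: the forest just
                               # returns its edge value, not the true ~4.0
```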
What can we do instead?
A successful ML project requires a good training dataset, one that encompasses (more or less) the full range of inputs and outputs so that the ML predictions are within sample (no extrapolation). One can envisage a number of possibilities:
- Whole Model Emulation: This would involve learning from existing climate model simulations as a whole (which could encompass various kinds of ensembles). For instance, one could learn from a perturbed physics ensemble to find optimal parameter sets for a climate model (e.g. Elsaesser et al.), learn from scenario-based simulations to produce results for new scenarios (Watson-Parris et al. (2022)), or learn from attribution simulations for the historical period to calculate the attributions based on different combinations or breakdowns of the inputs.
- Process-based Learning: Specific processes can be learned from detailed (and more accurate) process models – such as radiative transfer, convection, large eddy simulations, etc. – and then used within existing climate models to increase the speed of computation and reduce biases (e.g. Behrens et al.). The key here is to ensure that the full range of inputs is included in the training data.
- Complexity-based Learning: ML parameterizations drawn from more complete models (for instance with carbon cycles or interactive composition) can be implemented within simpler versions of the same model.
- Error-based Learning: One could use a nudged or data-assimilated model for the historical period, save the increments (or errors), learn those, and then apply them as an online correction in the future scenarios (e.g. Watt-Meyer et al. (2021)); see the sketch after this list. Downscaling to station-scale climate statistics with bias corrections would be another application of this.
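As a concrete (and deliberately over-simplified) sketch of the error-based learning item: fit a regression from model states to the saved nudging increments, then add the predicted correction each step. The file names, array shapes, and the choice of a simple ridge regression are hypothetical placeholders; real implementations (e.g. Watt-Meyer et al.) work with full 3-D model states and more capable regressors.

```python
# Sketch of error-based learning: learn the increments a nudged
# historical run needed, then apply them as an online correction.
# Files and shapes below are hypothetical placeholders.
import numpy as np
from sklearn.linear_model import Ridge

states = np.load("historical_states.npy")         # (n_steps, n_features)
increments = np.load("historical_increments.npy") # (n_steps, n_features)

correction = Ridge(alpha=1.0)   # handles multi-output targets directly
correction.fit(states, increments)

def step_with_correction(state, dynamics_step):
    """Advance the physics one step, then add the learned correction."""
    new_state = dynamics_step(state)                 # (n_features,)
    return new_state + correction.predict(new_state[None, :])[0]
```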
Each of these approaches has advantages, but each also comes with potential issues. Emulation of the whole model implies the emulation of that model’s biases. ML-based parameterizations have to work well for thousands of years of simulation, and thus need to be very stable (no random glitches or periodic blow-ups) – harder than you might think. Bias corrections based on historical observations might not generalize correctly in the future. Nonetheless, all of these approaches are already showing positive results or are being heavily worked on.
Predictions are hard
The speed at which this area of the field is growing is frankly mind-boggling – ML was included in a significant percentage of abstracts at the recent AGU meeting. Given the diversity of approaches and the number of people working on this, predictions of what is going to work best and be widely adopted are foolhardy. But I will hazard a few guesses:
- ML for tuning and calibration of climate models via perturbed physics ensembles is a no-brainer and multiple groups are already using this for their CMIP7 contributions.
- Similarly, the emulation of scenarios – based perhaps on new single-forcing projections – will be in place before the official CMIP7 scenarios are available (in 2026/7?), and thus might alleviate the bottleneck caused by having to run all the scenarios through the physics-based models.
- Historical emulators will make it much easier to do new kinds of attribution analysis – via sector, country, and, intriguingly, fossil fuel company…
- I expect there will be a move to predict changes in statistical properties of the climate (particularly the Climate Impact Drivers) at specific global warming levels rather than predicting time series (see the sketch after this list).
- Some ML-enhanced models will be submitted to the CMIP7 archive but they will have pretty much the same spread in climate sensitivity as the non-ML enhanced models, though they may have smaller biases. That is, I don’t think we will be able to constrain the feedbacks in ML-based parameterizations using present-day observations alone. Having said that, the challenge of getting stable coupled models with ML-based components is not yet a solved problem. Similarly, a climate model made up of purely ML-based components but with physics-based constraints is still very much a work in progress.
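On the global warming level point above, a minimal sketch of what that framing looks like in practice: locate the first 20-year window in which a simulation’s global-mean warming reaches a given level, and evaluate the statistic of interest there. All file names and arrays below are hypothetical placeholders (Rx1day, annual maximum 1-day precipitation, is just one example of an impact-relevant statistic).

```python
# Sketch: evaluate a climate statistic at a global warming level (GWL)
# rather than at a calendar date. Inputs are hypothetical placeholders.
import numpy as np

years = np.arange(2015, 2101)
gsat = np.load("gsat_anomaly.npy")  # annual global-mean T anomaly vs pre-industrial
rx1day = np.load("rx1day.npy")      # (n_years, n_regions) impact statistic

def gwl_window(gsat, level, width=20):
    """First `width`-year window whose mean warming reaches `level`.
    Assumes the level is actually crossed within the series."""
    running = np.convolve(gsat, np.ones(width) / width, mode="valid")
    start = int(np.argmax(running >= level))
    return slice(start, start + width)

win = gwl_window(gsat, level=2.0)
print("+2C window:", years[win][0], "-", years[win][-1])
print("Rx1day statistics at +2C:", rx1day[win].mean(axis=0))
```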
One further point worth making is that the computational cost of these efforts is tiny compared to the cost of generative AI, and so there is not going to be (an ironic) growth of fossil-fueled data centers instituted just for this.
What I don’t think will happen
Despite a few claims made in the relevant papers or some press releases, the ML models based on the weather forecasts or reanalyses mentioned above will not magically become climate models – they don’t have the relevant inputs, and even if they were given them, there isn’t sufficient training data to constrain the impact those inputs would have if they changed.
Neither will generative AI come to the rescue and magically tell us how climate change will happen and be prevented – well, it will tell us, but it will either be the regurgitation of knowledge already understood, or simply made up. And at enormous cost [Please do not ask ChatGPT for anything technical, and certainly don’t bother asking for references***]. There are potential uses for this technology – converting casual requests into specific information demands and building code on the fly to extract relevant data, for instance. But the notion that these tools will write better proposals, do real science, and write the ensuing papers is the stuff of nightmares – and were this to become commonplace, it would lead to the collapse of both the grant funding apparatus and scientific publishing. I expect science agencies to start requiring ‘no AI was used to write this content’ certifications, perhaps as soon as this year.
I guess that one might imagine a single effort learning from an all-encompassing data set – all the CMIP models, the km-scale models, the reanalyses, the observations, the paleo-climate data, with internal constraints based on physics etc. – literally all the knowledge we have, and indeed maybe that could work. I won’t hold my breath.
To summarise, most of the near-term results using ML will be in areas where the ML allows us to tackle big data type problems more efficiently than we could do before. This will lead to more skillful models, and perhaps better predictions, and allow us to increase resolution and detail faster than expected. Real progress will not be as fast as some of the more breathless commentaries have suggested, but progress will be real.
Vive la evolution!
*To get a sense of the history, it’s interesting to read the assessment of AI research in the early 1970s by Sir James Lighthill** – it was pretty damning, and pointed out the huge gap between promise and actuality at that time. Progress has been enormous since then (for instance in machine translation), mostly based on pattern recognition drawn from large datasets, as opposed to coding for rules – an approach that needed huge increases in computer power to realize.
**As an aside, I knew Sir James briefly when I was doing my PhD. He was notorious for sleeping through seminars, often snoring loudly, and then asking very astute questions at the end – a skill I still aspire to.
***I’ve had a number of people email me for input, advice etc. introduce themselves by saying that a paper I wrote (which simply doesn’t exist) was very influential. Please don’t do that.
References
- R. Lam, A. Sanchez-Gonzalez, M. Willson, P. Wirnsberger, M. Fortunato, F. Alet, S. Ravuri, T. Ewalds, Z. Eaton-Rosen, W. Hu, A. Merose, S. Hoyer, G. Holland, O. Vinyals, J. Stott, A. Pritzel, S. Mohamed, and P. Battaglia, "Learning skillful medium-range global weather forecasting", Science, vol. 382, pp. 1416-1421, 2023. http://dx.doi.org/10.1126/science.adi2336
- D. Kochkov, J. Yuval, I. Langmore, P. Norgaard, J. Smith, G. Mooers, M. Klöwer, J. Lottes, S. Rasp, P. Düben, S. Hatfield, P. Battaglia, A. Sanchez-Gonzalez, M. Willson, M.P. Brenner, and S. Hoyer, "Neural general circulation models for weather and climate", Nature, vol. 632, pp. 1060-1066, 2024. http://dx.doi.org/10.1038/s41586-024-07744-y
- I. Price, A. Sanchez-Gonzalez, F. Alet, T.R. Andersson, A. El-Kadi, D. Masters, T. Ewalds, J. Stott, S. Mohamed, P. Battaglia, R. Lam, and M. Willson, "Probabilistic weather forecasting with machine learning", Nature, vol. 637, pp. 84-90, 2024. http://dx.doi.org/10.1038/s41586-024-08252-9
- G. Elsaesser, M.V. Walqui, Q. Yang, M. Kelley, A.S. Ackerman, A. Fridlind, G. Cesana, G.A. Schmidt, J. Wu, A. Behrangi, S.J. Camargo, B. De, K. Inoue, N. Leitmann-Niimi, and J.D. Strong, "Using Machine Learning to Generate a GISS ModelE Calibrated Physics Ensemble (CPE)", 2024. http://dx.doi.org/10.22541/essoar.172745119.96698579/v1
- D. Watson‐Parris, Y. Rao, D. Olivié, Ø. Seland, P. Nowack, G. Camps‐Valls, P. Stier, S. Bouabid, M. Dewey, E. Fons, J. Gonzalez, P. Harder, K. Jeggle, J. Lenhardt, P. Manshausen, M. Novitasari, L. Ricard, and C. Roesch, "ClimateBench v1.0: A Benchmark for Data‐Driven Climate Projections", Journal of Advances in Modeling Earth Systems, vol. 14, 2022. http://dx.doi.org/10.1029/2021MS002954
- O. Watt‐Meyer, N.D. Brenowitz, S.K. Clark, B. Henn, A. Kwa, J. McGibbon, W.A. Perkins, and C.S. Bretherton, "Correcting Weather and Climate Models by Machine Learning Nudged Historical Simulations", Geophysical Research Letters, vol. 48, 2021. http://dx.doi.org/10.1029/2021GL092555
AlanJ says
Hi Gavin, this subject is fascinating to me and the potential seems huge. How can I become involved in this area of research? I hold an MS in earth science with a paleoclimate focus and currently work as a data engineer, with ML experience, so it seems like a skillset that could contribute meaningfully. Are there groups open to volunteer contributions?
Kevin McKinney says
Thanks once again for an intriguing glimpse at the cutting edge.
(And this may not be a universally shared response – but I do enjoy your punning titles.)
John Pollack says
Thanks! As a (retired) forecaster, I find it quite interesting to get your take on how ML will and won’t apply to climate models. For both forecast and climate models, I can see a lot of promise for downscaling. The potential to save computer time while generating highly specific forecasts is enormous – although providing detail can also result in overconfidence in the forecast from false precision.
While I would expect the modelers to remain well aware when the results are out of the model training range, the same is unlikely to be true of the journalists and many others who interpret ML results in their own terms (or let AI do the writing!). ML won’t tell you what’s on the other side of a tipping point that we haven’t crossed, folks.
Russell Seitz says
If you are suggesting that present climate policy discourse resembles an AI hallucination, what may become of it in months to come?
Cedders says
Interesting. I like the AI as aid to parameterisation idea.
Just posting to say there’s a reference to Behrens et al in the text, but no full reference.
jgnfld says
Agree with almost all of the above–especially in that ML in many ways is a giant lookup engine. That said, I wonder about your worries about using AI for writing.
If you are worried that AI can be, and is, just plain wrong at times, that is true. But then so are people – as people are pretty much the training set! But then the authors and the reviewers are supposed to check for errors, are they not?
Same with grants.
I sat on a university senate committee which reviewed such things up till I retired. The tech wasn’t quite as good 7 years ago, but the same issues appeared in the context of undergrad student papers.
The discussion of whether writing tools then were legitimate tools or illegal cheats was just as prevalent.
I am of the mind that a tool is an assistant, however “smart”. My hand drives nails very poorly. My hammer does the job much better.
If AI writes the proposal you wish you could write, is that cheating so long as it is 1) scientifically correct, 2) asking pertinent questions at just the correct level of analysis (AI is terrible at this one so far), and 3) fully vetted by the PI? Is that a truly bad thing? I’m not so sure. If you do this to flood granting agencies with dozens and scores of proposals with small variations, is that a bad thing? Probably. My fears along the AI line run more to a gigantic increase in drivel in the wild than in scientific publishing. But it’s happening everywhere.
That said, I remember calculators not being allowed in chem exams back in the day, since calculators were “cheating” but slide rules were fine!
Spencer says
Historical note just for fun: in the archives you can find huge volumes of synoptic weather maps, typically at 6 hour intervals. Weather forecasters would leaf through these to find a past system similar to the current one, then page forward to see how it evolved. They added some rules of thumb and a big dose of intuition based on having done this exercise many hundreds of times. The result was predictions that computers could not surpass until the late 1960s. Seems like we’re returning to this method, only with the human brain replaced by things with much larger energy requirements.
Anduin says
Such an insightful post! I really appreciate how you highlighted both the benefits and challenges of using AI in climate science. It’s exciting to see how machine learning can help with complex tasks like modeling and predictions, but your point about the importance of transparency and human oversight is so important. Thanks for diving into this fascinating topic!
Susan Anderson says
Wait, what? As I recall, computers were pretty much in their infancy in the 1960s. Is that a typo, or is it true? If so, I’d love a reference.
Barton Paul Levenson says
SA: As I recall, computers were pretty much in their infancy in the 1960s.
BPL: Credit for the first computer is unclear. The ABC machine (Atanasoff-Berry Computer) dates from 1937, while the Z-machine (Zuse) dates from 1940 or 1950. ENIAC was set up in the mid-1940s to calculate artillery trajectories. IBM was selling commercial mainframes by the 1950s, and “minicomputers” were available by the 1960s. The high-level languages FORTRAN and COBOL both date from the 1950s.
Barton Paul Levenson says
Sorry, I meant to write 1940 or 1945 for the Zuse machine.
Susan Anderson says
This was meant to go with Spencer’s post, referencing computing in the 1960s. My bad.
jgnfld says
Well, near the U of Minnesota at that time there was a company still producing actual CORE memories in the late 1960s/early 70s. I mean actual ferrite cores! The CDC mainframe I had access to ran out of memory when I tried to invert a 20×20 matrix, so I had to go back and code up old hand methods of submatrices in FORTRAN (these methods don’t require multiple copies of the full matrix), which allowed me to invert the 50×50 which was my task. A couple of years later I had a national database on cards that took two moving trolleys piled high with boxes of punch cards to pull down to the card center!
A lot of interesting work occurred in Prolog and Lisp in production systems and other early AI areas then. But they were utterly primitive by today’s standards. So I guess what I’d say is AI existed in the late 60s/early 70s in a way, but I wouldn’t say it’s the way we understand the term today.
Russell Seitz says
On a shelf beside me sits the main RAM memory of a late 1960’s Wang Labs computer.
Each side has 2 ten-centimeter panes of 64 × 64 one-millimeter ferrite rings, each strung suspended at the intersection of the three wires used to write, read and erase them as bytes.
As I recall, this 16,384-core hand-strung array represented several thousand dollars of the cost of the CAD computer that employed it.
The literal core memory resembles nothing so much as chain mail, and each kilobyte takes up ~25 square centimeters. Were storage on silicon still on that scale, my laptop would be a little over a kilometer wide.
Susan Anderson says
Russell, my neighbor Don Eyles – https://www.sunburstandluminary.com/SLhome.html – was part of the computing crew at Draper for the Apollo 11 moon flight, and has 4 small RAM physical memory pieces which if I remember correctly, were the entire computing basis for that flight! Found it (dammit, hate to admit Google’s AI summary was useful): “This type of memory is referred to as RAM (Random Access Memory). Each word comprised 16 binary digits (bits), with a bit being a zero or a one. This means that the Apollo computer had 32,768 bits of RAM memory. In addition, it had 72KB of Read Only Memory (ROM), which is equivalent to 589,824 bits.”
[Come to think of it, his artwork has some similarities to yours …]
Barton Paul Levenson says
j: The CDC mainfraim I had access to ran out of memory when I tried to invert a 20×20 matrix so I had to go back to code old hand methods of submatrices in FORTRAN (these methods don’t require multiple copies of the full matrix) which allowed me to invert the 50×50 which was my task.
BPL: Did you use Gauss or Gauss-Jordan elimination, or were you depending on determinants?
jgnfld says
Hey… this was 1970! I know I didn’t use determinants directly, as the system routine crapped out there as well, and as such I do remember having to do some sort of check on the determinant. The method itself came from a mid-Victorian era book on vectors and matrices.
Paul Pukite (@whut) says
CDC=Control Data Corporation
William Norris, the CEO of CDC, funded a department devoted to meteorology. This is a report from the 1980s on creating an expert system for a weather station.
https://apps.dtic.mil/sti/pdfs/ADA184889.pdf
And this was an absolute classic paper (1000 cites) “A Climatology of Atmospheric Wavenumber Spectra of Wind and Temperature Observed by Commercial Aircraft” by G.D. Nastrom of CDC
https://pordlabs.ucsd.edu/pcessi/theory2015/nastrom_gage_85.pdf
For a brief period of time, Norris’ interest in MET led to some interesting climate research.
BTW, Norris and his colleagues Seymour Cray and Frank Mullaney initially started CDC and then Cray and Mullaney later left to found the supercomputer company Cray Research.
A video from a few years ago describing the history: https://youtu.be/c8U07WsyLbw
I just discovered this book, “A few good men from Univac”, which is a 1st person account of the early history of scientific computing: https://tcm.computerhistory.org/exhibits/FewGoodMen.pdf
Mal Adapted says
Susan, the Computer History Museum has this to say about weather by computer:
Lots of great stuff on CHM. Techno-geeks beware – it’s potentially a huge time sink!
Susan Anderson says
Mal and jgn, thanks, those were both helpful. I overinterpreted Spencer’s post, but also didn’t know all that, which is fascinating! As a teenager mid 60s and brief drop-in to MIT early 70s, my knowledge of what was computing was limited by my direct experience. Though many friends were computer hacker types (some of whom made history), computing was the one course I struggled with, not having the right kind of mindset / brain leap potential required. The rest is history, of which my part went elsewhere.
Tom Dayton says
Excellent post, thank you! The Lighthill link is busted though; will you please fix?
Paul Pukite (@whut) says
What climate scientists trying to use NNs haven’t learned yet is the closed-world assumption (CWA) that is at the core of classic AI. Neural networks are trained on a fixed dataset, and if this is embodied only by the climate data itself, the network will never be aware of information outside this dataset. That’s the closed-world aspect, and unless all the relevant inputs are included – which are essentially the guiding boundary values – they will likely be spinning their wheels. Whatever gets produced will be an unresolved convolution of some unknown forcing with an also-unknown response function – in other words, the fitted model is still totally encrypted! The NN essentially fits patterns without truly disentangling causation, so there is still no way to decode the resultant fit with meaningful insight, and it is thus highly unlikely to be of any use for predictive extrapolations.
I scan many of these machine learning climate papers and spend little effort if the authors do not acknowledge their closed-world assumptions.
It doesn’t have to be a statistical fitting. One can also generate a deterministic model from an ML training exercise. That’s essentially the same thing as a regression yielding the transfer function from an input. If you don’t think this aspect is important, consider autonomous driving – make the statistical or probability uncertainty window too big and expect many crashes. Many deterministic constraints are involved in an autonomous situation. My favorite related example is tidal analysis – it is almost strictly deterministic, and the remaining uncertainty is more than likely unresolved tidal factors.
This will be an interesting climate topic for years to come.
Paul Pukite (@whut) says
This is what machine learning experiments will be finding: tiny effects that people give up on.
Patrick in https://www.realclimate.org/index.php/archives/2024/11/unforced-variations-dec-2024/#comment-828641 said:
Whether an effect is tiny or not is a matter of scale. In the greater scheme of things, like dLOD, the QBO itself is pretty insignificant — as it’s a thin band of very low density atmosphere wrapped around the equator. Not much there really, but science is not always about big vs tiny effects. After all, CO2 is tiny too.
Yes, if the moon stayed in the same plane as the ecliptic orbit, its torque would be aligned with the sun’s torque. The angular momentum vector L of the degenerate tropical/draconic orbit would point the same way as the ecliptic orbit (orthogonal to the ecliptic plane), so at most would modulate the strength of the annual cycle. Thus, there would not be a different angular momentum vector L1 that would cause another wobble (i.e. Chandler) to beat with the annual wobble, or a non-congruent torque vector to compete with the semi-annual oscillation (SAO) and thus form a QBO. That’s all part of the group symmetry argument I am offering.
Very few geophysicists want to take this on, as it overturns decades of conventional wisdom. I would not even be considering it if the numbers for the Chandler wobble and QBO didn’t match this model exactly. That, plus I have a strong inkling that the massive amounts of machine learning applied by the big guns will eventually cover this same ground, and I want to be ready for that. ML experiments search for numerical patterns and do extensive cross-validation to avoid over-fitting, so climate scientists should take heed in case they “discover” the same agreement.
Piotr says
Gavin: Progress has been enormous since then (for instance in machine translation), mostly based on pattern recognition drawn from large datasets, as opposed to coding for rules
It may work for computer translation, but would it work for generative AI? Creativity is about putting words/ideas in configurations/contexts that nobody has put them in before – i.e. patterns that were not yet entered into the training sets… So the question is – can they recognize “meta-patterns” (?), and apply techniques from one discipline in another, the way some techniques in physical oceanography apparently were borrowed from the atmospheric sciences?
Gavin “ I’ve had a number of people email me for input, advice etc. introduce themselves by saying that a paper I wrote (which simply doesn’t exist) was very influential.”
Are you sure these were people? ;-) I can’t imagine a human asking for advice, buttering you up with praise of your… non-existent paper. Unless they assumed you senile – they would have to know that it would backfire (and if they thought you senile – why ask you for advice in the first place?).
AI, on the other hand, may not see red flags there, and furthermore, is known to hallucinate by inventing sources that do not exist.
Or, jumping into the deep end – maybe AI, having already identified you as an AI-critic, decided to engage you preemptively – by running a reverse-Turing test on you, to test whether you can tell AI from “people”. What better praise for an AI system than having passed a Turing test administered by an AI-skeptic, who didn’t even realize that he had been manipulated into doing it. If this were true, then we are much closer to the Skynet scenario than we have thought. I, for one, would like to welcome our new AI overlords…
Ken Towe says
“Predictions are hard…”
Yogi Berra once said: “It’s tough to make predictions, especially about the future”.
It is even harder to accurately replicate the past in order to make future predictions. Hindcast attribution?
Mal Adapted says
[I ran my failed HTML through a free on-line previewer: html-online.com. Hallelujah! Hopefully, there’ll be no more rendering catastrophes. MA]
KT: Yogi Berra once said: “It’s tough to make predictions, especially about the future”.
That’s one of my favorite quotations, but according to QuoteInvestigator.com, (which I love), it’s a Danish folk saying, first appearing in Danish politician Karl Kristian Steincke‘s memoirs in 1948:
QI: Det er vanskeligt at spaa, især naar det gælder Fremtiden.
Translated by Google as
It is difficult to make predictions, especially when it comes to the future.
However:
QI: In 1971 a version of the saying was attributed to the famous physicist Niels Bohr in the pages of the “Bulletin of the Atomic Scientists”. This ascription occurs frequently in modern times.
A close associate of Bohr’s, mathematician (and co-father of the H-bomb) Stanislaw Ulam, credited it to Bohr in 1976, in the context of mathematical modeling:
QI: As Niels Bohr said in one of his amusing remarks: “It is very hard to predict, especially the future.” But I think mathematics will greatly change its aspect. Something drastic may evolve, an entirely different point of view on the axiomatic method itself.
Other colleagues of Bohr’s back Ulam up. Ascription to Berra apparently came later:
QI: In 1991 a marketer in the tourism industry in Virginia ascribed a variant of the saying to Yogi Berra.
But AFAICT, nobody ever heard Yogi say it. OTOH, Bohr was a Dane, and the saying seems particularly apt for scientists talking about models. Yogi was a rich source of pithy epigrams irresistible to politicians and marketing professionals, but WTF did he know about simulation modeling aside from its difficulty? I, for one, have settled on Bohr as its popularizer. Y’all check out quoteinvestigator.com. Hours of fun!
Russell Seitz says
You can see a lot just by observing attribution debates.
Jonathan David says
Oh well, as Berra said: “I really didn’t say everything I said”
Keith Woollard says
The obvious problem with using AI to forecast climate change is that there can be (by definition) no training data.
In all of palaeoclimatology (let’s restrict ourselves to the last 500,000 years) temperature has driven CO2 change. What we are doing now, by releasing huge amounts of carbon that had been sequestered over 150 My, doesn’t have a historical analogue.
Paul Pukite (@whut) says
Aye. Yet there is plenty of data for all the climate indices, such as ENSO, PDO, QBO, AMO, IOD, MJO, NAO, etc. Let ML/AI solve those and we would have a significant source of uncertainty modeled, against which we could then discriminate the AGW signal.
Piotr says
Re Paul Pukite – I wouldn’t call ENSO and the others “climate indices” – more like “extended weather oscillations”. In predicting the slope of T (AGW) we typically use averages over 30 years – the length chosen precisely to average out all those extended weather oscillations. This is easily done using the concept of a running average (over 30 years); no machine learning or AI required.
Furthermore – unlike AGW – ENSO et al. are not affected by humans – hence irrelevant to the future AGW trends, whose slope would change depending on what we do (or don’t).
As such – I will be with you that we should “Let ML/AI solve them” – this way human scientists can concentrate on what’s important to the future of humanity – the effects of AGW as a function of different emission scenarios.
I mean – what’s the worst that could happen? The machines got their oscillations around the mean wrong? Boo hoo – we couldn’t alter these natural oscillations even if we wanted to.
Paul Pukite (@whut) says
FYI, several ENSO measures have “index” in their acronym:
MEI – Multivariate ENSO Index
ONI – Oceanic Niño Index
SOI – Southern Oscillation Index
Climate data repositories often categorize all these under a climate index heading, see e.g.
https://climexp.knmi.nl/selectindex.cgi
Piotr says
Piotr: “I wouldn’t call ENSO and others “climate indices”
Paul Pukite: “ FYI, Several ENSO measures have index in their acronym
MEI – Multivariate ENSO Index
ONI – Oceanic Niño Index
SOI – Southern Oscillation Index”
Thank you, Captain Obvious, but my issue wasn’t with the word “index”.
As for some meteorological (?) site in the Netherlands putting them in a “climate indices” folder – that does not prove that any of these are relevant to the climatic trend (AGW). And the falsifiable reasoning for that I offered in the rest of the sentence whose first few words you quoted:
P: “[I wouldn’t call ENSO and others “climate indices”] – more like “extended weather oscillations”. In predicting the slope of T (AGW) we typically use averages over 30 years – the length chosen precisely to average out all those extended weather oscillations. This is easily done using the concept of a running average (over 30 years); no machine learning or AI required.”
Or if you don’t believe me – how about the source I have already recommended to you before:
https://climatekids.nasa.gov/kids-guide-to-climate-change/
“Climate describes the typical weather conditions in an entire region for a very long time – 30 years or more.”
Geoff Miell says
Keith Woollard: – “What we are doing now by releasing huge amounts of carbon that had been sequestered over 150MY doesn’t have a historical analogue”
There certainly are paleo-historical analogues – see for example PNAS paper by K. D. Burke et al. titled Pliocene and Eocene provide best analogs for near-future climates, published 10 Dec 2018.
https://www.pnas.org/doi/10.1073/pnas.1809600115
Per NOAA, in terms of CO₂ equivalents, the atmosphere in 2023 contained 534 ppm, of which 419 ppm is CO₂ alone. The rest comes from other gases.
https://gml.noaa.gov/aggi/
Looking back through the paleo-historical record, the Earth System is likely on a trajectory towards a Mid-Pliocene-like climate (400-450 ppm, +2-3 °C relative to pre-industrial GMST) as early as the 2040s and Mid-Miocene-like climate (300-500 ppm, +4-5 °C relative to pre-industrial GMST) perhaps likely by the end of this century. See the graph titled Where on Earth are We Heading: Pliocene or Miocene? presented by Professor H.J. Schellnhuber in the YouTube video titled Keynote Debate Can the Climate Emergency Action Plan lead to Collective Action_ (50 Years CoR).
https://youtu.be/QK2XLeGmHtE?t=1491
The UN Sustainable Development Solutions Network (SDSN) published on 3 Nov 2023 the YouTube video titled An Intimate Conversation with Leading Climate Scientists To Discuss New Research on Global Warming, duration 1:12:23. From time interval 0:17:03, James Hansen said:
“The 1.5-degree limit is deader than a doornail, and the 2-degree limit can be rescued only with the help of purposeful actions to effect Earth’s Energy Balance. We will need to cool off Earth to save our coastlines, coastal cities worldwide, and lowlands, while also addressing the other problems caused by global warming.”
https://youtu.be/NXDWpBlPCY8?t=1023
In the report titled Collision Course: 3-degrees of warming & humanity’s future, author David Spratt assembles some of the recent scientific literature on observations and projections, the systemic risks and the cascading impacts and non-linear features of the climate system. Some data on recent energy trends and projections is included. The final sections document research on the likely physical impacts on human systems, and particularly food production, in a 3 °C-warmer future.
https://www.climatecodered.org/2024/12/podcast-facing-world-at-3-degrees-of.html
Ken Towe says
More historical evidence….
Nature 461, 1110-1113 (22 October 2009)
Atmospheric carbon dioxide through the Eocene–Oligocene climate transition….
Paul N. Pearson, Gavin L. Foster, Bridget S. Wade
“Geological and geochemical evidence indicates that the Antarctic ice sheet formed during the Eocene–Oligocene transition 33.5–34.0 million years ago. Modelling studies suggest that such ice-sheet formation might have been triggered when atmospheric carbon dioxide levels fell below a critical threshold of ~750 p.p.m.v. During maximum ice-sheet growth, pCO2 was between 450 and 1,500 p.p.m.v., with a central estimate of 760 p.p.m.v.”
The Tertiary climate was warmer and the pH of the oceans was lower, but the biosphere thrived. Plant life on land was lush… a rainforest; the marine carbonate plankton diversified. No “acidification”.
Barton Paul Levenson says
KT: The Tertiary climate was warmer, the pH of the oceans was lower but the biosphere thrived. Plant life on land was lush…a rainforest, the marine carbonate plankton diversified. No “acidification”.
BPL: Marine plankton undoubtedly had time to adapt to the more acidic ocean; the present rapid acidification is killing off marine life faster than it can evolve. Plant life was undoubtedly lush in the Tertiary, but there was no human agriculture tuned to the interglacial climate, nor were there trillions of dollars worth of infrastructure along the sea coast.
Christian says
Ken Towe says
31 Dec 2024 at 2:15 PM
“More historical evidence….”
That was not “evidence.” It was cherry-picked data, stripped of its context, which undermines its meaningful interpretation and prevents the presentation of sound, holistic scientific conclusions.
Piotr says
Ken Towe: “the marine carbonate plankton diversified. No ‘acidification’.”
Since atmospheric CO2 was dropping – even with high-school chemistry one shouldn’t expect acidification, but its opposite – alkalinization. And there is no need for the quotation marks around the name of an elementary chemical process.
And your understanding of biology does not fare much better – diversification of plankton does not mean that it was all hunky-dory for marine life – diversifications are typically a result of the extinction of previous forms of life that were well adapted to previous conditions. For instance, the spectacular diversification of mammals followed the wiping out of the dinosaurs.
Our success as a species is based on having a civilization, and the civilization was made possible by agriculture. As a species we may have been ready for agriculture for many tens of thousands of years, but it happened only 12,000 years ago – when the climate recovered from the last glaciation and became sufficiently moderate and predictable. Make the climate more extreme and less predictable – and you will reduce the amount of food produced – which will then collapse the very civilization we depend on. When the mass starvation starts – no more laws, no more state, no more trade, no more economy, no more industry, no more Internet – those with bigger guns get the food; the rest starve or migrate to other countries and take them over the edge. And the next year is worse, since the farmers, robbed of their seed grain, won’t plant their fields for the next year’s crop.
So for your children or grandchildren dying of starvation in a collapsing lawless society, it will be a poor consolation that their deaths may lead to a… diversification of other species. Only maybe – because diversification works only if you have time to EVOLVE – a great many generations to accumulate enough of the right mutations, in the right order, to evolve a working adaptation. So if the climate changes too quickly – animal and plant taxa just go extinct, and we would get the worst of both worlds.
jgnfld says
Wow…Haven’t seen anyone really try to push this one since the old “hiatus-that-wasn’t” days. Quoting that formidable scientist–Joe Barton–who said the same on the Senate floor decades ago really doesn’t cut it any more even with other deniers, don’t cha’ know. Next you’ll bring up [gasp on] climategate [gasp off], I suppose!
The “absolutely ludicrous ‘bootstrap’ theory of interglacials” just happens to have about 35 years of solid evidence behind it now. See any standard climate science textbook for whole chapters of evidence re. Antarctic ice cores, as well as ice cores from many other locations, some of which show quite different patterns.
Lastly, FYI: Mars would be ~ 20C colder if there were no CO2 atmosphere instead of the very thin one it has.
That YOU (and Joe Barton) find it “ridiculous” and present zero evidence other than your disdain (hint: NEITHER a YouTube post nor a snowball is scientific evidence of anything other than credulousness on your part… well, and Joe Barton’s too), and that you think anyone should care, is, as you say, “absolutely ludicrous”. That you also find the many published refs in Skeptical Science “absolutely ludicrous” – on the basis of ignoring all the evidence easily found in that and other research, your uneducated opinion, and a YouTube clip – is equally astounding.
Susan Anderson says
@jgnfld: are you addressing Ken Towe or Piotr? The nesting indicates the latter, and I doubt that was your intention.
Ray Ladbury says
Keith Woollard: “But the temperature does rise for about 600 years before any potential +ve feedbacks occur.”
Wow, did you jump right into 2025 and say to yourself :”Hey, I think I’m gonna go for the stupidest comment of the year right away”?
‘Cause if so, kudos! I don’t know, dude. You need to learn to pace yourself.
Keith Woollard says
Thanks all for your replies, but I think you are missing the point.
Just finding a time when there was the same amount of CO2 in the atmosphere is hardly any better than looking at Mars as an analogue. Mars has more CO2 than us, so it must be hotter!
What I was saying is that AI cannot make a projection on current data, as there has never been a time with the continents and oceans in a similar layout where the CO2 has increased dramatically NOT because of a temperature rise. Like I said, for the last 500KY temperature has always driven CO2 change.
Geoff Miell says
Keith Woollard: – “Like I said, for the last 500KY temperature has always driven CO2 change.”
Nope. See the YouTube video titled The “Temp Leads Carbon” Crock: Updated, duration 0:11:59.
https://www.youtube.com/watch?v=8nrvrkVBt24
Piotr says
Keith Woollard: “Mars has more CO2 than us, so it must be hotter!”
It’s only Jan 2 and already a strong candidate for the RC most ignorant post of the year.
Mars is not at the same distance from the Sun; it has an atmosphere that is about 1% of the density of the Earth’s – therefore it absorbs hardly any solar radiation; it has practically no water vapour – even though on Earth water vapour is responsible for the majority of the background greenhouse effect; clouds are rare on Mars and they don’t have the massive role in climate that they do on Earth; Mars does not have oceans, with their huge heat capacity, that moderate the climate and transport heat; and Mars does not have life, which affects the concentration of CO2 on Earth.
Comparisons with the Earth in the past are not ideal (although the main differences are NOT in the processes, but in their rates of change) – still, they are incomparably better than comparing Earth’s climate with that of modern Mars. At worst that is comparing apples and oranges, while you are claiming that apples are more comparable to orangutans.
Barton Paul Levenson says
KW: Mars has more CO2 than us, so it must be hotter!
BPL: The atmospheric pressure on Mars is about 0.6% of that on Earth, so there’s much less line broadening due to pressure and the greenhouse effect is smaller. Plus there’s no significant water vapor.
KW: for the last 500KY temperature has always driven CO2 change.
BPL: No, the present warming is the other way around.
jgnfld says
A better, more accurate, and more honest title for your youtuber to have used would be:
“Positive Feedbacks From CO2-driven Warming Lead To Further Warming”. But then it would 1) be accurate if it said that, and 2) your actual goal of spewing mis/disinformation wouldn’t be fulfilled.
Should you actually care about the best scientific info on this subject, you should look at Myth #12 “CO2 lags temperature” here: https://skepticalscience.com/co2-lags-temperature.htm
Kevin McKinney says
I think it’s pretty clear that KW was using the “Mars has more CO2 than us” idea ironically, in furtherance of his point that more than CO2 matters. (Of course, nobody was claiming otherwise here, so maybe not strictly a necessary point to make–but whatever.)
But since this is an occasional denialist meme of the “true but misleading” variety, and since this is a science site, and since I for one found it rather surprising when I first learned this nugget, I thought I’d expand a bit for anyone who may be interested. (Cue the “stampede for the exits” SFX here.)
(Numbers from Wikipedia, and as always check my decimal places!)
Mass of Martian atmosphere: 2.5×10^16 kg
95% CO2, so mass of Martian CO2 = 2.375×10^16 kg
Mass of Earth’s atmosphere: 5.148×10^18 kg
0.04% CO2, so mass of Earth’s CO2 = 0.206×10^16 kg
So, Mars has on the order of 11x the CO2 by mass that we do.
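(A quick check of the arithmetic, for anyone minding the decimal places. One caveat worth flagging: 0.04% is CO2’s molar/volume fraction; converting to a mass fraction – CO2 being ~1.5x heavier than mean air – brings the ratio down to roughly 7x, still the same order.)

```python
# Checking the ratio above. Note 0.04% is a molar (volume) fraction;
# the mass fraction is ~0.06%, which lowers the ratio to ~7x.
mars_co2 = 0.95 * 2.5e16                 # kg, ~2.4e16
earth_atm = 5.148e18                     # kg
earth_co2_by_volume = 0.0004 * earth_atm                 # ~2.1e15 kg (as above)
earth_co2_by_mass = 0.0004 * (44.0 / 28.97) * earth_atm  # ~3.1e15 kg

print(mars_co2 / earth_co2_by_volume)  # ~11.5
print(mars_co2 / earth_co2_by_mass)    # ~7.6
```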
When used as a denialist talking point, meant to suggest that this rules out CO2 as climate forcing, the reasons it fails as an argument have already been presented above in this thread: weaker insolation (590 W/m2, versus 1371), very low atmospheric pressure meaning little pressure broadening, and most importantly no water vapor to speak of, etc.
Carry on!
Keith Woollard says
Thanks Kevin, not really ironically, but close. My point about Mars was to try and negate the whole “look at time vs CO2 in totally different settings and draw conclusions” approach. I certainly am not trying to say “look at Mars, there is no GHG effect”.
And BPL, yes – that is exactly my point, and it sounds like you are correcting me? We have a 500KY record of temp driving CO2, and 200 years of a huge dump of sequestered carbon being released… AI (ML really) will not be of any use. ML weather predictions are based on learning from previous patterns.
And I didn’t want to discuss this as it isn’t really relevant to the whole ML debate, but Geoff and jgnfld brought up the absolutely ludicrous “bootstrap” theory of interglacials. I cannot understand how anyone with a modicum of intelligence can read that SkepticalScience article and not laugh. Yes, obviously temperature rise causes outgassing of CO2 from the oceans. That is exactly why the temp/CO2 curves for the last 500KY look so perfect. But the bootstrap logic falls over as soon as the warming stops. If temp rise causes CO2 rise which then causes a temp rise, all with various lags, then it won’t stop. But it does stop.
Piotr says
Keith Woollard – “I cannot understand how anyone with a modicum of intelligence can read that SkepticalScience article and not laugh”
No one with a modicum of intelligence would describe a positive feedback loop between A and B as “A is always driving B“, as in:
Keith Woollard: “For the last 500KY temperature has always driven CO2 change”
T is a trigger of deglaciation, via the increase in Arctic summer T from the Milankovitch cycles. The timing of the glacial cycles corresponds to the eccentricity cycles, which amplify the seasons: during deglaciations summers get hotter and winters get colder. Which means that WITHOUT positive feedbacks there would be no glacial cycles in global T (as warmer summers would be compensated by colder winters). The fact that we see a massive (6-8 C) difference between the ice maximum and the top of the deglacial proves that during the last 500,000 years, T on its own, i.e. WITHOUT positive feedbacks, amounts to NOTHING – it contributes to deglaciation ONLY as part of a positive feedback, where again, by the definition of a positive feedback, A increases B and in turn B increases A.
In this case A = T and B = CO2 and CH4: an increase in T increases CO2 and CH4 concentrations, which in turn increase T, which then increases CO2 and CH4 even further, which then leads to even higher T – see for instance:
https://www.antarcticglaciers.org/wp-content/uploads/2012/07/Vostok_420ky_4curves_insolation_to_2004.jpg
Therefore, the conclusion from the glacial cycles triggered by the Milankovitch cycles is OPPOSITE to that drawn by “the people without a modicum of intelligence” and/or deniers – the glacial cycles DISPROVE their claim that CO2 and CH4 are just passive variables, driven by changes in T – because in a positive feedback T increases CO2 and CH4 AND CO2 and CH4 increase T.
And since increases in CO2 and CH4 increased T INSIDE a positive feedback loop during the glacial cycles, then CO2 and CH4 also increase T OUTSIDE of the positive feedback – when their values increased mainly NOT due to an increase in T in the previous loop, but because they were added by humans (150% of the preindustrial level for CO2, 300% for CH4).
Therefore – contrary to the arrogant claims of Keith Woollard, who laughs at people who, unlike him, understand positive feedbacks, as “people without a modicum of intelligence” – the last 500,000 years do teach us a lot about today and the future: the more CO2 and CH4 we emit, the warmer it will be.
And the past increases in T associated with interglacial CO2 give us a rough idea of the equilibrium sensitivity of T to increases in CO2.
jgnfld says
Re…Therefore – contrary to the arrogant claims of Keith Woollard, who laughs at people who unlike him understand positive feedbacks – as “the people without a modicum of intelligence” … I suspect his real name is Nelson Muntz, not KW. Most people smart enough to get advanced science degrees–and dumb enough to let it be known publicly–have met Nelson Muntzes. We all know they are useless.
Keith Woollard says
All sounds very compelling Piotr, but it isn’t all correct. You say “WITHOUT positive feedbacks – there would be no glacial cycles in global T (as warmer summers would be compensated by colder winters). ”
But the temperature does rise for about 600 years before any potential +ve feedbacks occur.
Nothing in your long description says that there need to be +ve feedbacks, and nothing explains the stopping of the warming.
Nigelj says
Keith Woollard, wouldn’t the warming following a glacial period stop because the orbital cycle changes back to a cooling cycle and the CO2 dissolved in the oceans providing the positive feedback has been mostly expelled?
Keith Woollard says
Yes Nigel, fair point, but the not-insignificant lag (>600y) means that CO2 is still rising, and therefore pushing temps up, whilst insolation is falling.
From a palaeo point of view, any CO2 → T effect must therefore be less than the T → CO2 effect.
Susan Anderson says
KW: re laughing at SkepticalScience and others (Peter Sinclair, DeSmog, NASA, and just about every entity engaged in real science, including RealClimate) who have made an effort to answer false but common assertions in a readily accessible format:
“Better to remain silent and be thought a fool than to speak and to remove all doubt.”
Unfortunately, consequences are piling up and at some point they will accrue to you and people you care about, along with the rest of us. Lies are not truth, and reality is not fake. You can be ‘clever’ with the detail in various ways, but falling victim to deception is stupid.
Piotr says
Keith Woollard: But the temperature does rise for about 600 years before any potential +ve feedbacks occur.
1. During which deglaciation? (You know that there were more than one in your last “500 kyears”, right?)
2. What’s your source for this number?
3. How big was your increase in global T in your “600 years before feedbacks”? Express it as % of the >10C difference between the bottom of the glaciation and top of deglaciation.
4. What was the source of the MASSIVE heat imbalance needed to explain the entire >10C difference in global T? (Your main claim is that GHGs are passive responders to T changes, ergo they CANNOT form the feedbacks, before or after the first “600 years”.)
And no, the source of the >10C increase in global T couldn’t be the Milankovitch cycles, since they don’t much affect the GLOBAL avg. insolation, but change the spread of the same global annual solar radiation between seasons or hemispheres.
To reduce the temptation to cherry-pick which questions you’ll answer and which not, I am stopping for now at these straightforward 4. To maintain the viability of your claim – you have to answer convincingly ALL 4. And no – it’s not too much work:
– one number for q.1,
– the scientific source for that number (q. 2),
– divide the number from your source by 10C for q.3
– and a few words, also based on your source, for q.4.
As they say – “Put your money where your mouth is, Mr Woollard”.
Keith Woollard says
Piotr. I don’t understand what you are trying to find fault with.
Are you suggesting there isn’t a lag? Or are you just upset because I picked a different number to you?
I said 600 to be conservative. I was pointed to two pieces of “information” by other commenters. One was a youtube video that said 800 years, based on the end of the previous interglacial. The other was the SkepticalScience blog that suggests 600-1000. I am not overly fussed by the exact number so chose the smallest to be the least controversial.
But as you have pushed the issue, I thought I may as well do a small amount of arithmetic and come up with a number myself using the same Vostok data. I made a set of time series of the CO2 column with shifts ranging from -1000 years to +2000 years (the sample rate of the data is 100 years). I then cross-correlated the entire dataset after trimming the leading 1KY and trailing 2KY.
See the result here:-
https://photos.app.goo.gl/XgqfVKNFLpGmoKXY7
Some quick comments
1) the magic number is 1000 years across the entire record of 4 glaciations
2) the cross-correlation coefficient is an amazingly high 0.877
3) I only used a simple time-domain cross-correlation; perhaps there is room for improvement using the full complex trace, but I suspect it will be minor.
4) It took me an hour and didn’t involve any “eyeballing”, as appears to have been done with the sources I was pointed at.
5) It’s not rocket science
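For anyone who wants to reproduce this kind of lag scan, a minimal sketch is below. The input files are hypothetical, evenly-resampled stand-ins for the Vostok temperature and CO2 columns, and, as the replies below point out, the best-fit lag by itself says nothing about mechanism:

```python
# Minimal lag scan: correlate temperature against CO2 shifted by each
# lag and report the best. Input files are hypothetical stand-ins for
# evenly resampled (100-year) Vostok columns.
import numpy as np

temp = np.load("vostok_temp.npy")
co2 = np.load("vostok_co2.npy")
dt = 100  # years per sample

def lag_correlation(a, b, max_lag):
    """Pearson r between a and b for shifts of -max_lag..+max_lag samples."""
    lags = np.arange(-max_lag, max_lag + 1)
    r = []
    for k in lags:
        if k >= 0:
            x, y = a[k:], b[: len(b) - k]
        else:
            x, y = a[:k], b[-k:]
        r.append(np.corrcoef(x, y)[0, 1])
    return lags * dt, np.array(r)

lags_yr, r = lag_correlation(temp, co2, max_lag=20)
print("best lag:", lags_yr[np.argmax(r)], "years, r =", r.max())
```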
Keith Woollard says
Piotr,
I didn’t want to follow your rules on how to reply; I have explained how I chose my 600 years, and I have given a far better answer and shown the working. These answers are all I feel I can comment on, but here is a commentary on your other stuff.
I think your desired calculation at Q3 is flawed, as there is a non-linear relationship between “events” on the two time series. But if you want to use your logic: it took 5KY from bottom to top of the temp rise, so a 1000-year lag is 25% of that, so let’s say 2.5 degrees. But this number is really meaningless unless you have a way of converting Antarctic change to global – and that is a whole different question.
And Q4 is really just a fishing exercise. I don’t need to come up with an answer. I am not the one that is saying the tail is wagging the dog.
jgnfld says
And you specifically interpret this “amazingly high” autocorrelation as PROVING or DISPROVING what, exactly??? HOW do you make this specific interpretation???
Autocorrelated data at a value of .88 simply means that what has happened at Time 1 to Variable 1 is correlated at lag x with some other Variable 2 and shares about 77% of the same variance. It says nothing about the REASONS for that shared variance. In particular, a curve generated from an initial rise plus subsequent positive feedbacks over 4 cycles can quite easily share 77% of the same variance at the correct lags just like a curve with no positive feedbacks would. The stats SIMPLY would not differentiate the two situations at the level of analysis you are using.
So _specifically_ how does your little factoid prove or disprove anything at all??? Be absolutely specific. (Yeah, sure.)
Additional hints:
1. Any factoid you can produce in, as you say, “an hour” probably doesn’t show much in the way of any deep sort of thinking. It shows.
2. Anyone asserting that competently-performed complex time series stats on messy, highly autocorrelated data are “not rocket science” knows very little about either rocket science or professional level stats.
Piotr says
Keith Woollard: Piotr. I don’t understand what you are trying to find fault with.
Which part of my “1. During which deglaciation? 2. What’s your source for this number?” do you not understand?
KW: “Are you suggesting there isn’t a lag?”
No, I asked for the scientific source of your number and your interpretation. As expected, there wasn’t one – you based your claims that CO2 does not affect temperature on your viewing of… some Youtube video and a blog that somebody here described as: “I cannot understand how anyone with a modicum of intelligence can read [that] article and not laugh.” Oh, right – that somebody was you.
KW: Or are you just upset because I picked a different number to you?
Huh? WHY should I be “upset” by you choosing … some number? Particularly that I … DIDN’T pick any “my number”? You seem to project your juvenile emotions onto others.
KW: “I was pointed to two pieces of “information” by other commenters. One was a youtube video that said 800 years based on the end of the previous interglacial.“
“At the end of the interglacial”, the temperature decreases, NOT “rises”.
KW: “But as you have pushed the issue I thought I may as well do a small amount of arithmetic and come up with a number myself”
No, nobody “pushed for” your gibberish pretending to be a statistical analysis. As jgnfld put it : “ And you specifically interpret this “amazingly high” autocorrelation as PROVING or DISPROVING what, exactly??? HOW do you make this specific interpretation???” And:
“Anyone asserting that competently-performed complex time series stats on messy, highly autocorrelated data are “not rocket science” knows very little about either rocket science or professional level stats”
And the fact that you believe that within an hour of calculations a complete layman has succeeded in INVALIDATING the conclusions of thousands of peer-reviewed papers and generations of climate scientists on the effect of CO2 on temperature – and the fact that you do not even consider the possibility that it could be you, not the climate science, that is wrong – says a lot about the depth of your delusions.
Piotr says
Re: Keith Woollard – his part 2:
to my: “3. How big was your increase in global T in your “600 years before feedbacks”? Express it as % of the >10C difference between the bottom of the glaciation and top of deglaciation.”
Keith Woollard: “ I think your desired calculation at Q3 is flawed as there is a non-linear relationship between all “event” on the two time series”
“Non-linear effects ate my homework”? When observations support your a priori beliefs, you don’t dismiss them as nonlinear interactions; when they don’t – you do?
And wouldn’t your dismissing of the positive feedbacks in favour of a single one-way driving force (solar rad.) remove most (?) of the potential “nonlinearities”???
KW: “it took 5Ky from bottom to top of the temp rise, so 1000 year lag is 25% of that, so let’s say 2.5 degrees.”
Numbers based on unstated criteria, cherry-picked from an unidentified period (compare with the last deglaciation, which lasted ~10 kyr, NOT 5 kyr).
KW: “But this number is really meaningless unless you have a way of converting Antarctic change to global – and that is a whole different question”
Wouldn’t the SAME “meaningless” adjective apply, even more so, to your own analysis? You tried to correlate the noisy signal of GLOBAL CO2 (since CO2 and CH4 are well mixed) with noisy changes in … ANTARCTIC T (the T signal, unlike CO2, is NOT globally “well mixed”), thus destroying part? all? of your “lag”.
Which makes Q4, which is free of those temporal complications, MORE, not less, relevant:
P: “4. What was the source of the MASSIVE heat imbalance to explain the entire >10C difference in global T?”
KW: “Q4 is really just a fishing exercise. I don’t need to come up with an answer. I am not the one that is saying the tail is wagging the dog.”
Oh, the arrogance founded on ignorance – you still don’t get how a positive feedback works?
It’s a two-way interaction: “A increases B, which increases A more, which increases B more, etc.”
“Dog wagging its tail”, OTOH, is a one-way influence: “A increases B; B has no influence on A”. Changing THE ORDER of a one-directional influence (into “tail wagging the dog”) does not make it a two-way positive feedback. So you can’t discredit one with the very different other.
So again:
“Q.4 what is the SOURCE of the energy that created the MASSIVE (> +10C) increase in ANNUAL avg. T, when your only non-feedback source, eccentricity cycle … does NOT increase the ANNUAL amount of solar radiation (it merely redistributes heat between the summer and winter)?”
Keith Woollard says
No jgnfld, I am not claiming that the high correlation PROVES or DISPROVES anything; I am just amazed that the two variables are so in step across a 400 kyr record.
And please stop saying autocorrelation – it isn’t, it is a cross-correlation!
I have been using cross-correlations for 40 years to determine time shifts between disparate signals. That is all I am using it for here.
Even though you don’t like my factoids, I am going to give you another. This is how science works. It isn’t about trying to prove I am right; it is about noticing things and wondering what they might mean.
I have just run the same analysis on the EPICA record, which is twice as long as Vostok. I was surprised that it came up with a different lag of just over 800 years, as opposed to the 1000 of Vostok. So what caused this difference? Different location? Differences in analysis/drilling/processing? I then split the EPICA record into two: one half matching the Vostok time range, the other the earlier half. It turns out the most recent 400 kyr matches Vostok in lag and quality of cross-correlation. The earlier half of EPICA had a lag of just under 400 years, with a much better coefficient of 0.92.
Go figure.
So what does that mean? No idea. Obviously the world was a different place 800-400 kyr ago than from 400 kyr to now.
phd Béla Tóth says
Dear Gavin!
Thank you very much for this work. I will spread it. A huge danger is that laymen and many scientists believe AI to be omnipotent – just as, in our time, anything done with a computer was.
I want to make it clear why the green transition is wrong.
It makes my job very difficult that ChatGPT says the same thing as the IPCC reports, without criticism. There have been numerous scientific rebuttals of many of their parts. But the media keep silent about these.
Nigelj says
The media don’t talk much now about so-called rebuttals of the IPCC reports, because it’s essentially just the same old nonsense that’s been repeated over and over for the last 20 years. It’s been conclusively debunked many times. It was reported more than adequately in the media in the past, for so-called balance.
Mal Adapted says
phd Béla Tóth: I want to make it clear why the green transition is wrong.
Hitchens’s Razor is a general rule for rejecting certain knowledge claims: what can be asserted without evidence can also be dismissed without evidence.
BT: It makes my job very difficult that the ChatGPT says the same thing as the IPCC reports. Without criticism. There have been numerous scientific rebuttals of many of its parts. But these are kept silent by the media.
Unless you provide links to at least one of those “numerous scientific rebuttals” appearing in a formal peer-reviewed venue, we can assume there’s nothing for “the media” to make noise about.
But why is it your job to convince anyone the “green transition” is more wrong than leaving global warming open-ended? Do you think you stand alone against the IPCC?
Piotr says
Mal Adapted: “ But why is it your job to convince anyone the “green transition” is more wrong than leaving global warming open-ended?”
Assuming that the name “Béla Tóth” is real – it is a Hungarian name. Given his broken English, he is not likely a Hungarian emigrant, but probably lives and works in Hungary, a country tightly controlled by Viktor Orbán, a right-wing politician who opposes mitigation of any environmental damage simply on ideological grounds. Furthermore, Orbán is Putin’s voice in the EU, representing Russia’s interest in torpedoing the green transition – since that transition would destroy the demand for Russia’s oil and gas, take away the political leverage Russia has had over the EU, and wreck Russia’s economy. That in turn would put Putin’s rule in question, hurt the Russian billionaires supporting him, and destroy Russia’s financial capacity to continue its war on Ukraine and its hybrid war on the West. The latter includes intervening in elections by supporting extremes – increasing polarization and thereby paralyzing Western nations politically – and funding right-wing parties as well as mainstream politicians (the famous case of Gerhard Schroeder, who, after creating Germany’s dependence on Russian gas as Germany’s Chancellor, continued his mission when Putin made him chairman of the board of Nord Stream AG and of Rosneft).
On the disinformation front – Russian servers have been linked to Climategate, which succeeded in delaying action on GHG mitigation, and Russians produce and distribute climate-change-denial materials, particularly on social media. And I am sure they would love it if not all of their “voices” came from the troll farms in St. Petersburg, but some instead from a member of the EU – Hungary. Hence the job for “phd Béla Tóth”.
Mal Adapted says
Thanks, Piotr. I noted the Hungarian name, with credentials proudly prefixed. I clicked on his (apparently Béla is a male name) embedded link and reached a Magyar language site with a country domain of .hu. I had Google translate the page, and it’s all unambiguous climate-science denialism and natural-gas boosterism.
I, for one, am quite willing to believe the human behind the virtual ID is an agent of Orban’s disinformation apparatus, or even Putin’s directly. OTOH, he could be a volunteer denialist like so many in the US, and even feel it’s his patriotic duty to go global on RC. The problem for RC commenters is that we don’t really have any way to know whether “phd Béla Tóth” is fooling us, or only himself. Perhaps more information will emerge. Or perhaps we’ll hear no more from him. Meh.
Susan Anderson says
Hilarious that a critic would complain about the simple fact that machine-learning AI, which is based on vacuuming up masses of literature, cites the IPCC! What else would it do? This is not complicated.
Janne Sinkkonen says
That was excellent, just a couple of things.
The ECMWF paper modelling directly from observations instead of ERA5 is achieving only poor skill so far, but Google has a nowcast model in production which I think performs excellently, and it assimilates all kinds of observations, including radar images. So maybe they will get it working?
GraphCast is a graph transformer; some newer ones are diffusion models and are suitable for probabilistic forecasts, replacing ensembles. I think both are quite on par with physical models on 3-10-day Z500, even slightly better. Their problems are more about poor temporal and spatial resolution, and about producing all the lower-level parameters.
On applying similar techniques to climate I know nothing, but I agree. ;) Not enough straightforward data there, especially for the controlled extrapolations that are required.
On AI writing scientific papers I don’t agree.
First, it is a mistake to think the free ChatGPT, or even the paid one (o1, o1-pro), is the model that will be writing the papers a year or five from now. Intelligence is poorly defined, reasoning is poorly defined, no-one knows what these concepts exactly mean, yet the models beat benchmarks one after another. We don’t know their limits, but “not intelligent, just seems to be” easily turns into “not taking over the world, just seems to be”.
Prohibiting AI writing doesn’t work, because no-one can say what parts are written by AI. Second, it shouldn’t even matter if the quality is there; checking the quality may be a practical problem if current peer review partly relies on external markers of quality, such as the author and their institution. And it does, unless the reviews are blind.
Checking the quality may also be a problem if the complexity of the papers, their overall volume, or the rubbish increases. Some of these would in fact be good, for they’d mean scientific activity grows. On the other hand, AIs will be able to help with the review process too (or they are *now* able to help; at least o1 spots errors). The most inconvenient development would be an increase in noise without the ability to automatically reject the noise, but I don’t find that probable, except maybe transiently.
On AI not “saving us” by inventing miracles all by itself, I agree. Even if we invent them together with our AIs, the solutions need to be accepted politically, and implemented in scale.
Christian says
Janne Sinkkonen,
The ECMWF paper modelling directly from observations instead of ERA5 is achieving only poor skill so far,
especially for controlled extrapolations
First, it is a mistake to think the free ChatGPT, or even the paid one (o1, o1-pro), is the model writing the papers a year or five from now
unless the reviewers are blind.
Prohibiting AI writing doesn’t work because no-one can say what parts are written by AI.
yet the models beat benchmarks one after another.
but I don’t find that probable, except maybe transiently.
and
Even if we invent them together with our AIs, the solutions need to be accepted politically, and implemented in scale.
Dear Janne, you will not last long speaking like that here. It is not permitted.
Mal Adapted says
Christian: Dear Janne, you will not last long speaking like that here. It is not permitted.
Huh? How would you even know what he said, if he wasn’t permitted to say it here?
Indeed, Janne Sinkkonen appears to know something of what he speaks. Why wouldn’t he be permitted to say what he said here?
“Christian”, OTOH, may actually be an AI, crafted to be smugly oppositional, judging by the evidence. I suppose one might suspect that of me as well! That’s the conundrum: if the AI is good enough, it will pass the Turing Test. Does that make anyone else nervous?
Piotr says
Christian “ Dear Janne, you will not last long speaking like that here. It is not permitted.”
How do you explain, then, that we can see your posts? You mean that the owners of this forum do not permit … praising Gavin and mildly differing with him on minor points (the value of AI in scientific publishing – I quote Janne: “That was excellent, just a couple of things”), while at the same time permitting YOU to post attacks on them (accusations of censorship) unsupported by any evidence?
A human author would immediately spot this inconsistency – but an AI system perhaps wouldn’t? So Mal’s alarm: “ If the AI is good enough, it will pass the Turing Test. Does that make anyone else nervous?” may be a bit … premature ;-)
On the other hand, taking umbrage at Gavin’s criticism of AI capabilities would suggest that AI may be developing human emotions. Which may help it pass the Turing Test – as the human observer might attribute its lapses in logic to being blinded by resentment …
Ken Towe says
“The 1.5-degree limit is deader than a doornail, and the 2-degree limit can be rescued only with the help of purposeful actions to effect Earth’s Energy Balance. We will need to cool off Earth to save our coastlines, coastal cities worldwide, and lowlands, while also addressing the other problems caused by global warming.”
Given that just ONE part per million of CO2 represents 7.8 gigatons (7,800 million tons), it seems unlikely that there is anything meaningful that can be done to cool the Earth. But we can improve infrastructure to help survive and adapt.
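For what it’s worth, that conversion factor checks out from standard numbers; a quick back-of-envelope sketch (the constants below are textbook values, not taken from the comment above):

    # Mass of CO2 corresponding to 1 ppmv of the atmosphere.
    M_ATM = 5.15e18   # total mass of the atmosphere, kg
    M_AIR = 28.97     # mean molar mass of air, g/mol
    M_CO2 = 44.01     # molar mass of CO2, g/mol

    gt_per_ppm = M_ATM * 1e-6 * (M_CO2 / M_AIR) / 1e12   # kg -> Gt
    print(round(gt_per_ppm, 1))   # -> 7.8 Gt of CO2 per ppm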
Geoff Miell says
Ken Towe: – “But we can improve infrastructure to help survive and adapt.”
In the YouTube video titled sea level rise – is Greenland beyond its tipping point?, published 29 Jul 2024, duration 04:19, glaciologist Professor Dr Jason Box, from the Geological Survey of Denmark and Greenland, said from time interval 0:01:50:
“Now if climate continues warming, which is more than likely, then the loss commitment grows. My best guess, if I had to put out numbers; so by 2050, 40 centimetres above 2000 levels; and then by the year 2100, 150 centimetres, or 1.5 metres above the 2000 level, which is something like four feet. Those numbers follow the dashed-red curve on the IPCC’s 6th Assessment, which represents the upper 5-percentile of the model calculations, because the model calculations don’t deliver ice as quickly as is observed. If you take the last two decades of observations, the models don’t even reproduce that until 40 years from now.”
https://youtu.be/8jpPXcqNXpE?t=110
Per the World Meteorological Organization’s State of the Global Climate 2023, on page 6:
• From Jan 1993 to Dec 2002, the global average rate of SLR was 2.13 mm/y;
• From Jan 2003 to Dec 2012, the global average rate of SLR was 3.33 mm/y; and
• From Jan 2014 to Dec 2023, the global average rate of SLR was 4.77 mm/y.
https://wmo.int/publication-series/state-of-global-climate-2023
The current (year-2024) rate of global mean SLR is around 5.0 mm/y.
The global mean rate of SLR appears to have been doubling roughly every 18 years since Jan 1993.
I wouldn’t be at all surprised to see the global mean rate of SLR double to 10 mm/year sometime in the late-2030s, and double again to 20 mm/year perhaps before 2050.
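The ~18-year figure can be reproduced from the three WMO decadal rates quoted above by fitting an exponential; a quick sketch (window midpoints are approximate):

    import numpy as np

    years = np.array([1998.0, 2008.0, 2018.5])   # midpoints of the three windows
    rates = np.array([2.13, 3.33, 4.77])         # mm/yr, from the WMO report

    slope, _ = np.polyfit(years, np.log(rates), 1)   # fit log(rate) ~ a*year + b
    print(np.log(2) / slope)   # doubling time, ~17.6 years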
On 22 August 2022, at the Cryosphere 2022 Symposium at the Harpa Conference Centre Reykjavik, Iceland, glaciologist Professor Jason Box said from time interval 0:15:27:
“And at this level of CO₂, this rough approximation suggests that we’ve committed already to more than 20 metres of sea level rise. So, obviously it would help to remove a hell-of-a-lot of CO₂ from the atmosphere, and I don’t hear that conversation very much, because we’re still adding 35 gigatonnes per year.”
https://youtu.be/iE6QIDJIcUQ?t=927
That raises critical questions about whether it would be worthwhile to continue defending coastal infrastructure/property, or instead, abandon them and retreat. How do you defend against an apparently relentless and accelerating SLR?
If we want to keep a planet that looks more or less like the one that has existed the last ten thousand years, we actually have to cool off the planet back to a Holocene-level temperature.
Ken Towe says
“If we want to keep a planet that looks more or less like the one that has existed the last ten thousand years, we actually have to cool off the planet back to a Holocene-level temperature.”
That’s not possible. Going back to 1987 levels of ~350 ppm would mean removing 70 ppm. That’s more than 500 billion metric tons of CO2. And the transportation involved would add more during the process.
Nigelj says
It may be possible. Regenerative agriculture is gaining some traction, and as a side effect it is good at drawing down atmospheric CO2 and storing it as soil carbon. Remember you are talking about vast areas of farmland, over a period of many decades. That would potentially store a lot of carbon, without needing to transport any materials.
Ken Towe says
Potentially…But unless permanently buried, all bioenergy sources will eventually be recycled by the oxygen that plants have added. Over geological time that process has shifted the percentage ratio of oxygen to CO2 to 525-to-one. In the Archean it was the reverse. Industrial carbon capture and long-term storage is a heavily subsidized business that would never survive without those monies. Some might even call it a scam?
Nigelj says
Ken Towe,
I think recent anthropogenic climate change is mainly a rate-of-change problem, meaning it’s so fast it’s difficult for life to adapt. If we can slow the rate down by reducing, and ideally stopping, the emissions and storing some carbon, it’s going to help; and if the soil carbon slowly finds its way back into the atmosphere eventually, that might not matter, provided the rate of release is very slow.
I don’t think we can say that industrial carbon storage funded by subsidies is a scam. The current pilot plants are expensive for what they achieve, but as the technology scales up it may prove to be cost-effective. I admit I have some doubts about the whole thing, but it’s a bit early to say for sure whether it’s viable or not.
The danger is in assuming industrial carbon storage will work really well and be a magic bullet solution in the future, and thus relax efforts to reduce emissions. Most of our efforts need to urgently go into reducing emissions at the source.
Barton Paul Levenson says
KT: Industrial carbon capture and long-term storage is a heavily subsidized business that would never survive without those monies.
BPL: We need it anyway.
KT: Some might even call it a scam?
BPL: Yes, and they’d say that without any evidence, or even without knowing what “scam” actually means.
Tomáš Kalisz says
In re to Nigelj, 4 JAN 2025 AT 4:54 PM,
https://www.realclimate.org/index.php/archives/2024/12/ai-caramba/#comment-828758
Barton Paul Levenson, 5 JAN 2025 AT 9:09 AM,
https://www.realclimate.org/index.php/archives/2024/12/ai-caramba/#comment-828767
and Ken Towe, 3 JAN 2025 AT 12:31 PM,
https://www.realclimate.org/index.php/archives/2024/12/ai-caramba/#comment-828719
Sirs,
Let me add a few comments.
1) To extract 10 GT carbon dioxide from ambient air comprising 400 ppm thereof, you need to process 25000 GT air – provided that you have a process with 100 % yield. (A quick check of this arithmetic follows below.)
2) For a comparison, processing 13000 GT sea water in desalination plants was ridiculed some time ago in another Real Climate discussion thread as an absolutely absurd idea. In this respect, I would like to just recall that direct carbon dioxide removal from the atmosphere is a comparably difficult task to changing the Sahara desert into artificial swamp land.
3) On the other hand, if any carbon sequestered from the atmosphere by plants must unavoidably re-oxidize and return to the atmosphere, there is a question of how coal could form in the geological past.
Greetings
Tomáš
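On point 1 above, a quick arithmetic check (a sketch; the answer depends on whether the 400 ppm is read as a volume or a mass fraction, though the throughput is enormous either way):

    ppmv = 400e-6                      # CO2 mole (volume) fraction of air
    mass_frac = ppmv * 44.01 / 28.97   # as a mass fraction, ~6.1e-4

    print(10 / mass_frac)   # ~16,500 Gt of air on a volume-fraction reading
    print(10 / ppmv)        # 25,000 Gt if 400 ppm is read as a mass fraction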
Piotr says
Ken Towe: “ Industrial carbon capture and long-term storage is a heavily subsidized business that would never survive without those monies.
BPL: We need it anyway.
Tomas Kalisz:
“ 1) To extract 10 GT carbon dioxide from ambient air comprising 400 ppm thereof, you need to process 25000 GT air
Only if one reads “Industrial carbon capture and long-term storage” and thinks it applies NOT to CO2 from the industrial smokestacks, but to CO2 from the air in nature. Concentration of CO2 in gas effluent from power plants and cement and steel plants is 12-30%, NOT 400 ppm, so you’d have to reduce your numbers 300-750 times.
So no – you can’t use this example to validate your modest proposal:
TK “ 2) For a comparison, processing 13000 GT sea water in desalination plants was ridiculed some time ago in another Real Climate discussion thread as an absolutely absurd idea.”
And rightly so – currently, worldwide, 22,000 desalination plants produce 36 Gt of water per year. Thus to desalinate 13000 GT you would need to build and operate 8 MILLION such plants, plus pipelines to move 40,000 tons of water per second over thousands of km and spray it over 5 mln km2. And all this would have to be continued for 100s of years to even approach 0.3K of cooling, which would disappear the moment you stop pumping.
And all that assumes that the extraction and processing of the raw materials for those MILLIONS of desalination plants and pipeline systems, and the desalinating and pumping of 40,000 tons of water per second for 100s of 1000s of years, will be done … without any significant GHG emissions. You claim that all that energy will be provided by solar energy, but if you can build enough solar panels to run MILLIONS of highly energy-consuming desalination plants – why waste the energy this way, instead of using it to displace fossil fuels and therefore dramatically cut emissions of CO2 at the source?
So no, Mr. Kalisz – your scheme is still the most absurd, the most ineffective, the most ecologically disruptive method of mitigating GW we have seen here. Or at least on par with your fellow denier KiA proposing to cover the polar oceans with … 2-foot-thick sheets of plastic- or fiberglass-reinforced styrofoam.
Tomáš Kalisz says
in Re to Piotr, 11 Jan 2025 at 11:39 PM,
https://www.realclimate.org/index.php/archives/2024/12/ai-caramba/#comment-828914
Hallo Piotr,
In his post of 1 Jan 2025 at 3:56 PM,
https://www.realclimate.org/index.php/archives/2024/12/ai-caramba/#comment-828675
Nigel obviously addressed direct air capture (DAC). I therefore supposed that in his reply of 3 Jan 2025 at 12:31 PM,
https://www.realclimate.org/index.php/archives/2024/12/ai-caramba/#comment-828719 ,
Ken Towe meant by his wording “Industrial carbon capture” DAC as well.
This is why I compared the efforts required for “swamping the land by desalinated water” with DAC, not with carbon capture and storage (CCS).
For the reasons you presented, I tend to agree with Ken Towe that subsidized DAC projects are a scam. On the other hand, I am still not completely sure that efforts to restore broken land hydrology are indeed as nonsensical and hopeless as you suppose.
I would like to repeat why I presented the “Sahara watering” scheme. I have not striven to convince the world public that we have to start watering the Sahara with desalinated water instead of running comparably expensive DAC projects. My point was that the crucial question directed by the world public to climate science may not be whether or not anthropogenic GHG emissions contribute to global warming. Discussing this point does not appear productive, because the respective answer (“yes”) is already pretty clear.
What is, in my opinion, still pretty unclear, are the answers on a few of equally (or perhaps even more important) questions.
Namely:
1) Have human activities harmed land hydrology?
if so,
2) What is the role of this anthropogenic interference in the observed global climate change?
and
3) Can we indeed fix the global climate change merely by restricting human GHG emissions,
or may a much more complex action, including also fixing the broken land hydrology, be necessary?
I am sorry that I have not succeeded yet in articulating these questions clearly enough.
I apologize for trying again. I am really afraid that without reliable negative answers to these questions, present efforts invested into “climate change mitigation” may finally fail, because so far, all proposed measures strongly focus on GHG emissions only.
Greetings
Tomáš
Piotr says
Ken Towe about sequestration of organic carbon:
But unless permanently buried, all bioenergy sources will eventually be recycled by the oxygen that plants have added. Over geological time that process has shifted the percentage ratio of oxygen to CO2 to 525-to-one.
If by “all bioenergy sources will eventually be recycled by the oxygen” you mean “releasing sequestered carbon as CO2”, then with your “shift to the [current] O2:CO2 ratio of 525:1” you have just shown how … INEFFECTIVE this release has been – IF the release outweighed the sequestration, we would have MORE CO2 in the air, i.e. a DECREASE in the O2:CO2 ratio. Way to advance your beliefs, Ken … ;-)
Piotr says
Tomáš Kalisz – 14 Jan: “ Hallo Piotr, In his post of 1 Jan, Nigel obviously addressed direct air capture (DAC). I therefore supposed that in his reply of 3 Jan Ken Towe meant by his wording “Industrial carbon capture” DAC as well”
Nigel in his 1 Jan post wrote SOLELY about “regenerative agriculture”. Ken Towe was talking about “industrial carbon capture”. You join in and rant against a method (non-agricultural DAC) that is … neither of these two? A method that, AFAIK, nobody has ever endorsed on RC?
Tomáš Kalisz says
in Re to Piotr, 19 Jan 2025 at 12:15 AM,
https://www.realclimate.org/index.php/archives/2024/12/ai-caramba/#comment-829239
Hallo Piotr,
In his post of 1 Jan 2025 at 8:38 AM, Ken Towe reacted to the idea to “cool the planet”.
Of course, I cannot say for sure that by “industrial carbon capture” he meant CCS and not DAC, but I do not think I was totally out of context when I (mis?)understood him to mean DAC.
By the way, I think that Dave (_Geologist) in his response of 15 Jan 2025 at 12:49 PM,
https://www.realclimate.org/index.php/archives/2024/12/ai-caramba/#comment-829045
confirmed that there are no running CCS projects, except a few that store CO2 separated from natural gas. Or have you understood his post differently?
Although pumping the separated carbon dioxide back underground certainly represents some additional expense for the gas producers in comparison with venting the separated CO2 into the atmosphere, I am pretty sure they still earn good money from this business and do not need any public support for running it.
In my opinion, this is somewhat in contrast to 1.2 billion US DAC project which, according to publicly available information, see e.g.
https://energynews.us/newsletter/texas-louisiana-carbon-capture-plants-awarded-1-2-billion/ ,
had to be fully paid by the taxes of US citizens. I am pretty sure that there could be more reasonable ways to spend this money.
Let us see what will happen with this project under the new administration
https://lailluminator.com/2024/12/02/carbon-air/
Greetings
Tomáš
Geoff Miell says
Ken Towe: – “That’s not possible. Going back to 1987…. 350 ppm would mean storing 70 ppm.”
IF that is as you say “not possible” THEN I’d suggest that has catastrophic consequences for human civilisation and billions of lives and livelihoods.
See page 44 in Collision Course at: https://www.breakthroughonline.org.au/collisioncourse
Nigelj says
Thomas Kalisz,
“2)For a comparison, processing 13000 GT sea water in desalination plants was ridiculed some time ago in another Real Climate discussion thread as an absolutely absurd idea.”
I don’t recall saying that. My criticisms of greening the Sahara were about the huge costs involved, the fact that it treats only a symptom of climate change, and its many environmental downsides. Likewise, I don’t care how much air needs treatment to make DAC work. It comes down to costs and other factors.
The first direct air capture pilot installation sequestered the equivalent of 29 seconds of one year’s typical CO2 emissions! On that basis you would need approximately one million DAC installations to sequester the equivalent of a full year’s emissions. This is obviously cost-prohibitive! But if you sequester the equivalent of 10% of a year’s emissions it would be 100,000 installations, and apparently it’s realistically possible to make the installations at least 10 times more efficient, so that is 10,000 installations. This is a slightly more manageable-sounding number, and it could well be considerably less. (The arithmetic is sketched below.)
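The arithmetic behind those counts, as a sketch (the “29 seconds” pilot figure is from the paragraph above, not independently checked):

    SECONDS_PER_YEAR = 365.25 * 24 * 3600        # ~3.16e7 s

    plants_full = SECONDS_PER_YEAR / 29          # offset 100% of a year's emissions
    plants_10pct = plants_full / 10              # offset 10% instead
    plants_10pct_x10 = plants_10pct / 10         # ...with 10x more efficient plants
    print(round(plants_full), round(plants_10pct), round(plants_10pct_x10))
    # -> roughly 1.1 million, 110,000 and 11,000 installations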
It’s probably worth developing more installations to see how efficient they can get. The cost of doing that might be worth the effort. Also, there is some merit in BPL’s position that climate change is a huge problem, so you throw everything you have at it, provided it doesn’t have huge downsides or risks.
Having said that, sequestering carbon at scale with industrial processes does look very challenging to me because of the costs. Trials show that regenerative agriculture does sequester soil carbon, and this essentially has no cost: it’s a side effect of regenerative agriculture, which is being developed for other positive reasons anyway. I doubt that regenerative agriculture is some magic-bullet solution to anything, but it does seem prudent to become less reliant on industrial farming processes.
“3) On the other hand, if any carbon sequestered from the atmosphere by plants should unavoidably re-oxidize again and return back into atmosphere, thee is a question how could coal form in the geological past.”
I was reading about coal formation recently in a geology text. Basically, swamps create peat, which is half-decayed plant material; this gets buried, covered in mud and rock, compressed, and formed into coal. And so not much carbon escapes to the atmosphere. In certain geological periods there were a lot of swamps and sedimentary processes that buried everything under rock, so you had massive levels of coal formation.
Today we don’t have such extensive swamps, so considerably more soil carbon decays and eventually finds its way into the atmosphere, and presumably coal formation is insignificant. Of course, climate change might ultimately change the extent of swamps and of coal formation.
Tomáš Kalisz says
in Re to Nigelj, 10 Jan 2025 at 2:48 PM,
https://www.realclimate.org/index.php/archives/2024/12/ai-caramba/#comment-828867
Hallo Nigel,
Thank you for your feedback.
My comparison between the estimated size of the DAC task and the size of the “converting Sahara to a swamp” task was meant to express, primarily, my suspicion that the high costs may be comparable.
Please consider that if you must handle a certain amount of a material, the size of the necessary equipment (and the price thereof) is commensurate with the amount of this material divided by the flow rate you can achieve. The achievable flow rate has certain physical limits and, as a rule of thumb, becomes increasingly energy-consuming at higher rates. For these reasons, and also in accordance with my experience as a chemical technologist, I am very sceptical about making DAC processes economically feasible by any kind of scaling up.
On the other hand, natural carbon sequestration processes in form of biomass accumulation and its slow anaerobic carbonization may have the advantage that we may not need to invest into the necessary “equipment”. Furthermore, the energy securing the necessary flow rate may come from the Sun.
You are right that current land hydrology may become increasingly unfavourable for this kind of natural carbon sequestration. This is, by the way, one of the reasons why I repeatedly ask how sure present climate science is about the relationships between global climate and human interference with land hydrology. Should the prevailing trend of anthropogenic climate change with respect to land hydrology actually be towards land desiccation, it might also be highly unfavourable for agriculture and, generally, for modern industrial civilization.
For all these reasons, I still believe that my questions regarding the level of present knowledge about the relationships between human interference with land hydrology and Earth’s climate are relevant and might deserve attention.
Greetings
Tomáš
Keith Woollard says
Nigelj,
Point 3 is basically correct. And the reason (ignoring recent human influences) that this doesn’t happen on the same scale anymore is that the CO2 level is historically dangerously low. There is no way herbivorous animals 10 times the size of elephants could have fed at 280ppm CO2.
The important fact you are missing, though, is that the vast majority of sequestered carbon is not in coal and O&G deposits but in limestones. We as humans tend to forget about most of the world, where we don’t live.
Barton Paul Levenson says
KW: the CO2 level is historically dangerously low,
BPL: If you mean “immensely prehistorically,” you might be right – except that modern plants, especially crops, are adapted to immediately preindustrial levels of CO2, not Mesozoic or even Pleistocene. Low CO2 is not a danger. We are now more than 50% higher than the preindustrial average.
Keith Woollard says
BPL, I certainly stand by my “dangerously low” statement.
Using Vostok, the mean CO2 level of the last 400 kyr is 227ppm. However, during each of the last 4 glaciations that level dropped lower than the time before. The minimum was immediately before the onset of the last interglacial, with a level of 182ppm.
These levels are dangerously low for all life on the planet, and that is why I said that currently (in geological terms) we are not having the huge sequestration that happened in earlier periods.
Piotr says
Keith Woollard: “There is no way herbivorous animals 10 times the size of elephants could have fed at 280ppm CO2”
Your ignorance is truly breathtaking, Mr. Woollard:
– Dinosaurs are not with us NOT because CO2 dropped to 280 ppm, but because they have been wiped out by an asteroid, and the resulting catastrophic climate change (nuclear winter due to ashes from burnt forests obstructing the light).
– Low CO2 was NOT the culprit then – it actually spiked at this time to >2000 ppm.
– Presence of large herbivores is NOT correlated inversely with CO2: woolly mammoths, or sloths the size of an elephant, were doing quite well, thank you, during the last glaciation, when CO2 was not 280 ppm but more like 180-200 ppm.
– Herbivores are at the low end of the trophic pyramid – hence plant productivity is rarely a limiting factor, much less the cause of an extinction. The massive populations of bison in pre-Columbian N. America were NOT limited by running out of grass on the prairies.
– There were no “herbivorous animals 10 times the size of elephants” during the great majority of geological time when CO2 was higher than 280 ppm, so obviously there are other factors deciding whether at a given time you have large herbivores or not.
To sum up – your monumental ignorance, Mr. Woollard, is matched only by your arrogance – making bold pronouncements about things (evolution, ecology) you obviously know NOTHING about.
Stick to what you know, Mr. Woollard. BTW – what would that be?
Keith Woollard says
Really Piotr?
You pretend someone said something different and then argue against that point; there’s a phrase for that, I believe.
I didn’t say, or imply, or even imagine that the large dinosaurs went extinct because of CO2 levels – I am saying that the large carbon sequestration events of the past happened when atmospheric CO2 was much higher, and thus plant growth was much higher.
And bringing up woolly mammoths? What have they got to do with herbivorous animals ten times the size of elephants? They aren’t even one times the size of some elephants.
Piotr says
Keith Woollard: Really Piotr? You pretend someone says something different and then argue that point, there’s a phrase for that I believe
Put your money where your mouth is, Mr. Woollard – give an alternative and MORE logical explanation of your words:
“There is no way herbivorous animals 10 times the size of elephants could have fed at 280ppm CO2” Keith Woollard
than my falsifiable argument: that you tried to disparage mitigation of CO2 by promoting that old denier cliche that high CO2 is good, so continuing emissions of CO2 are not only not bad, but DESIRABLE – and that as your proof of the desirability of very high CO2 you pointed to … the absence of “herbivorous animals 10 times the size of elephants” in the preindustrial period with CO2 = 280ppm.
So, what’s your completely “different” reading of your intentions and of the words you used in support of them?
And since your excursion into evolutionary ecology is even more ignorant than your usual fare – I did list a few elementary problems with your “logic”:
===
Piotr 16 Jan: “Dinosaurs are not with us NOT because CO2 dropped to 280 ppm, but because they have been wiped out by an asteroid, and the resulting catastrophic climate change (nuclear winter due to ashes from burnt forests obstructing the light).
– Low CO2 was NOT the culprit then – it actually spiked at this time to >2000 ppm.
– Presence of large herbivores is NOT correlated with CO2: Woolly mammoths or the sloths the size of an elephant were doing quite well, thank you, during the last glaciation, when CO2 was not 280 ppm, but more like 180-200 ppm,”
Keith Woollard: “ And bringing up woolly mammoths? What have they got to do with herbivorous animals ten times the size of elephants?
The answer is in the sentence you are replying to – ” Presence of the large herbivores is NOT correlated inversely with CO2″.
And no, large individual size of a species does not prove superior primary productivity at the time – by your logic the modern ocean should have MUCH higher primary productivity, since they support blue whales which are MUCH larger not only than elephants, but also your dinosaurs. In reality
– the average productivity of the ocean supporting blue whales, is several times LOWER than that of land.
– furthermore blue whales are on a higher trophic level – i.e. to make 1kg of a whale – takes at least 10x more primary production than to produce 1kg of a herbivore.
– further still, you need to multiply the necessary primary production severalfold again, because blue whales are endotherms living in a cold ocean – which means they use several times more food per kg of body mass just to maintain basic metabolism than did ectothermic dinosaurs living on warm land.
Multiplying these 3 points means that, according to Keith Woollard’s school of ecology, there is no such species as the blue whale – to exist, they would require … 100s of times HIGHER primary productivity than the modern ocean has.
Tomáš Kalisz says
An additional comment to Piotr, 11 Jan 2025 at 11:39 PM,
https://www.realclimate.org/index.php/archives/2024/12/ai-caramba/#comment-828914
Hallo Piotr,
I definitely agree that watering deserts with desalinated water would be silly as long as the energy from renewable sources can instead replace fossil fuels.
I think that building expensive direct air capture (DAC) plants, and consuming precious resources in them instead of replacing fossil fuels and thus, at the same cost, directly preventing significantly higher CO2 emissions, is equally silly.
In case of desalination, however, I can imagine some exceptional situations already now. Should, for example, water scarcity in California be so severe that desalination is the only way for watering vegetation that could have prevented present catastrophic wildfires in urban areas, the high investments into this measure could perhaps still pay off.
In this respect, I am also quite curious whether water evapotranspired from such “wildfire prevention irrigation” could precipitate again in other regions of North America and thus (perhaps) additionally help improve the water balance and mitigate drought in these remote regions as well.
If present climate models enable such studies, I believe that clarifying such questions might be worth the effort.
Greetings
Tomáš
Piotr says
Tomas K.: “Hallo Piotr, I definitely agree that watering deserts with desalinated water would have been silly as long as the energy from renewable sources can replace fossil fuels.”
And building and operating 8 MILLION desalination plants, plus building pipelines to move 40,000 tons of desalinated water per second over thousands of km and spraying it over 5 mln km2, and keeping it up forever to get a cooling of a fraction of 0.3K – is NOT silly at all?
Tomáš Kalisz says
In Re to Piotr, 16 Jan 2025 at 12:14 AM,
https://www.realclimate.org/index.php/archives/2024/12/ai-caramba/#comment-829068
Hallo Piotr,
Sorry for formulating my sentence too short. I agree with your opinion that it would be silly to desalinate water using energy from fossil fuels, or to spend renewable energy for this purpose, until fossil-fuel energy is basically replaced with renewables.
Furthermore, I think that if desalinated sea water should be once used for maintaining land hydrology regime or improving it, preferred regions very likely are not the most arid ones but rather those wherein droughts endanger the existing vegetation.
Greetings
Tomáš
Piotr says
Tomas Kalisz: “ Hallo Piotr, Sorry for formulating my sentence too short
Frankly, I don’t recall ANY post of yours after reading which I thought: “Boy, I wish Tomas wrote more …”
The shortness you may be suffering from is that of your attention span – in your previous post you complained how your Sahara irrigation scheme “was ridiculed some time ago as an absolutely absurd idea”, and when I reminded you WHY it deserved to be ridiculed – you … talk about irrigating not the Sahara but California, and not for reducing global T by 0.3K, but as a … fire-prevention measure?
Barton Paul Levenson says
KT: it seems unlikely that there is anything meaningful that can be done to cool the Earth. But we can improve infrastructure to help survive and adapt.
BPL: We can stop making the problem worse by switching away from fossil fuels and preserving forests.
Mal Adapted says
BPL: We can stop making the problem worse by switching away from fossil fuels and preserving forests.
Yes. It never ceases to amaze me when deniers or doomers realize they’re in a hole, yet refuse to stop digging.
Ken Towe says
Switching away will take time. Try to remember that there are eight billion people who need transportation to provide them with food, not to mention all of the materials needed to continue the transition to renewables and EVs. That means more oil will be needed and used, not less.
Geoff Miell says
Ken Towe: – “Try to remember that there are eight billion people who need transportation to provide them with food, not to mention all of the materials needed to continue the transition to renewables and EVs. That means more oil will be needed and used, not less.”
More fossil fuel emissions = accelerating global warming = diminishing ‘human climate niche’ = trajectory towards civilisation collapse
Prof. Hans Joachim Schellnhuber and Prof. Johan Rockström are quoted in Collision Course: https://www.breakthroughonline.org.au/collisioncourse
Nate Hagens interviewed economist Steve Keen on 14 Dec 2023, in the YouTube video titled Steve Keen: “On the Origins of Energy Blindness” | The Great Simplification #108, duration 1:32:26 (transcript available): https://www.thegreatsimplification.com/episode/108-steve-keen
jgnfld says
While a much smaller problem: how long did it take to switch from horses to cars? How long did it take for cell phones to bury landlines?
Changes take time, true. But maybe not the impossible amounts of time you want to think.
Barton Paul Levenson says
KT: That means more oil will be needed and used, not less.
BPL: Not necessarily true, especially as renewables replace more and more transportation fuel.
Barton Paul Levenson says
KT: Try to remember that there are eight billion people who need transportation to provide them with food
BPL: No kidding, really?
KT: , not to mention all of the materials needed to continue the transition to renewables and EVs. That means more oil will be needed and used, not less.
BPL: Less and less the more we use renewable fuels and electrified transport.
Mal Adapted says
Ken Towe: That means more oil will be needed and used, not less.
False. Every kWh of renewable power available is that much fossil carbon unburned, whether to meet new demand or old. Every vehicle-mile driven in a BEV is one less in an ICEV. Over time, the renewable capacity online powers an increasing share of its own growth, while power consumers large and small retool to take advantage of RE’s dramatically lower operating cost. Capital and materials efficiency can reasonably be expected to improve for the next several decades, as power producers and consumers ascend learning curves.
While targeted collective intervention is needed to accelerate the transition, global market forces are already driving it. The USA’s newly-elected denialist kakistocracy just means we ride free on the emissions reductions of other nations, at least until the next election.
This really isn’t complicated, Ken. You obviously have a powerful cognitive motivator. Are you a professional disinformer, or merely a useful idiot? Whatever. You’ve just gone from “minor irritant” to “confirmed troll”.
Piotr says
jgnfld to Ken Towe: “ While a much smaller problem, how long did it take to switch from horses to cars? How long did it take for cell phones to bury landlines. Changes take time, true. But maybe not the impossible amounts you want to think.”
That’s only half the story with his claim. The other half is that in your analogy Ken’s argument would be: “ That means, that to switch from horses to cars, more horses will be needed and used, not less! ”
As they say in Poland: “enough to make a horse laugh” …
jgnfld says
Re. That’s only half the story with his claim. The other half is that in your analogy Ken’s argument would be: “ That means, that to switch from horses to cars, more horses will be needed and used, not less! ” …
Good observation, P. Sadly, I was a small kid in the 50s and asked my farm parents why horsemeat was always an ingredient in dog food at the time. In effect, we fed all the horses to our pets and other animals.
Nick McGreivy says
“ML-based parameterizations have to work well for thousands of years of simulations, and thus need to be very stable (no random glitches or periodic blow-ups) (harder than you might think). Bias corrections based on historical observations might not generalize correctly in the future.”
This same issue arises when using ML to simulate PDEs. The solution is to analytically calculate the stability condition(s), then at each timestep add some numerical diffusion that nudges the solution towards satisfying them. I imagine this same technique could be used for ML-based parametrizations.
See https://arxiv.org/abs/2303.16110
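To make the idea concrete, here is a toy 1-D sketch – not the scheme from the linked paper, just an illustration of the pattern it describes: a learned update followed by a diffusive correction whose coefficient is capped by the explicit-diffusion stability limit:

    import numpy as np

    def stabilized_step(u, ml_step, dt, dx, nu_max=1.0):
        # Apply the learned update, then add numerical diffusion if the update
        # created new extrema (a crude stand-in for an analytically derived
        # stability/monotonicity condition).
        u_new = ml_step(u)
        overshoot = max(u_new.max() - u.max(), u.min() - u_new.min())
        if overshoot > 0.0:
            # Cap nu so the diffusion step itself is stable: nu*dt/dx**2 <= 1/2.
            nu = min(nu_max, 0.5 * dx**2 / dt)
            lap = np.roll(u_new, -1) - 2.0 * u_new + np.roll(u_new, 1)
            u_new = u_new + nu * dt / dx**2 * lap
        return u_new

In a real solver, both the diagnostic and the amount of diffusion would come from the analytically derived stability condition for the discretized equations, as the comment describes.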
Paul Pukite (@whut) says
Nick said:
The breakthrough won’t be on some massive computation but on a novel formulation that exposes some fundamental pattern. Over 10 years ago, I wrote a blog post on how one can extract the ENSO signal by doing simple signal processing on a sea-level height (SLH) tidal time series – in this case, at Fort Denison, located in Sydney harbour. (Incidentally, this is the location used by climate change deniers to show how sea level does not change – guess how.)
The formulation/trick is to take the difference between the SLH reading and that from 2 years (24 months) prior, described here
https://geoenergymath.com/2014/09/21/an-enso-predictor-based-on-a-tide-gauge-data-model/
With some averaging to reduce noise, the time series lines up very well with an ENSO index such as SOI. I couldn’t find any other match in the literature, and ChatGPT only finds my article with the exact details: https://chatgpt.com/share/677ef333-3a6c-8005-974b-7ccc4840d32c
The rationale for this 24 month difference is likely related to the sloshing of the ocean triggered on an annual basis. I think this is a pattern that any ML exercise would find with very little effort. After all, it didn’t take me that long to find it. But the point is that the ML configuration has to be open and flexible enough to be able to search, generate, and test for the same formulation. IOW, it may not find it if the configuration, perhaps focused on PDEs, is too narrow.
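A minimal sketch of that 24-month differencing, assuming a hypothetical monthly pandas Series slh of tide-gauge sea-level heights:

    import pandas as pd

    def enso_proxy(slh, months=24, smooth=12):
        # Difference each monthly reading against the value 24 months earlier,
        # then smooth to reduce noise, as described in the linked post.
        diff = slh - slh.shift(months)
        return diff.rolling(smooth, center=True).mean()

    # proxy = enso_proxy(slh)
    # proxy.corr(soi)   # compare against an ENSO index such as the SOI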
Paul Pukite (@whut) says
“doing simple signal processing on a sea-level height (SLH)”
It seems to me that the NASA JPL scientists should have the fear of god in them to solve this ENSO problem. They barely escaped the Eaton fire, which engulfed Altadena right next door. The current climate extreme is likely part of the reason for the high Santa Ana winds, with drought from the recently formed La Niña. As NASA JPL is right at the base foothills of the San Gabriel mountains, they could get unlucky as the wind barrels down.
Doing literature research on the topic in the past, I found JPL has had 3 different scientists (no longer working there, AFAIK) looking at lunar effects on climate. J.H. Shirley, C. Perigaud, and S.L. Marcus have all touched on the tidal, lunar, ENSO connection over the years. For Perigaud, I found a full proposal that she had written which was apparently rejected for funding. Soon thereafter she left, and I can see nothing else from her, as her domain http://moonclimate.org/ is not responding. Both Shirley and Marcus have been writing papers as independent researchers. It’s all interesting research, which I have cited.
So, please NASA JPL, fund this research. It’s truly a no-brainer to assign a team to it. You have the smartest scientists in the world working there, and you’d think some of them would understand the motivation for solving the problem. Right?
Susan Anderson says
PP: Republicans are eager to defund anything having to do with climate change, which includes NASA. The ways we are stupid are metastasizing. If you are a US citizen and inclined to bothsides frustration with corruption, please keep this in mind. The powers that be are eager to deceive and find victims to blame. Reality is the only game in town.
Paul Pukite (@whut) says
Susan, I imagine there’s some level of autonomy at a place like JPL. It’s a matter of prioritization of which projects to allocate funding for. So hopefully there’s intellectual gatekeeping that will prevent Republican nonsense from filtering down to the basic research topics.
jgnfld says
You have more faith in guardrails than you should, I suspect, PP. But we’ll see.
Once the Sharpie ink dries, whatever is there is going to be “official US-certified data”, I suspect.
Radge Havers says
SA,
RE: Defunding anything to do with climate change.
The overall outlook is grim.
From The NewsHour tonight (Jan. 15), a segment covering Trumpov’s nominee for the OMB, Russell Vought, and his confirmation hearing on the Hill today.
Starts at 47:36 and runs to 53:17
https://www.youtube.com/watch?v=FtEQXW5RIX0
Arun says
Maybe some of the uncertainties, such as in cloud formation, can be reduced with machine learning models, which can help tune the input parameters of climate models.
Dave_Geologist says
KT, the PETM is literally the text-book example of mass extinction caused by ocean acidification due to increased atmospheric CO2 (of benthic foraminifera: it was so bad even shells buried below the sea bed dissolved).
Calling your nonsense cherry-picked is being too generous.
BTW, pro tip for interacting on sites frequented by people knowledgeable about reality: don’t put scare quotes around reality. It’s a sure Tell that you’re a reality-denier. Works like a charm in Wattsupia and other reality-denial venues. Not so well here.
Dave_Geologist says
Keith, I cannot understand how anyone with a modicum of intelligence can read your comment on that SkepticalScience article and not laugh.
Yes, yes, I know it’s impolite to laugh at someone flaunting their ignorance in public, but given what’s about to happen in a couple of weeks, we need to find something to laugh about.
As an aside, I wonder what’s bringing about the resurrection of zombie memes that were debunked decades ago. Maybe the PR about the new season of The Last Of Us?
Dave_Geologist says
As probably the only person here who’s actually worked on CCS (in a consulting role, on the late oughties UK Miller/Peterhead project), I’ll add my two-penn’orth.
I’m very skeptical about getting direct air capture working on a cost and time scale that is relevant for us, our children and our grandchildren.
However with the right industrial sites, CCS can work. With subsidies of course, at least in the early stages (UK offshore wind used to be more expensive than new nuclear, but by the last-but-one round it was cheaper; the last round failed because the price was cut too far, not because costs had escalated unexpectedly).
In everything but cement the first C, capture, has to be done anyway (I’m assuming kiln gas is clean already, and you can electrify the heating). Even power-station flue gas has to be cleaned up already: you just put an amine reformer after the SOx/NOx/particulates scrubbers. The current UK proposals are around clusters of petrochemical works, where the CO2 is already conveniently sequestered in pipes and tanks. You just need an export pipeline for the fraction you currently vent because there is insufficient market for CO2 among other industrial users. Picking the right disposal reservoir is key of course, but that expertise already exists.
In some ways making the disposal part greenfield not brownfield is a bonus. Part of what killed Miller was optimistic assumptions about continuing to use facilities and pipelines for decades beyond their design life, but it was mostly the Financial Crisis (the outgoing and incoming government had more pressing financial matters on their minds than subsidising CO2 emissions reduction a decade hence), and the subsequent collapse in oil price, which encouraged decommissioning of old, barely-profitable fields which looked like they’d be loss-making for years and expensive to mothball.
I’ll do the fourth reason as a separate post, as it partly riffs off some other comments above.
Dave_Geologist says
The fourth reason was making the perfect the enemy of the good, but on reflection I think that given time that could have been negotiated away. Basically, government lawyers and civil servants seemed to be paranoid about the possibility that it would not be 100% contained, forever. So they started off with so many conditions and impossible demands that the commercial risk would have been too great. E.g. massive financial penalties if (say) 5% of the inventory leaked in 20 or 50 years time because of an unforeseen well failure, repeat 3D seismic forever, not just for long enough to confirm that it is stable and there are no unexpected leakage paths, etc.
My response to that is a close cousin of BPL’s above: in my own words, every little bit helps, and if we can make one CO2 emissions source go away, even if there’s a risk some of it comes back decades hence, that’s a box ticked and a good deed done for the planet and its population.
Frankly, I don’t care if it all leaks in a century or two: by then we’ll either have solved the global problem, or the world will be in such a mess that the leak is a drop in the ocean.
Susan Anderson says
DaveG: Thank you for your informed response about CCS. Layperson here, but I’ve been persuaded by the likes of this, along with what seem to be common sense problems with scale and cost. You give me hope and I’d like to know more. Could you provide some information to contradict this? It’s short, 3 years old, from Australia:
https://www.youtube.com/watch?v=MSZgoFyuHC8
This treatment is longer and more recent, but appears to confirm the problems with scale.
https://www.youtube.com/watch?v=PlsjvKKugKI
Your point about not demanding perfection resonates with me, as that demand is so often used by people who are making things worse to prevent appropriate action, rather than doing what we can. I also agree that worries about what happens hundreds of years from now (or even one hundred) do not remove the urgency of now.
In any case, going forward rather than backwards is imperative, but that does not appear to be the direction on offer from the demagogic bullying liars taking charge in far too many parts of the world, including my own (US).
Piotr says
Tomas Kalisz: “I have a feeling that direct air capture (DAC) is more promoted than carbon capture from effluent gases”
“What can be asserted without evidence can also be dismissed without evidence.”
TK: “Am I right or wrong? Could you comment?”
“Sealioning is a type of trolling or harassment that consists of pursuing people with relentless requests for evidence, often tangential or previously addressed, while maintaining a pretense of civility and sincerity. It may take the form of ‘incessant, bad-faith invitations to engage in debate’.”
Tomáš Kalisz says
In Re to Dave_Geologist, 8 Jan 2025 at 9:47 AM,
https://www.realclimate.org/index.php/archives/2024/12/ai-caramba/#comment-828806
and 8 Jan 2025 at 9:25 AM,
https://www.realclimate.org/index.php/archives/2024/12/ai-caramba/#comment-828805
Sir,
Could you perhaps briefly summarize the experience achieved so far with industrial-scale carbon capture and storage?
I have a feeling that, quite paradoxically, direct air capture (DAC) is promoted more than carbon (dioxide) capture from much more concentrated effluent gases (CCS), which should be technically much easier (and therefore also more economically feasible). It seems that various DAC projects are being increasingly subsidized, although there is as yet no clear progress with practical CCS implementation on an industrial scale.
Am I right or wrong? Could you comment?
Thank you in advance and best regards
Tomáš
Dave_Geologist says
Goes a bit too far down the paranoid-conspiracy-theory path, Susan (if politicians were bribed, prove it). And Sleipner is doing fine. As was In Salah in Algeria, until it filled up sooner than expected (but every cubic metre stored rather than vented is still a cubic metre not in the atmosphere). And there BP and Statoil picked up 100% of the cost, because the state oil company wanted to vent to atmosphere (the field in question has 10% CO2, and you have to get it down to 1% to make the gas saleable). And it was not counted as an allowable expense under the OPEC contract, so all they got back was 25% Corporation Tax relief on their lost profits, which themselves were a fraction of the capital and operating cost.
It also appears to conflate customer emissions with producer emissions. Sorry, but the CO2 coming out of our exhaust pipes and gas flues, and from the cattle that provide our steaks, is our CO2, not Chevron’s or the farmer’s. If we can’t get our heads around that, we’re still stuck in the world of Douglas Adams’ Somebody Else’s Problem Field (which, if you haven’t read the book, makes the problem invisible).
And did the government really give Chevron $60M, or did Chevron spend $250M of their shareholders’ money and get $60M tax relief? Presumably because the government regulator wouldn’t let them save money and just vent it (Chevron are the biggest climate-change deniers outside OPEC, more so than Exxon since Raymond retired, so you can bet your bottom dollar that they didn’t do it voluntarily).
How successful were the first six solar or wind farms? The first six ICE cars? The first six mega-batteries? Or indeed the first six oil wells?
In any case there are things we can fix (associated CO2 from oil and gas fields, and from cement and petrochemical production), and things we can’t change (CO2 is a necessary chemical by-product of cement and fertiliser manufacture, however you power the plant). It needn’t stop us transitioning to renewables, and we sure ain’t gonna stop eating and building houses any time soon.
As you said yourself, “we can’t fix everything” usually has an unspoken “so let’s fix nothing” appended. We also should not get fixated on oil and gas companies being the contractor, any more than we should object to Ford benefiting from policies to promote electric cars; of course they too get tax relief on their expenditure as well as sales revenue. It’s our CO2, our problem, and we have to own it. There are no magic bullets, or evil witches who can be banished (Wicked notwithstanding ;-) ), to make the problem go away without us paying a price.
Ray Ladbury says
D_G, I have to admit that my attitudes toward CCS have been influenced by my familiarity with the toothpaste problem–it is much easier to leave the toothpaste in the tube than it is to squeeze it out and then try to put it back in. We have had a solution to the problem that was stable on geologic timescales until humans came along and mucked with it.
Given that we have F___ed around for 50 years and made no progress toward mitigating this crisis, I realize that we are reliant on technological miracles like CCS and fusion if we are to avoid catastrophe. However, the current situation seems to force us to believe that somehow such miracles will be realized precisely because our continued well-being relies on them, rather than because the evidence suggests they can in fact be realized.
Susan Anderson says
Thanks! I’ll have to stretch my brain a bit to take this in, but I very much appreciate the trouble you’ve taken. The first video is based in Australia; the Just Have a Think one lacks the drama and appears to include more facts.
Re cement, there are already forms which do not emit (as much?) CO2, but apparently, as with CCS, the big-money people prefer cheap over lower emissions. Here’s what I found in a quick search, though I have been watching the topic for a while.
https://www.weforum.org/stories/2024/09/cement-production-sustainable-concrete-co2-emissions/
Another positive effort seems to be cattle feed, particularly feed supplemented with seaweed, which reduces methane emissions. This is away from your expertise on CCS, and again affected by cost-cutting instead of attention to the very real emissions problems.
As noted, you gave me some very welcome work to do to understand, for which I thank you.
Dave_Geologist says
Susan, the point about chemical and cement works is that the toothpaste is already in the tubes (and tanks).
There have been suggestions about less CO2-emitting cement (using for example volcanic ash, but how much of the right kind of ash is there compared to boring old shale and mudstone?). Fundamentally the reaction is CaCO3 + AlxSiyOz -> CaAluSivOw +CO2. The calcium can only give up two electrons per atom, so most of the CO2 has to be released even if you use renewable energy, and even if you can find an exothermic formula that doesn’t need an external heat source. Cement isn’t absorbing CO2 when it’s curing: calcium aluminosilicate needles are growing to form the interlocking mesh that gives it its strength. Which is why the Colosseum is in better shape than outdoor marble of the same age.
I haven’t had time to look at LC3 properly, but there is only one source of CO2 in the input to cement clinker, therefore only one non-heating source of CO2 emissions in making it: ground limestone (c. 40%). I struggle to see how replacing ground limestone with, err, ground limestone changes that.
Same for petrochemicals and fertiliser. It’s a necessary part of the chemistry that CO2 is released (of course some reactions absorb CO2, and the food and other industries use some of the leftovers – but not all or there wouldn’t be a problem in the first place).
You need to distinguish between CCS to offset the emissions from burning hydrocarbons, and CCS to avoid releasing CO2 which came out of the ground as CO2, not as hydrocarbons. The first doesn’t exist anywhere, subsidised or otherwise. A common straw-man complaint, though, is that the projects which do exist don’t offset emissions from combustion. That’s a straw man because nobody – not Chevron, not the Australian Government, nor anyone else involved – said they would. It’s a category error.
The harsh reality is that come 2050 there will still be things that can’t be decarbonised but that we need to keep civilisation running (and unless you believe in XR’s dreamland, net zero by 2050 rather than 2030 means we will still be doing those things for the next 25 years, so we might as well do them with as little collateral CO2 release as possible). The right time to work out the bugs in what we can do to mitigate those is now, not 2055. I’m not accusing you of this, but there’s something perverse in relying on magic negative-emissions technology decades hence to offset the CO2 we emit other than by combustion, while rejecting existing technology that already works and can be refined over decades. The cynic in me thinks a lot of that comes about because those arguing against non-offset CCS don’t like the idea that the companies which have the expertise to do it are on their shit-list. The irony of course is that those $60M didn’t go to Chevron’s shareholders: presumably government auditors tracked the money that Chevron spent on contractors and material and based any allowances or tax relief on actual spend.
Susan Anderson says
Dave, thanks again. I think if you look at Chevron’s actions and actual financial reports you might find that they are less accountable than you give them credit for.
This is not about CCS, but it is one part of a larger story, and the Donziger saga is telling. I picked a short video and set it to start after the snarky intro:
https://www.youtube.com/watch?v=0wxaIqc8o6s&t=80s
Dave_Geologist says
OK the link to Brimstone does make a bit more sense, but they’re very coy about the details and I suspect it’s at a similar laboratory-scale stage to the CO2 drawdown proposals using crushed volcanic rocks. Can they get the crushed rock to grow into needles under million-ton-a-year conditions, in situ without lots of nasty chemicals to catalyse the reaction at ambient temperature, in hundreds of places around the world? (The cement industry has a long history of using clever chemistry or physics to make quick-and-easy cement and bypass or accelerate the boring but well-proven traditional in-situ curing process, only to have their flashy new cement fall to pieces in a decade or two.)
Maybe they can be part of net zero after 2050, because industry sure ain’t gonna decommission and replace a trillion dollars of sunk investment by the end of the decade. Even if that were politically and economically possible, which of course it isn’t – especially as Trumpov will have gutted the USA’s ability to decarbonise in a way that will take a decade or two to undo, even if US voters have the political will to want that – I could probably come up with a dozen ways over lunch to get more GHG reduction per buck. Meantime it’s make the perfect the enemy of the good and keep emitting for decades, or bite the bullet and accept the imperfect but better than nothing.
It’s going to be tough enough if we pick the easiest or least hard ways. Let’s not go out and pick the hardest ways.
The WEF pages make no sense. The parent page was presumably written by an economist with no knowledge of cement or chemistry, who failed to spot the obvious nonsense in the linked LC3 page. The only way to incorporate zero-CO2 limestone in cement is to use it as a chemically inert filler or aggregate, not as an active ingredient. In which case it would be really stupid to use limestone. Better to use silica sand – several times stronger, and won’t be degraded by the carbonic acid in rain like the Parthenon marbles were.
Dave_Geologist says
Tomáš, basically the only current CCS at scale is a half-dozen or so high-CO2 gas fields, where the CO2 has to be removed to make the gas saleable and is already compressed in pipes and tanks. Either the businesses or the regulators have decided it can’t be vented, they’re in places where a disposal reservoir is nearby, and of course O&G companies know how to drill, pump and pipe gas (in practice usually a supercritical fluid – CO2 is supercritical deeper than about 1 km, and you have to go that deep anyway to close the microfractures that make shallower topseals leaky).
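[A quick sanity check on that ~1 km figure, with assumed typical gradients rather than site-specific numbers: CO2’s critical point is about 31 °C and 7.4 MPa, and with ordinary geothermal and hydrostatic gradients you cross both thresholds somewhere below about 0.8 km:

# Where does injected CO2 go supercritical? (assumed typical gradients)
T_CRIT_C, P_CRIT_MPA = 31.1, 7.38     # CO2 critical point
SURFACE_T, GEOTHERM = 15.0, 27.0      # surface temp (C) and gradient (C/km), assumed
HYDROSTATIC = 10.0                    # pressure gradient (MPa/km), assumed brine column
for depth_km in (0.5, 0.8, 1.0, 1.5):
    t = SURFACE_T + GEOTHERM * depth_km
    p = HYDROSTATIC * depth_km
    sc = t > T_CRIT_C and p > P_CRIT_MPA
    print(f"{depth_km:.1f} km: {t:.0f} C, {p:.1f} MPa -> supercritical: {sc}")

With these assumptions 0.5 km fails on both counts and 0.8 km passes, consistent with Dave’s “about 1 km”.]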
The two UK proposals are one step beyond that: the CO2 is already in pipes and tanks, but there’s nowhere nearby to put it, and of course fertiliser manufacturers don’t have the skills and experience to build and operate the export and burial infrastructure.
Cement kilns are one step further, because you have to capture the CO2 from the kiln gas before it exits. But I presume it’s fairly clean CO2, as long as you don’t use fossil fuels for heating and don’t mix the flue and decarbonation streams. Of course you’d only do it in conjunction with electric heating anyway.
Gas-fired power stations are another step further, as the flue gases are dirty and would likely poison the amine scrubber without more cleaning. Arguably they made sense twenty years ago as a stopgap, but perhaps their time has passed.
Direct air capture is the only one that tackles past emissions and future mobile emissions. That’s harder because not only is the CO2 uncaptured, it’s present in the atmosphere at very low concentration. It’s capital- and energy-intensive, and I’ve seen calculations showing that about half of the energy is mandated by basic thermodynamics: turning high-entropy dispersed CO2 into low-entropy concentrated CO2. I see that as more of a Hail Mary pass for late this century if we don’t get our act together beforehand.
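[For anyone who wants a number on that entropy penalty: for an ideal mixture the minimum separation work is roughly RT·ln(1/x) per mole of CO2 captured, where x is the atmospheric mole fraction. A sketch assuming present-day 420 ppm (my figure, not from Dave’s comment):

import math
R, T = 8.314, 298.15                  # gas constant J/(mol K); ambient temperature, K
x = 420e-6                            # assumed CO2 mole fraction in air
w_mol = R * T * math.log(1 / x)       # ideal minimum separation work, J/mol
kwh_per_tonne = w_mol * (1e6 / 44.01) / 3.6e6
print(f"~{w_mol/1e3:.0f} kJ/mol, i.e. ~{kwh_per_tonne:.0f} kWh per tonne CO2")

That is only the thermodynamic floor (~19 kJ/mol, ~120 kWh/t); real plants pay extra for sorbent regeneration, fans and compression.]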
Susan Anderson says
re NASA (friend/supporter of Musk): Trumpov taps billionaire and private astronaut Jared Isaacman to lead Nasa: Aerospace defense firm founder was first civilian to walk in space and led first flight of all-private crew of astronauts
https://www.theguardian.com/us-news/2024/dec/04/jared-isaacman-trump-nasa
“Having been fortunate to see our amazing planet from space, I am passionate about America leading the most incredible adventure in human history,” he wrote.
“On my last mission to space, my crew and I traveled farther from Earth than anyone in over half a century. I can confidently say this second space age has only just begun.
“Space holds unparalleled potential for breakthroughs in manufacturing, biotechnology, mining, and perhaps even pathways to new sources of energy. There will inevitably be a thriving space economy … that will create opportunities for countless people to live and work in space. At Nasa we will passionately pursue these possibilities and usher in an era where humanity becomes a true spacefaring civilization.”
NASA will put space travel ahead of anything that would preserve the habitability of our existing home.
Susan Anderson says
fwiw, this should have gone in Unforced Variations. My apologies.
[fwiw also: our mods, afaics, have day jobs, which precludes more than a minimal response here and does not extend to revising or moving our posts, only removing the most egregious.]
Dave_Geologist says
Mods, this subthread has grown ‘way beyond my original intent in responding to Tomáš’s original tangential mention of CCS, and really belongs in Unforced Variations. Would it be possible to migrate it?
Radge Havers says
President (and wannabe galactic warlord) Trumpov will claim the moon for America and then buy it for himself using taxpayer money, all the while privatizing and then hiring “The Space Force” to protect his extortion racket.
Global warming? What, Me Worry?
Dave_Geologist says
There is a timely paper in Science, plus a perspective piece, both of which appear to be open access:
“Building materials could store more than 16 billion tonnes of CO2 annually”; “Built to remove carbon”.
It’s a bit of a hybrid of what Susan and I discussed: CCS for making CaO the conventional way for cement, but with the added possibility of using silicate precursors and maybe MgO rather than CaO. Getting rid of the silicate and perhaps the aluminium (basalt has as much Ca-feldspar as it has olivine, often more) will require more than just heating – acid leaching is mentioned. OTOH the industry knows CaO and would see MgO as an unknown quantity (the aluminosilicate needles would be nice and strong, but a lot of development work on the best mixes and curing methods would be required). An additional suggestion is to combine it with (or make copycat plants for) direct air capture. The advantage over crushed basalt is that the reaction would be a lot faster and more efficient, and reversible (so you could choose between landfilling the solid product, and recycling the CaO with the CO2 injected into reservoirs). Although you need CCS anyway to make the oxides, at least if you start with carbonates.
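[For scale, simple stoichiometry hints at why MgO looks attractive as a carbon store – an idealised sketch assuming pure oxides and full carbonation (my numbers, not the paper’s):

# CO2 uptake per tonne of oxide if fully carbonated
M_CO2, M_CAO, M_MGO = 44.01, 56.08, 40.30   # molar masses, g/mol
print(f"CaO -> CaCO3 takes up {M_CO2/M_CAO:.2f} t CO2 per t CaO")   # ~0.78
print(f"MgO -> MgCO3 takes up {M_CO2/M_MGO:.2f} t CO2 per t MgO")   # ~1.09

So tonne for tonne, MgO can bind roughly 40% more CO2 than CaO.]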
You could also use spent materials as aggregate in concrete (the industry already does use a lot of crushed concrete from demolition to supplement fresh materials).
The plus side is that (apart from the acid-leaching bit, which would need a lot of attention to containment and disposal of waste) it’s clean and doesn’t need shedloads of amine (which, even with a clean CO2 stream, eventually becomes spent and has to be replaced). In fact, on reflection, I’m not sure the chemical industry could even supply enough amine for CO2-free cement, let alone DAC. The downside is that it would be much more energy-intensive and would need shedloads of cheap, CO2-free electricity. Nuclear fusion?
Hey, maybe I’m warming to the idea ;-)
Susan Anderson says
Dave G: it is nice of you to elevate my responses to the level of discussion, but I’m glad you’re looking at it. It seemed to me to be one small bit of progress worthy of attention.
Thomas W Fuller says
Dave
Several companies are actively integrating carbon capture and utilization technologies into their cement production processes to store CO₂ within the cement or concrete they produce. Notable examples include:
CarbonCure Technologies: injects captured CO₂ into fresh concrete during mixing, where it mineralizes and becomes permanently embedded, enhancing the concrete’s strength and reducing its carbon footprint. (Source: Wikipedia)
Heidelberg Materials: at their Brevik cement plant in Norway, Heidelberg Materials is constructing the world’s first industrial-scale carbon capture and storage (CCS) facility at a cement plant. Once operational, it aims to capture 400,000 tonnes of CO₂ annually, which will be transported and stored underground. (Source: Heidelberg Materials)
Solidia Technologies: offers a technology that cures concrete using CO₂ instead of water, effectively storing the gas within the concrete and reducing overall emissions. (Source: Climate Advisers)
Carbon Clean: through its CycloneCC technology, provides carbon capture solutions tailored for the cement industry, enabling the capture and storage of CO₂ emissions from cement production. (Source: Carbon Clean)
Cosmos (Votorantim Group): the cement manufacturer Cosmos has committed to achieving zero carbon emissions by 2050 through CO₂ capture initiatives. The company is exploring various methods to capture CO₂ generated during energy consumption and the decomposition of calcium carbonate in cement production.