Computer-based climate models are important tools in climate science that allow us to understand complex planetary systems and predict how they might change in the future.
Climate modelling runs on powerful computers capable of turning huge amounts of raw data (on temperature, greenhouse gases, deforestation, coral bleaching, and so on) into calculated predictions about future global warming and its effects on the ecosphere.
Scientific institutions such as the NOAA Geophysical Fluid Dynamics Laboratory (Princeton), the National Center for Atmospheric Research (Colorado), the NASA Goddard Institute for Space Studies (New York), the Institut Pierre Simon Laplace (Paris, France) and the Met Office Hadley Centre (Exeter, UK) rely on climate models to understand the climate system and to test hypotheses about climate action and the rate and severity of climate change. Model types vary in complexity from one-dimensional “energy balance models”, to 3-D “general circulation models”, to the even more complex “atmosphere-ocean general circulation models”, known as AOGCMs.
Over the past five decades, climate modelling has become an essential component in the scientific advice given by the Intergovernmental Panel on Climate Change (IPCC) to the United Nations Framework Convention on Climate Change (UNFCCC). Witness, for example, the IPCC’s Special Report on Global Warming of 1.5°C (SR15), whose “pathways” to achieving the various temperature targets were based entirely on complex computerized climate models.
Climate Models Need Satellite Data
A global climate model is essentially a large software program capable of analyzing vast amounts of quantitative data about Planet Earth and its atmosphere. 1 Quantitative data means numerical measurements of things like temperature, humidity, size of aerosols in the air, CO2 levels, the distance a glacier has retreated, extent of Arctic sea ice, and so on.
Software models vary enormously in scope and complexity – from those that cover one single region of the world, to those that simulate (e.g.) the entire atmosphere or land area of the planet. The biggest models are run by massive supercomputers, like the three Cray XC40 supercomputers used by Britain’s Met Office Hadley Centre, which together can perform 14,000 trillion calculations a second. 2 3
But no matter how powerful, they all depend upon accurate research data. Fortunately, with the development of climate satellites, oceanographic exploration vehicles, and techniques like carbon dating, scientists are now able to produce large quantities of high-quality data on everything from the ozone layer in the stratosphere and the rate of deforestation in the Amazon, to the weather patterns of the Stone Age. 4
How Do Climate Models Work?
In simple terms, climate models employ mathematical equations to represent the components that drive Earth’s climate: including its atmosphere, oceans, land masses and ice-covered regions. Specific data might include numerical values for surface temperature, land use, glacial fluctuations, albedo or reflectivity, shifts in air or ocean currents, or atmospheric chemistry. 2
The models are programmed to follow the scientific laws governing the physics of the climate system, like the first law of thermodynamics (which states that energy cannot be created or destroyed, only changed from one form to another) or the Stefan-Boltzmann Law (which determines how much radiant heat is emitted from a surface), as well as equations such as the Clausius-Clapeyron equation (relating air temperature to water vapor pressure) and the Navier-Stokes equations (describing the motion of fluids such as atmospheric gases and sea water). All this information is turned into pages of computer code, usually written in Fortran or another programming language. 2
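To make the physics concrete, here is a toy illustration in Python (not code from any real model, which would be written in Fortran and far more complete): the Stefan-Boltzmann law applied to Earth's overall energy balance.

```python
# Illustrative only: the Stefan-Boltzmann law applied to Earth's energy
# balance, a toy version of the physics a climate model encodes.
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
SOLAR_CONSTANT = 1361.0  # incoming solar irradiance, W m^-2
ALBEDO = 0.3             # fraction of sunlight reflected back to space

# Absorbed solar energy, averaged over the whole sphere (hence the /4)
absorbed = SOLAR_CONSTANT * (1 - ALBEDO) / 4

# Stefan-Boltzmann: emitted power = sigma * T^4, so the temperature at
# which Earth radiates exactly what it absorbs is:
T_effective = (absorbed / SIGMA) ** 0.25
print(f"Effective emission temperature: {T_effective:.1f} K")  # ~255 K
```

The answer (roughly 255 K, or -18°C) is Earth's temperature *without* a greenhouse effect; the ~33°C gap between this and the observed average of ~288 K is what the model's greenhouse gas equations must account for.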
What Is Spatial Resolution?
Models divide the Earth into a grid-like series of boxes. In a basic model the boxes may represent 230 square miles (600 square kilometres) of land, air, or sea. More complex climate models use a three-dimensional grid, with each box representing around 25 cubic miles (100 cubic kilometres) or less. The size of each box is known as its “spatial resolution” – the smaller the box, the higher the resolution and the more specific climate information it will yield.
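The arithmetic behind spatial resolution is simple to sketch (illustrative figures only): the smaller the box, the more boxes, and the more calculations per time step.

```python
# Rough arithmetic behind "spatial resolution": how many grid boxes are
# needed to tile Earth's surface at a given box size.
EARTH_SURFACE_KM2 = 510_000_000  # ~510 million square kilometres

for box_km2 in (600, 100, 25):
    n_boxes = EARTH_SURFACE_KM2 / box_km2
    print(f"{box_km2:>4} km^2 boxes -> {n_boxes:,.0f} surface cells")
```

Halving the box side quadruples the number of surface cells, which is why doubling a model's resolution demands far more than double the computing power.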
What Is A Time Step?
Time is typically programmed into climate models using the concept of a “time step” – how often the model recalculates the climate – typically 30 minutes when analyzing the atmosphere. As with the size of the physical “boxes”, the smaller the time step, the more detailed the climate information the program will yield. But as usual, the more detailed the information, the larger the number of calculations and thus the computer power needed.
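The idea of a time step can be sketched as a simple loop (hypothetical function and variable names; a real model advances dozens of coupled components at each step):

```python
# Sketch of a model's time-stepping loop (hypothetical names only).
from datetime import datetime, timedelta

def step_atmosphere(state, dt):
    """Placeholder for one recalculation of the climate state."""
    state["steps_taken"] += 1
    return state

TIME_STEP = timedelta(minutes=30)   # typical atmospheric time step
start = datetime(2025, 1, 1)
end = datetime(2025, 1, 2)          # simulate one day

state = {"steps_taken": 0}
t = start
while t < end:
    state = step_atmosphere(state, TIME_STEP)
    t += TIME_STEP

print(state["steps_taken"])         # 48 half-hour steps per day
```

Even one simulated day at a 30-minute step means 48 full recalculations of the climate state; a century-long run means over 1.7 million.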
In a TED talk in 2014, Dr Gavin Schmidt of NASA’s Goddard Institute for Space Studies described how a climate model can produce a detailed simulation covering everything from the evaporation of moisture from the ocean’s surface and the formation of clouds, to the precise area where the wind carries them and where the rain finally falls. It can also show the detailed process involved in the thawing of Arctic permafrost and the consequent release of methane into the atmosphere, which adds to global warming as part of a feedback loop.
In a nutshell, scientists feed millions of pieces of climate data, together with thousands of equations replicating the fundamental physical laws of Earth’s climate, into a computer model. The model then produces a range of predictions (also called projections or “pathways”) concerning, among many other things: the flow of solar energy to Earth and of thermal infrared energy away from it; ocean currents; the discharge of carbon from the land into the atmosphere; the level of fossil fuel pollutants and the rise or decline of the greenhouse effect; and ultimately the rate of global warming.
What Are The Basic Types Of Climate Models?
- The most basic climate models are Energy Balance Models (EBMs). EBMs do not simulate the climate in detail; instead they forecast climate changes from Earth’s energy budget: that is, the difference between the solar energy entering Earth’s atmosphere and the heat energy radiated back out to space. Surface temperature is the only climate variable they calculate.
- More complicated are Radiative Convective Models (RCMs) (1-D or 2-D), which calculate the climate with modest spatial resolution and time-related detail, so they are best employed to examine large-scale and/or low-frequency changes in Earth’s climate system. Radiative Convective Models can determine the temperature and humidity of the different layers of the atmosphere, so they are well suited to calculating the transfer of energy (via convection, as warm air rises) from the troposphere up through the higher layers of the atmosphere.
- Much more complicated are General Circulation Models (GCMs), which replicate the climate system in far greater detail. Up until recently, these were the most sophisticated models for climate change forecasts. GCMs can deal with data on anthropogenic greenhouse gases, fossil fuel emissions, heat transfer, land type, ocean acidity and glacial makeup of the “box” in question.
- The most complex climate models are atmosphere-ocean general circulation models (AOGCMs) – in effect, networks of coupled GCM components working together. The latest of these also cover biogeochemical cycles, such as the carbon cycle – the transfer of chemical compounds between living things and their environment – and how these cycles affect the climate system. Known as Earth System Models (ESMs), they can replicate the carbon and nitrogen cycles, variations in atmospheric conditions, marine ecosystems and land use, all of which determine how the climate responds to emissions of carbon dioxide.
- For climatic research into regions, scientists use Regional Climate Models (RCMs) which are much like GCMs, only they deal with a smaller area of land or ocean, which means they run faster and at a higher resolution than GCMs.
- One of the most interesting models is the Integrated Assessment Model (IAM). This is a simple climate program to which data on social issues – population, economic growth, energy consumption, land management – has been added in order to calculate the impact of humans on global warming and vice versa. 2
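The simplest model type in the list above, the Energy Balance Model, can be sketched in a few lines of Python (illustrative constants only, not code from any real model; the emissivity value is a crude stand-in for the greenhouse effect):

```python
# A toy zero-dimensional Energy Balance Model (EBM): it tracks a single
# global-average surface temperature and nothing else.
SIGMA = 5.670374419e-8     # Stefan-Boltzmann constant, W m^-2 K^-4
SOLAR = 1361.0             # solar irradiance, W m^-2
ALBEDO = 0.3               # reflected fraction of sunlight
EMISSIVITY = 0.61          # effective emissivity; <1 due to greenhouse gases
HEAT_CAPACITY = 4.0e8      # J m^-2 K^-1, rough value for a mixed ocean layer
DT = 86400.0               # one-day time step, in seconds

T = 230.0                  # arbitrary cold start, in kelvin
for _ in range(200 * 365): # integrate for ~200 simulated years
    absorbed = SOLAR * (1 - ALBEDO) / 4         # energy in
    emitted = EMISSIVITY * SIGMA * T ** 4       # energy out
    T += DT * (absorbed - emitted) / HEAT_CAPACITY
print(f"Equilibrium surface temperature: {T:.1f} K")  # ~288 K
```

Whatever the starting temperature, the loop settles at roughly 288 K (about 15°C) – close to the observed global average – because the model keeps adjusting the temperature until energy in balances energy out. That balancing act, in vastly more detail, is what the bigger models do too.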
What Can Climate Models Tell Us?
The more complex climate models are capable of producing an extremely detailed picture of the Earth’s climate, based upon thousands of different variables, including: temperatures, humidity and chemistry of the different layers of the atmosphere, along with its air currents and cloud cover; temperatures, acidity (pH) and salinity of the oceans; amounts of sunshine, snowfall, rainfall, as well as the size of glaciers and ice sheets.
Climate models can also estimate “climate sensitivity”: that is, they calculate the extent to which climate feedbacks – like water vapour, thawing permafrost and changes in Earth’s “albedo” (energy reflectivity) – amplify the global warming caused by greenhouse gas emissions.
Climate models face huge problems trying to make accurate predictions about climate (or weather), largely due to the sheer complexity of the Earth, its atmosphere and its oceans. For example, some of the main limitations faced by climate modellers include: how to represent the contradictory effects of clouds and aerosols, how to assess the future take-up of carbon dioxide by the ocean surface, and how best to represent the climatic effects of deforestation, as well as afforestation.
What Is CMIP?
As more and more institutions around the world create and run their own climate models, there is a growing danger that each group of scientists will adopt a slightly different methodology, reducing the comparability of their results and thus the usefulness of their advice to policymakers.
This is why the Coupled Model Intercomparison Project (CMIP) was developed. CMIP was initiated in 1995 by the Working Group on Coupled Modelling (WGCM) of the World Climate Research Programme (WCRP), as a collaborative framework to standardize the climate model experiments being run by different modelling centres. It is similar in concept to the earlier Atmospheric Model Intercomparison Project (AMIP), but covers coupled ocean-atmosphere general circulation models rather than atmosphere-only ones. 5
Dr Chris Jones of Britain’s Met Office explains: “The idea of an ‘intercomparison’ came from the fact that many years ago different modelling groups would have different models, but they would also set them up slightly differently, and they would run different numerical experiments with them. When you come to compare the results, you’re never quite sure if the differences are because the models are different or because they were set up in a different way.” By standardizing the experimental set-up, CMIP ensures that differences in results can be traced to differences in the models themselves. 2
How Accurate Are Climate Models in their Projections Of Global Warming?
With climate change fast becoming a climate crisis, climatologists and policymakers depend upon computer models to make projections about global temperatures in order to formulate a suitable response. Accuracy is therefore paramount.
One of the earliest predictions of global warming was made in 1972 in an article published in Nature by John Sawyer of the UK Met Office. He projected that, by the year 2000, the world would warm by 0.6°C. His forecast was exceptionally close: the actual warming over that period was between 0.51°C and 0.56°C.
The scientific body responsible for advising the United Nations Framework Convention on Climate Change (UNFCCC) is the Intergovernmental Panel on Climate Change (IPCC). Every six years or so, it compiles an Assessment Report which constitutes the most authoritative up-to-date statement on climate warming of the moment. The award-winning environmental publisher CarbonBrief tested the accuracy of the warming projections of all five of the IPCC’s Assessment Reports, with the following results:
- First AR (1990): overestimated warming by 17 percent.
- Second AR (1995): underestimated warming by 28 percent.
- Third AR (2001): underestimated warming by 14 percent.
- Fourth AR (2007): overestimated warming by 8 percent.
- Fifth AR (2014): overestimated warming by 16 percent, but only 9 percent when blended land/sea data are used. 6 7
These results demonstrate that climate models are reasonably accurate and getting more so, which is a remarkable achievement. 8 Especially since greenhouse gas emissions (the main driver of global warming) depend on socio-economic development. After all, how do you compute the effect of an extra 3-4 billion people on global temperatures by the end of the century, given the differing effects of fossil fuels and renewables? 9 See also: Greenhouse Gas Statistics Lack Consistency.
Over the past century or so, human negligence has triggered an avalanche of environmental problems that now threaten the stability of life on our planet. Let’s hope that climate modelling can help to persuade slothful decision makers around the world to take action before it’s too late.
- “Climate Models.” NOAA.
- “How do climate models work?” CarbonBrief. 15 January 2018.
- “The Cray XC40 supercomputing system.” Met Office.
- “Comparing Tropospheric Warming in Climate Models and Satellite Data.” Benjamin D. Santer et al. American Meteorological Society.
- “Coupled model intercomparison project.” Wikipedia.
- “Analysis: How well have climate models projected global warming?” CarbonBrief. 5 October 2017.
- “How reliable are climate models?” SkepticalScience.com.
- “Study Confirms Climate Models are Getting Future Warming Projections Right.” Alan Buis. NASA Jet Propulsion Laboratory.
- “Evaluating the Performance of Past Climate Model Projections.” Zeke Hausfather, Henri F. Drake, Tristan Abbott, Gavin A. Schmidt. Geophysical Research Letters. 4 December 2019.