DOE’s maverick climate model is about to get its first test
The world’s growing collection of climate models has a high-profile new entry. Last week, after nearly 4 years of work, the U.S. Department of Energy (DOE) released computer code and initial results from an ambitious effort to simulate the Earth system. The new model is tailored to run on future supercomputers and designed to forecast not just how climate will change, but also how those changes might stress energy infrastructure.
Results from an upcoming comparison of global models may show how well the new entrant works. But so far it is getting a mixed reception, with some questioning the need for another model and others saying the $80 million effort has yet to improve predictions of the future climate. Even the project’s chief scientist, Ruby Leung of the Pacific Northwest National Laboratory (PNNL) in Richland, Washington, acknowledges that the model is not yet a leader. “We really don’t expect that our model will be wowing the world,” she says.
Since the 1960s, climate modelers have used computers to build virtual globes. They break the atmosphere and ocean into thousands of boxes and assign weather conditions to each one. The toy worlds then evolve through simulated centuries, following the laws of physics. Historically, DOE’s major role in climate modeling was contributing to the Community Earth System Model (CESM), an effort based at the National Center for Atmospheric Research (NCAR) in Boulder, Colorado. But in July 2014, DOE launched its Accelerated Climate Modeling for Energy project. The goal was to predict how storms and rising seas could affect power plants, dams, and other energy infrastructure, and to focus on regions such as North America or the Arctic. DOE officials also wanted a model that could run on a generation of megapowerful “exascale” computers expected to turn on around 2021.
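The grid-box approach described above can be illustrated with a toy sketch (this is illustrative code, not E3SM or CESM code): divide the world into boxes, give each a state, and step the physics forward in time. Here a one-dimensional ring of boxes simply exchanges heat with its neighbors.

```python
# Toy illustration of grid-box climate modeling (not actual E3SM code):
# each box holds a temperature, and each time step it relaxes toward
# its neighbors -- a crude stand-in for the physics real models solve.

def step(temps, k=0.1):
    """Advance one time step: each box exchanges heat with its neighbors."""
    n = len(temps)
    return [temps[i] + k * (temps[(i - 1) % n] + temps[(i + 1) % n] - 2 * temps[i])
            for i in range(n)]

boxes = [0.0] * 8
boxes[0] = 10.0            # start with a single warm box
for _ in range(100):       # evolve the toy world forward
    boxes = step(boxes)

# Heat spreads out over time, and the scheme conserves the total.
print(round(sum(boxes), 6))           # total heat is unchanged: 10.0
print(max(boxes) - min(boxes) < 1.0)  # True: nearly uniform after 100 steps
```

Real models do this in three dimensions with far richer physics, which is why resolution (the size of each box) and parallelism matter so much for the exascale machines DOE is targeting.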
The project pulled in researchers from eight DOE national labs. It began as a carbon copy of the CESM and retains similar atmosphere and land models, but includes new ocean, sea-ice, river, and soil biochemistry simulations. The DOE team doubled the number of vertical layers, extended the atmosphere higher, and adopted a number-crunching method that is computationally intensive but may be easier to break into chunks and run in parallel on the anticipated exascale machines. “For them, it makes a lot of sense to go in that direction,” says Richard Neale, a climate scientist at NCAR.
In 2017, after President Donald Trump took office and pulled the nation out of the Paris climate accord, DOE dropped “climate” from the project name. The new name, the Energy Exascale Earth System Model (E3SM), better reflects the model’s focus on the entire Earth system, says project leader David Bader of Lawrence Livermore National Laboratory in California.
The E3SM’s first results highlight its potential; they include model runs with ultrasharp, 25-kilometer-wide grid cells—fine enough to simulate small-scale features such as ocean eddies and mountain snow packs. But this sharp picture is still too coarse to resolve individual clouds and atmospheric convection, major factors limiting models’ precision. And some scientists doubt it will improve forecasts. The last intercomparison effort, which ended in 2014, included 26 modeling groups—nine more than the previous round—yet yielded collective predictions that were no more precise. “Just having more models—I don’t think there’s any evidence that that’s key to advancing the field,” says Bjorn Stevens, a climate scientist at the Max Planck Institute for Meteorology in Hamburg, Germany, and co-leader of the new intercomparison, code-named CMIP6.
Gavin Schmidt, who heads NASA’s Goddard Institute for Space Studies in New York City, which also produces a global climate model, questions the new model’s rationale, given that DOE’s exascale computers do not yet exist. “No one knows what these machines will even look like, so it’s hard to build models for them ahead of time,” he wrote on Twitter. And the computational intensity of the E3SM has drawbacks, says Hansi Singh, a PNNL climate scientist who uses the CESM for her research. The sheer number of calculations needed to get a result with the E3SM would overwhelm most university clusters, limiting outside scientists’ ability to use it.
One preliminary result, on the climate’s sensitivity to carbon dioxide (CO2), will “raise some eyebrows,” Bader says. Most models estimate that, for a doubling of CO2 above preindustrial levels, average global temperatures will rise between 1.5°C and 4.5°C. The E3SM predicts a strikingly high rise of 5.2°C, which Leung suspects is due to the way the model handles aerosols and clouds. And like many models, the E3SM produces two bands of rainfall in the tropics, rather than the one seen in nature near the equator.
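The sensitivity figures above follow the standard logarithmic relationship between CO2 concentration and equilibrium warming; a short sketch (illustrative only, not how E3SM computes sensitivity) shows what the numbers mean in practice:

```python
import math

def warming(co2_ppm, sensitivity, preindustrial_ppm=280.0):
    """Equilibrium warming for a given CO2 level.

    Uses the standard logarithmic forcing approximation:
    delta_T = S * log2(C / C0), where S is the warming per doubling.
    """
    return sensitivity * math.log2(co2_ppm / preindustrial_ppm)

# A doubling (560 ppm vs. a 280 ppm preindustrial baseline) yields the
# sensitivity itself, by definition:
print(warming(560, 5.2))   # 5.2 -- the E3SM's strikingly high value
print(warming(560, 3.0))   # 3.0 -- midpoint of the canonical 1.5-4.5 range
```

Because the response is logarithmic, each successive doubling adds the same increment of warming, which is why sensitivity per doubling is the field's standard yardstick.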
The first test of the E3SM will be its performance in CMIP6. Nearly three dozen modeling groups, including newcomers from South Korea, India, Brazil, and South Africa, are expected to submit results to the intercomparison between now and 2020. Each group will devote thousands of computer-hours to standard scenarios, such as simulating the impact of a 1% per year CO2 increase and an abrupt quadrupling of the gas.
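The "1% per year" scenario compounds like interest, so a little arithmetic (a sketch, not part of the CMIP protocol itself) shows how long such a run takes to double CO2:

```python
# A 1% per year CO2 increase compounds: after n years the concentration
# is C0 * 1.01**n. Count the years until it doubles.
years = 0
c = 1.0            # concentration relative to the starting level
while c < 2.0:
    c *= 1.01
    years += 1
print(years)       # 70 -- the familiar "rule of 70" for 1% growth
```

That is why these standardized runs span many simulated decades, and why each group must budget thousands of computer-hours for them.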
But given the plodding rate of improvement since previous intercomparisons, few are expecting the E3SM or any other model to yield revolutionary insights. Stevens hopes to revise the exercise to encourage innovations, such as modeling the climate at the 1-kilometer resolution needed to make out individual clouds, or campaigns to gather new kinds of data. “The whole premise of CMIP is trying to get everyone to do the same thing,” he says. “Everyone knows that breakthroughs come from getting people to do different things.”