
University of Minnesota is part of $25M AI-based climate modeling center

NSF-funded center will leverage big data and machine learning to improve climate projections

University of Minnesota Twin Cities researchers are part of Learning the Earth with Artificial Intelligence and Physics (LEAP), a new $25 million climate modeling center funded by the U.S. National Science Foundation (NSF). The center is one of six new science and technology centers announced by NSF and aims to bring greater precision to climate modeling and encourage societies to prepare for the inevitable disruptions ahead.

In collaboration with the National Center for Atmospheric Research (NCAR) and NASA’s Goddard Institute for Space Studies (GISS), LEAP will develop the next generation of data-driven, physics-based climate models. It will also train a new wave of students fluent in both climate science and data science, equipped to work with big datasets and modern machine-learning algorithms. The center’s larger goal is to provide actionable information that helps societies adapt to climate change and protect the most vulnerable.

The center is led by a team of researchers from Columbia University. In addition to the University of Minnesota, other partner universities include New York University; University of California, Irvine; and Teachers College, Columbia University.

The University of Minnesota researchers will lead the development of a new generation of machine-learning algorithms that leverage scientific knowledge to identify relationships between different components of the global climate system, even in the presence of limited data.

“These knowledge-guided machine learning techniques are fundamentally more powerful than standard black-box machine learning approaches that often fail to generalize to unseen scenarios,” said University of Minnesota Regents Professor Vipin Kumar, the William Norris Land Grant Chair in Large-Scale Computing in the Department of Computer Science & Engineering.

Global climate models agree that the planet will continue to warm over the next 40 years. But they disagree on how much, and on how severe the impacts will be, from sea-level rise to an increase in floods and droughts. Much of the problem comes down to representing the details of complex physical and biological processes — like clouds reflecting sunlight back into space or trees absorbing carbon from the air — in the models. These processes interact, and many are poorly understood.

With the help of big data and machine learning, researchers will dig deeper into these processes and update the models with new knowledge to improve climate projections. They will harness existing algorithms to analyze satellite images and other large-scale observational data missing from today’s models. They will also develop new algorithms to take detailed observations and generalize them to broader contexts, discover cause-and-effect relationships in the data, and find better equations to describe the processes represented in the models. As new knowledge gets woven into the models, researchers will use machine-learning tools to test their predictions.

“Until climate models can offer more precise projections at the regional level, where planning decisions are made, it will be difficult to make the billion-dollar investments needed to adapt,” said Columbia University President Lee C. Bollinger.

To read more about NSF’s new science and technology centers, visit the NSF news website.
