Climate Change Research Centre,
University of New South Wales

Professor Steven Sherwood studies atmospheric humidity, convective systems and dynamics; interactions between clouds, air circulation and climate; remote sensing of storms; and observed warming trends. He holds a bachelor’s degree in physics from the Massachusetts Institute of Technology (1987) and a PhD in oceanography from the Scripps Institution of Oceanography, University of California (1995), and held postdoctoral positions at VUW in New Zealand and at NASA before rising to the rank of Professor at Yale University and moving to UNSW Sydney in 2009. He is a past Director and current Deputy Director of the UNSW Climate Change Research Centre, and a chief investigator in the Australian Research Council (ARC) Centre of Excellence for Climate Extremes. Prof Sherwood has contributed to national and international climate change reports, including as a Lead Author of the chapter on Clouds and Aerosols in the Working Group I contribution to the IPCC 5th Assessment Report. His honours include the NSF CAREER Award, the AMS Meisinger Award, selection as a finalist for a Eureka Prize in Scientific Discovery, and appointment in 2015 as an ARC Laureate Fellow. He is currently co-chair of the World Climate Research Programme’s Safe Landing Climates Lighthouse Activity, which aims to promote scientific activity that will help quantify risks and identify safe pathways for the future climate.

AS Distinguished Lecture | 31 July (Mon) 08:15 AM – 10:00 AM | Level 3 MR331

How Can We Understand Small-Scale Atmospheric Processes Better?

Abstract: Atmospheric cloud and convective processes continue to be represented in most global atmosphere models by parameterisations, which rest on simple concepts, such as an entraining plume, that idealise what happens in a complex fluid flow. It is clear that such parameterisations do not perform as well as we would like. Simply tuning model parameters does not lead to adequate performance, evidently because of structural problems: for example, a plume model may fail to capture what is going on no matter what parameters are used. Very-high-resolution models such as global cloud-resolving models (GCRMs) are an instructive and exciting new approach, but will not solve our problems in the foreseeable future, since many processes such as boundary-layer turbulence and cloud microphysics require resolutions that are unattainable in large domains. GCRMs consequently continue to have biases, and are challenging and expensive to work with. In this talk I will argue that there is likely scope for parameterisations to be significantly improved by recognising and correcting oversimplifications or errors in the physics of their conceptual formulation, particularly the lack of internal “memory” in current convection schemes, which typically assume that convective processes are in some kind of equilibrium with the larger-scale boundary conditions. I will then discuss a variety of research approaches being undertaken to span the huge range of scales involved in the atmosphere. In essence these involve carefully targeted, high-resolution large-eddy simulations (LES) designed to expose structural deficiencies in our existing concepts and parameterisations, rather than brute-force high resolution over the entire domain of ultimate interest. One strategy is to conduct the small-domain simulations under idealised conditions so as to develop a set of reference responses against which to test physical schemes, in ways that relate relatively directly to their assumptions compared with messier real-world situations.
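To make the entraining-plume idealisation concrete, the toy sketch below integrates a rising dry parcel that cools adiabatically while mixing with its environment at a fixed fractional entrainment rate. All profiles, rates, and the dry formulation are invented for illustration and do not correspond to any actual scheme; real parameterisations handle moisture, detrainment, and much more.

```python
# Toy entraining-plume model: a rising parcel cools at the dry-adiabatic
# lapse rate while mixing toward the environmental temperature at a fixed
# fractional entrainment rate.  Purely illustrative; all numbers invented.

GAMMA_D = 0.0098  # dry-adiabatic lapse rate, K/m

def plume_temperature(t_env, eps, dz, t0):
    """Integrate dT_p/dz = -GAMMA_D - eps*(T_p - T_env) upward.

    t_env : environmental temperatures (K) on levels spaced dz metres apart
    eps   : fractional entrainment rate (1/m)
    t0    : plume temperature at the lowest level (K)
    """
    t_p = [t0]
    for t_e in t_env[1:]:
        t_p.append(t_p[-1] + dz * (-GAMMA_D - eps * (t_p[-1] - t_e)))
    return t_p

# Environment: 300 K at the surface, 6.5 K/km lapse rate, 100 m spacing.
dz = 100.0
env = [300.0 - 0.0065 * dz * k for k in range(50)]

weak = plume_temperature(env, eps=1e-4, dz=dz, t0=301.0)
strong = plume_temperature(env, eps=2e-3, dz=dz, t0=301.0)

# Stronger entrainment relaxes the plume toward the environment, so the
# magnitude of its temperature difference aloft is smaller.
print(weak[-1] - env[-1], strong[-1] - env[-1])
```

Note that the plume's state at each level depends only on the current environmental profile: run it twice with the same profile and you get identical answers, which is exactly the equilibrium (memoryless) assumption criticised above.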
A second strategy is to use machine learning to develop surrogates for convective processes. The problem of scaling small-domain LES simulations up to large scales can be framed as a “transfer learning” problem in AI. Machine-learned surrogates can also be used to test structural assumptions by quickly emulating them in a way that can be incorporated into a global model; at UNSW we have developed a version of the NCAR CAM model that makes this easy (although developing the surrogates themselves remains difficult). Although the core strategy is to learn from high-resolution models, these strategies could also make better use of time-series observations from local field sites, and could rapidly connect new scheme assumptions to large-scale data via the GCM. Although not without its challenges, this approach could be the basis of a new era of model-physics evaluation in which structural assumptions, not just scheme parameters, are varied and tested, leading to a deeper and more satisfying understanding and a more pragmatic suite of atmospheric models.
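As a schematic of the surrogate idea, the sketch below fits a cheap emulator to synthetic "LES" training data and exposes it as a function a host model could call each time step. Everything here is invented: the closed-form "truth", the CAPE-like predictor, and the use of a least-squares line as a stand-in for a neural network.

```python
# Schematic machine-learned surrogate for a convective tendency.  The
# "LES truth" is a made-up closed-form relation plus noise, and the
# surrogate is an ordinary-least-squares line standing in for a neural
# network.  All names, shapes, and numbers are illustrative only.
import random

random.seed(0)

def les_tendency(cape):
    """Stand-in for an LES-derived convective heating tendency (K/day)."""
    return 0.5 * cape + random.gauss(0.0, 0.1)

# "Training data": pairs of a large-scale state proxy and the LES response.
xs = [0.1 * i for i in range(100)]
ys = [les_tendency(x) for x in xs]

# Fit y ~ a*x + b by ordinary least squares.
n = len(xs)
mx = sum(xs) / n
my = sum(ys) / n
a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
    (x - mx) ** 2 for x in xs
)
b = my - a * mx

def surrogate(cape):
    """Cheap emulator a host global model could call every time step."""
    return a * cape + b

# The fitted slope should recover the underlying 0.5 despite the noise.
print(a, b)
```

Swapping `les_tendency` for output from a different structural assumption, and `surrogate` for a richer regression, is the kind of quick emulate-and-test loop the talk describes; the hard part in practice is building surrogates that remain stable when coupled to the GCM.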