
Event Diary


CRiSM Seminar - Daniel Williamson (Exeter) & David van Dyk (Imperial)

Location: A1.01

David van Dyk (Imperial)
Statistical Learning Challenges in Astronomy and Solar Physics
In recent years, technological advances have dramatically increased the quality and quantity of data available to astronomers. Newly launched or soon-to-be-launched space-based telescopes are tailored to the data-collection challenges associated with specific scientific goals. These instruments provide massive new surveys resulting in new catalogs containing terabytes of data, high-resolution spectroscopy and imaging across the electromagnetic spectrum, and incredibly detailed movies of dynamic and explosive processes in the solar atmosphere. This spectrum of new instruments is helping scientists make impressive strides in our understanding of the physical universe, but it is at the same time generating massive data-analytic and data-mining challenges for the scientists who study the resulting data. In this talk I will introduce and discuss the statistical learning challenges inherent in data streams that are both massive and complex.

Daniel Williamson (Exeter)
Earth system models and probabilistic Bayesian calibration: a screw meets a hammer?
The design and analysis of computer experiments, now often called “Uncertainty Quantification” or “UQ”, has been an active area of statistical research for 25 years. One of its most high-profile methodologies, calibrating a complex computer code using the Bayesian solution to the inverse problem as described in Kennedy and O’Hagan’s seminal 2001 paper, has become something of a default approach to tackling applications in UQ and has over 1200 citations. However, is this always wise? Though the method is well tested and arguably appropriate for many types of model, particularly those for which large amounts of data are readily available and for which the limitations of the underlying mathematical expressions and solvers are well understood, many models, such as those found in climate simulation, go far beyond those successfully studied in terms of non-linearity, run time, output size and complexity of the underlying mathematics. Have we really solved the calibration problem? To what extent is our “off-the-shelf” approach appropriate for the problems faced in fields such as Earth system modelling? In this talk we will discuss some of the known limitations of the Bayesian calibration framework (and some that are perhaps less well known), and explore the extent to which the conditions under which calibration is known to fail are met in climate model problems. We will then present and argue for an alternative approach to the problem and apply it to an ocean GCM known as NEMO.
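
For context, the Bayesian calibration framework referred to above is usually written in roughly the following form (a minimal sketch of the Kennedy and O'Hagan (2001) statistical model; the notation here is illustrative, not taken from the talk):

\[
  z_i \;=\; \eta(x_i, \theta) \;+\; \delta(x_i) \;+\; \epsilon_i,
  \qquad \epsilon_i \sim \mathrm{N}(0, \sigma^2),
\]

where \(z_i\) are field observations at inputs \(x_i\), \(\eta\) is the computer model run at the unknown "best" calibration parameters \(\theta\), \(\delta(\cdot)\) is a model-discrepancy term (typically given a Gaussian-process prior), and \(\epsilon_i\) is observation error. Calibration then amounts to inferring \(\theta\) jointly with \(\delta\) from the field data; the talk examines when this formulation is, and is not, a good fit for large, non-linear simulators such as climate models.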
