# Numerical Analysis and Scientific Computing Seminar

## Data assimilation by the Ensemble Kalman filter and other particle filters — why are 50 ensemble members enough?

**Speaker:**
Matti Morzfeld, University of Arizona, Tucson

**Location:**
Warren Weaver Hall 1302

**Date:**
Nov. 11, 2016, 10 a.m.

**Synopsis:**

Suppose you have a mathematical model for the weather and you want to use it to make a forecast. If the model calls for rain but you wake up to sunshine, then you should recalibrate your model to this observation before you make a prediction for the next day. This is an example of Bayesian inference, also called “data assimilation” in geophysics. The ensemble Kalman filter (EnKF) is used routinely in numerical weather prediction (NWP) to solve such data assimilation problems. Particle filters (PF) are sequential Monte Carlo methods and are in principle applicable to this problem as well. However, PF are never used in operational NWP, because a linear analysis shows that the computational requirements of PF scale exponentially with the dimension of the problem.

In this talk I will show that a similar analysis applies to EnKF, i.e., “in theory” EnKF’s computational requirements also scale poorly with dimension. Thus, there is a dramatic contradiction between theory and practice. I will explain that this contradiction arises from different assessments of errors (local vs. global). I will also offer an explanation of why EnKF can work reliably with a very small ensemble (typically 50), and what these findings imply for the applicability of PF to high-dimensional estimation problems.
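To make the EnKF concrete, here is a minimal sketch of one analysis (update) step of the stochastic, perturbed-observations variant, assuming a linear observation operator `H` and Gaussian observation noise with covariance `R`. This is a generic textbook formulation for illustration, not the specific scheme discussed in the talk; all variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(ensemble, y, H, R):
    """Stochastic EnKF analysis step (perturbed observations).

    ensemble : (n, N) array, N state vectors of dimension n
    y        : (m,) observation vector
    H        : (m, n) linear observation operator (assumed here)
    R        : (m, m) observation-error covariance
    """
    n, N = ensemble.shape
    m = y.size

    # Ensemble mean and anomalies (deviations from the mean)
    xbar = ensemble.mean(axis=1, keepdims=True)
    A = ensemble - xbar

    # Sample cross-covariance P H^T and innovation covariance H P H^T + R
    HA = H @ A
    P_HT = A @ HA.T / (N - 1)
    S = HA @ HA.T / (N - 1) + R

    # Kalman gain built from the sample covariances
    K = P_HT @ np.linalg.inv(S)

    # Perturb the observation for each member (stochastic EnKF)
    Y = y[:, None] + rng.multivariate_normal(np.zeros(m), R, size=N).T

    # Shift each member toward its perturbed observation
    return ensemble + K @ (Y - H @ ensemble)
```

A small usage example: with a 2-dimensional state, 50 members, and one observed component, the updated ensemble mean moves from the prior mean toward the observation, with the step size set by the sample covariance relative to `R`.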