EDM and Predictions


An earlier post on Empirical Dynamics and Information Flow, https://clarodatascience.com/2018/06/17/dynamical-systems-information-flow-and-causality/, showed a technique to reconstruct attractors in phase space from measurements of a single coordinate of trajectories orbiting the attractor. Briefly, we construct lagged vectors from one coordinate's time series, identify the nearest neighbours of a candidate vector, find their corresponding projections onto a second coordinate, and use those projections (weighted appropriately) to predict the projection of the candidate vector onto the second coordinate. If this prediction converges as the time series length increases, we infer that the coordinates are coupled in a dynamical system.
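
To make that concrete, here is a minimal sketch of the cross-mapping step in Python. The coupled logistic maps, function names, and parameter choices are my own illustrative assumptions, not taken from the earlier post:

```python
import numpy as np

def lagged_vectors(x, E, tau=1):
    """Build E-dimensional time-delay vectors from a scalar series x."""
    n = len(x) - (E - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(E)])

def cross_map_skill(x, y, E=3, tau=1, k=None):
    """Reconstruct the attractor from x, then predict y from x's nearest
    neighbours; return the correlation between predicted and observed y."""
    X = lagged_vectors(x, E, tau)
    y_aligned = y[(E - 1) * tau:]          # y value at the time of each lagged vector
    k = k or E + 1                         # simplex convention: E + 1 neighbours
    preds = np.empty(len(X))
    for i, v in enumerate(X):
        d = np.linalg.norm(X - v, axis=1)
        d[i] = np.inf                      # exclude the candidate itself
        nn = np.argsort(d)[:k]
        w = np.exp(-d[nn] / max(d[nn][0], 1e-12))   # exponential distance weights
        preds[i] = np.dot(w, y_aligned[nn]) / w.sum()
    return np.corrcoef(preds, y_aligned)[0, 1]

# Toy system: y is an independent logistic map that drives x, so the
# attractor reconstructed from x carries information about y.
x, y = np.empty(2000), np.empty(2000)
x[0], y[0] = 0.4, 0.2
for t in range(1999):
    y[t + 1] = 3.7 * y[t] * (1.0 - y[t])
    x[t + 1] = x[t] * (3.8 - 3.8 * x[t] - 0.1 * y[t])
print(cross_map_skill(x, y))   # should be well above chance, improving with series length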

Naively, one might imagine using this to predict the future evolution of the trajectory: take the nearest-neighbour lagged vectors, look at their “future” evolution, and use those to predict the future of the candidate. But convergence of such predictive calculations does not seem so easy to prove. Indeed, equally naively, one might imagine that, since one only looks at nearest-neighbour trajectories over finite time windows (roughly speaking, proportional to the embedding dimension of the reconstructed attractor) and uses convergence as the criterion for dynamical coupling, the spread of predictions of the future trajectory would diverge.
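
As a sketch of that naive forward forecast (building on the lagged_vectors helper and the toy series x from the block above; the split into a “library” of past states is my own illustrative choice):

```python
def analogue_forecast_skill(x, E=3, tau=1, Tp=1, k=None, library_frac=0.75):
    """Forecast x(t + Tp) by averaging where the nearest-neighbour lagged
    vectors from an earlier 'library' segment went Tp steps later."""
    X = lagged_vectors(x, E, tau)
    t_now = np.arange(len(X)) + (E - 1) * tau      # time index of each vector's last entry
    split = int(library_frac * len(X))
    library = X[:split]
    k = k or E + 1
    preds, truth = [], []
    for j in range(split, len(X)):
        t = t_now[j]
        if t + Tp >= len(x):
            break
        d = np.linalg.norm(library - X[j], axis=1)
        nn = np.argsort(d)[:k]
        w = np.exp(-d[nn] / max(d[nn][0], 1e-12))
        futures = x[t_now[nn] + Tp]                # where those neighbours went Tp steps later
        preds.append(np.dot(w, futures) / w.sum())
        truth.append(x[t + Tp])
    return np.corrcoef(preds, truth)[0, 1]

# Forecast skill typically decays as the horizon Tp grows.
for Tp in (1, 5, 10):
    print(Tp, analogue_forecast_skill(x, Tp=Tp))
```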

But, notwithstanding such caveats, it turns out that it is possible to make skillful predictions of the future development of trajectories as well. Last fall, in the Lorenz Lecture at the AGU meeting, Prof. Krishnamurthy described predictions of the South Asian Monsoon using EDM. This is a system that is notoriously difficult to analyse, yet Krishnamurthy now clearly outperforms climate forecast models and finds predictability out to sixty days.

The key seems to be finding robust “slow” modes, if they exist, with timescales of at least the prediction horizon. And to be looking at the right thing. The right thing in this case is not a time series of pressure or temperature, but rather the waxing and waning of the principal components of variation. If some of these are robust, one can build on them.
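
A toy illustration of pulling out such modes: single-channel singular spectrum analysis (SSA) on a synthetic series. (The paper uses multichannel SSA on observed fields; the window length, series, and function below are just my illustrative assumptions.)

```python
import numpy as np

def ssa_modes(x, window, n_modes=4):
    """Single-channel SSA: embed x in a lagged (Hankel) trajectory matrix,
    take its SVD, and return the leading reconstructed-component series."""
    N = len(x)
    K = N - window + 1
    traj = np.column_stack([x[i:i + window] for i in range(K)])   # window x K
    U, s, Vt = np.linalg.svd(traj, full_matrices=False)
    comps = []
    for m in range(n_modes):
        elem = s[m] * np.outer(U[:, m], Vt[m])     # rank-1 elementary matrix
        # diagonal averaging (Hankelization) back to a length-N time series
        rc = np.array([np.diag(elem[:, ::-1], off).mean()
                       for off in range(K - 1, -window, -1)])
        comps.append(rc)
    return np.array(comps)

# Toy series: a slow 120-step oscillation, a faster 11-step one, and noise.
t = np.arange(1000)
series = (np.sin(2 * np.pi * t / 120) + 0.5 * np.sin(2 * np.pi * t / 11)
          + 0.3 * np.random.default_rng(1).normal(size=1000))
modes = ssa_modes(series, window=200)
# The leading pair of modes should isolate the slow 120-step oscillation.
```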

So he adds a wrinkle to the “traditional” EDM: he first uses a multichannel singular spectrum analysis (MSSA) to find eigenmodes, then uses these mode component time series as a basis for EDM. The first two sentences below describe the “traditional” EDM, and I will let him explain the rest:

“The development of the prediction model consists of reconstructing the low-dimensional phase space and finding an appropriate mapping to predict the future states in the phase space. The embedding in time-delay coordinates based on correlation dimension is commonly used for reconstructing the phase space [Packard et al., 1980; Takens, 1981; Abarbanel et al., 1993]. However, for a proper description of the dynamical system, an orthonormal basis is obtained by using singular spectrum analysis (SSA) or MSSA of lagged time series [Elsner and Tsonis, 1992; Sharma et al., 1993; Ghil et al., 2002]. Since SSA and MSSA provide the leading features of the dynamics [Ghil et al., 2002], a lower dimensional eigenspace spanned by suitable SSA/MSSA modes offers a better choice for the reconstruction of the phase space. This method is even more suitable since our aim is to predict MISO obtained from MSSA. The trajectory of a dynamical system visits all regions of the phase space of the attractor and comes arbitrarily close to any previously visited point arbitrarily often [Lorenz, 1963]. If two points in the phase space are very close, it is expected that their trajectories will remain close for a while before diverging. The local dynamics determines how long the trajectories remain close to each other. The time evolution of the neighboring trajectories from the past data can then be used to predict the evolution of the current state to the future time steps, similar in concept to the analogue method used by Lorenz [1969]. This requires identification of the trajectories passing through a sphere of specified radius in the reconstructed phase space, with the current state at the center, and then modeling its dynamical evolution using the trajectories of the nearest neighbors [Sharma, 1995]. We then find the states of the nearest neighbors at the next time step and obtain their mean value as the predicted next state [Ukhorskiy et al., 2004].”
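
Putting the quoted recipe into a rough sketch (hypothetical names and thresholds, reusing the SSA modes from the earlier toy, whereas the paper works with MSSA modes of observed data): treat a few leading mode time series as phase-space coordinates, collect past states inside a small ball around the current state, and take the mean of where those neighbours were Tp steps later.

```python
def predict_in_eigenspace(modes, Tp=10, radius=None, k_min=3):
    """Analogue forecast in an eigenmode phase space. `modes` is an
    (n_modes, N) array of reconstructed-component time series; the state
    at time t is the column modes[:, t]."""
    states = modes.T                         # N x n_modes phase-space points
    current = states[-1]
    past = states[:-Tp]                      # only neighbours whose future is already known
    d = np.linalg.norm(past - current, axis=1)
    if radius is None:
        radius = np.percentile(d, 2)         # a small ball around the current state
    nn = np.where(d <= radius)[0]
    if len(nn) < k_min:
        nn = np.argsort(d)[:k_min]           # fall back to the k nearest neighbours
    # Mean of the neighbours' states Tp steps later = predicted state Tp steps ahead.
    return states[nn + Tp].mean(axis=0)

# e.g. using the two leading SSA modes from the previous sketch:
pred_state = predict_in_eigenspace(modes[:2], Tp=10)
```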

“Predictability at intraseasonal time scale,” V. Krishnamurthy and A. S. Sharma, Geophysical Research Letters, 2017, doi:10.1002/2017GL074984

Fig 2 from the paper is shown below. The top panel shows the Krishnamurthy results and the bottom panel shows the corresponding results from a climate forecast system.

I show Fig 4a below. This is the correlation between a reconstructed component from the Krishnamurthy (PSRM) method and observations, alongside the corresponding prediction from a climate forecast model. We see that the PSRM correlation remains above 0.6 up to 80 days out, while the results for the forecast model are quite abysmal.

The video of the Lorenz lecture is also very good, but you first have to register (free) at https://meetings.agu.org/meeting/2018-fall-meeting/

The MSSA method is in “Advanced Spectral Methods for Climatic Time Series,” Ghil et al., Reviews of Geophysics, 40(1), March 2002, doi:10.1029/2000RG000092

sidd

