The time variability of many natural and social phenomena is not well described by standard methods of data analysis. However, nonlinear time series analysis uses chaos theory and nonlinear dynamics to understand seemingly unpredictable behavior. The results are applied to real data from physics, biology, and other fields.
Nonlinear Time Series Analysis
By Holger Kantz and Thomas Schreiber
- Publisher
- Cambridge University Press
- Year
- 2003
- Language
- English
- Pages
- 387
- Edition
- 2
- Category
- Library
Synopsis
The paradigm of deterministic chaos has influenced thinking in many fields of science. Chaotic systems show rich and surprising mathematical structures. In the applied sciences, deterministic chaos provides a striking explanation for irregular behaviour and anomalies in systems which do not seem to be inherently stochastic. The most direct link between chaos theory and the real world is the analysis of time series from real systems in terms of nonlinear dynamics. Experimental technique and data analysis have seen such dramatic progress that, by now, most fundamental properties of nonlinear dynamical systems have been observed in the laboratory. Great efforts are being made to exploit ideas from chaos theory wherever the data displays more structure than can be captured by traditional methods. Problems of this kind are typical in biology and physiology but also in geophysics, economics, and many other sciences.
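The central tool the synopsis alludes to, reconstructing a system's dynamics from a single measured time series, is the delay embedding developed in Chapters 3 and 9. A minimal sketch in Python (the function name and parameters here are illustrative, not taken from the book or from TISEAN):

```python
import numpy as np

def delay_embed(x, dim, lag):
    """Reconstruct a phase-space trajectory from a scalar series x
    by stacking time-delayed copies: (x[t], x[t+lag], ..., x[t+(dim-1)*lag])."""
    n = len(x) - (dim - 1) * lag  # number of complete delay vectors
    return np.column_stack([x[i * lag : i * lag + n] for i in range(dim)])

# Example: embed a sine wave in 3 dimensions with a lag of 5 samples.
t = np.arange(500)
x = np.sin(0.1 * t)
traj = delay_embed(x, dim=3, lag=5)
print(traj.shape)  # (490, 3)
```

Choosing the embedding dimension and time lag is exactly the subject of Sections 3.3 and 9.2 (false neighbours, mutual information); the hard-coded values above are placeholders.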
Greatly updated edition of a book that sold more than 4000 copies and had terrific reviews
Unique in its use of real-world examples
Broad scope of applications across the sciences and social sciences
Table of Contents
Preface to the first edition
Preface to the second edition
Acknowledgements
Part I Basic topics
Chapter 1 Introduction: why nonlinear methods?
Further reading
Chapter 2 Linear tools and general considerations
2.1 Stationarity and sampling
2.2 Testing for stationarity
2.3 Linear correlations and the power spectrum
2.3.1 Stationarity and the low-frequency component in the power spectrum
2.4 Linear filters
2.5 Linear predictions
Further reading
Exercises
Chapter 3 Phase space methods
3.1 Determinism: uniqueness in phase space
3.2 Delay reconstruction
3.3 Finding a good embedding
3.3.1 False neighbours
3.3.2 The time lag
3.4 Visual inspection of data
3.5 Poincaré surface of section
3.6 Recurrence plots
Further reading
Exercises
Chapter 4 Determinism and predictability
4.1 Sources of predictability
4.2 Simple nonlinear prediction algorithm
4.3 Verification of successful prediction
4.4 Cross-prediction errors: probing stationarity
4.5 Simple nonlinear noise reduction
Further reading
Exercises
Chapter 5 Instability: Lyapunov exponents
5.1 Sensitive dependence on initial conditions
5.2 Exponential divergence
5.3 Measuring the maximal exponent from data
Further reading
Exercises
Chapter 6 Self-similarity: dimensions
6.1 Attractor geometry and fractals
6.2 Correlation dimension
6.3 Correlation sum from a time series
6.4 Interpretation and pitfalls
6.5 Temporal correlations, non-stationarity, and space-time separation plots
6.6 Practical considerations
6.7 A useful application: determination of the noise level using the correlation integral
6.8 Multi-scale or self-similar signals
6.8.1 Scaling laws
6.8.2 Detrended fluctuation analysis
Further reading
Exercises
Chapter 7 Using nonlinear methods when determinism is weak
7.1 Testing for nonlinearity with surrogate data
7.1.1 The null hypothesis
7.1.2 How to make surrogate data sets
7.1.3 Which statistics to use
7.1.4 What can go wrong
7.1.5 What we have learned
7.2 Nonlinear statistics for system discrimination
7.3 Extracting qualitative information from a time series
Further reading
Exercises
Chapter 8 Selected nonlinear phenomena
8.1 Robustness and limit cycles
8.2 Coexistence of attractors
8.3 Transients
8.4 Intermittency
8.5 Structural stability
8.6 Bifurcations
8.7 Quasi-periodicity
Further reading
Part II Advanced topics
Chapter 9 Advanced embedding methods
9.1 Embedding theorems
9.1.1 Whitney's embedding theorem
9.1.2 Takens's delay embedding theorem
9.2 The time lag
9.3 Filtered delay embeddings
9.3.1 Derivative coordinates
9.3.2 Principal component analysis
9.4 Fluctuating time intervals
9.5 Multichannel measurements
9.5.1 Equivalent variables at different positions
9.5.2 Variables with different physical meanings
9.5.3 Distributed systems
9.6 Embedding of interspike intervals
9.7 High dimensional chaos and the limitations of the time delay embedding
9.8 Embedding for systems with time delayed feedback
Further reading
Exercises
Chapter 10 Chaotic data and noise
10.1 Measurement noise and dynamical noise
10.2 Effects of noise
10.3 Nonlinear noise reduction
10.3.1 Noise reduction by gradient descent
10.3.2 Local projective noise reduction
10.3.3 Implementation of locally projective noise reduction
10.3.4 How much noise is taken out?
10.3.5 Consistency tests
10.4 An application: foetal ECG extraction
Further reading
Exercises
Chapter 11 More about invariant quantities
11.1 Ergodicity and strange attractors
11.2 Lyapunov exponents II
11.2.1 The spectrum of Lyapunov exponents and invariant manifolds
11.2.2 Flows versus maps
11.2.3 Tangent space method
11.2.4 Spurious exponents
11.2.5 Almost two-dimensional flows
11.3 Dimensions II
11.3.1 Generalised dimensions, multi-fractals
11.3.2 Information dimension from a time series
11.4 Entropies
11.4.1 Chaos and the flow of information
11.4.2 Entropies of a static distribution
11.4.3 The Kolmogorov-Sinai entropy
11.4.4 The ε-entropy per unit time
11.4.5 Entropies from time series data
11.5 How things are related
11.5.1 Pesin's identity
11.5.2 Kaplan-Yorke conjecture
Further reading
Exercises
Chapter 12 Modelling and forecasting
12.1 Linear stochastic models and filters
12.1.1 Linear filters
12.1.2 Nonlinear filters
12.2 Deterministic dynamics
12.3 Local methods in phase space
12.3.1 Almost model free methods
12.3.2 Local linear fits
12.4 Global nonlinear models
12.4.1 Polynomials
12.4.2 Radial basis functions
12.4.3 Neural networks
12.4.4 What to do in practice
12.5 Improved cost functions
12.5.1 Overfitting and model costs
12.5.2 The errors-in-variables problem
12.5.3 Modelling versus prediction
12.6 Model verification
12.7 Nonlinear stochastic processes from data
12.7.1 Fokker-Planck equations from data
12.7.2 Markov chains in embedding space
12.7.3 No embedding theorem for Markov chains
12.7.4 Predictions for Markov chain data
12.7.5 Modelling Markov chain data
12.7.6 Choosing embedding parameters for Markov chains
12.7.7 Application: prediction of surface wind velocities
12.8 Predicting prediction errors
12.8.1 Predictability map
12.8.2 Individual error prediction
12.9 Multi-step predictions versus iterated one-step predictions
Further reading
Exercises
Chapter 13 Non-stationary signals
13.1 Detecting non-stationarity
13.1.1 Making non-stationary data stationary
13.2 Over-embedding
13.2.1 Deterministic systems with parameter drift
13.2.2 Markov chain with parameter drift
13.2.3 Data analysis in over-embedding spaces
13.2.4 Application: noise reduction for human voice
13.3 Parameter spaces from data
Exercises
Chapter 14 Coupling and synchronisation of nonlinear systems
14.1 Measures for interdependence
14.2 Transfer entropy
14.3 Synchronisation
Further reading
Exercises
Chapter 15 Chaos control
15.1 Unstable periodic orbits and their invariant manifolds
15.1.1 Locating periodic orbits
15.1.2 Stable/unstable manifolds from data
15.2 OGY-control and derivatives
15.3 Variants of OGY-control
15.4 Delayed feedback
15.5 Tracking
15.6 Related aspects
Further reading
Exercises
Appendix A Using the TISEAN programs
A.1 Information relevant to most of the routines
A.1.1 Efficient neighbour searching
A.1.2 Re-occurring command options
A.1.2.1 The help option
A.1.2.2 Input data
A.1.2.3 Embedding space
A.1.2.4 Defining neighbourhoods
A.1.2.5 Output data
A.2 Second-order statistics and linear models
A.3 Phase space tools
A.4 Prediction and modelling
A.4.1 Locally constant predictor
A.4.2 Locally linear prediction
A.4.3 Global nonlinear models
A.5 Lyapunov exponents
A.6 Dimensions and entropies
A.6.1 The correlation sum
A.6.2 Information dimension, fixed mass algorithm
A.6.3 Entropies
A.7 Surrogate data and test statistics
A.8 Noise reduction
A.9 Finding unstable periodic orbits
A.10 Multivariate data
Appendix B Description of the experimental data sets
B.1 Lorenz-like chaos in an NH3 laser
B.2 Chaos in a periodically modulated NMR laser
B.3 Vibrating string
B.4 Taylor-Couette flow
B.5 Multichannel physiological data
B.6 Heart rate during atrial fibrillation
B.7 Human electrocardiogram (ECG)
B.8 Phonation data
B.9 Postural control data
B.10 Autonomous CO2 laser with feedback
B.11 Nonlinear electric resonance circuit
B.12 Frequency doubling solid state laser
B.13 Surface wind velocities
References
Index
SIMILAR VOLUMES
A comprehensive resource that draws a balance between theory and applications of nonlinear time series analysis, Nonlinear Time Series Analysis offers an important guide to both parametric and nonparametric methods, nonlinear state-space models, and Bayesian as well as classical approaches to nonlinear time series analysis.
Nonlinear Time Series Analysis with R provides a practical guide to emerging empirical techniques, allowing practitioners to diagnose whether highly fluctuating and random-appearing data are most likely driven by random or deterministic dynamic forces. It joins the chorus of voices recommending …