A rigorous guide to predicting with confidence when the true function f(x) is unknown.
This book develops a minimax approach to building linear predictors that minimize the worst-case mean square error under Lipschitz-type constraints. It shows how to choose weights and coefficients so predictions stay stable even when the true function is uncertain.
The discussion starts from simple linear models and moves up to higher-order conditions, revealing how regularity assumptions shape the best predictor. It connects theory to practice with concrete methods for implementing weighted least squares and intuitive guidance for interpolating and smoothing time-series data.
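The weighted least squares fit mentioned above can be sketched in a few lines. This is a minimal illustration, not the book's own implementation; the function name and the geometric down-weighting of older observations are our assumptions:

```python
import numpy as np

def weighted_least_squares(X, y, w):
    """Solve min_beta sum_i w_i * (y_i - x_i^T beta)^2 via the normal equations."""
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

# Fit a line y = a + b*t, down-weighting older observations geometrically.
t = np.arange(10.0)
y = 2.0 + 0.5 * t                       # noiseless line: exact fit expected
X = np.column_stack([np.ones_like(t), t])
w = 0.9 ** (t[::-1])                    # illustrative weights favoring recent points
beta = weighted_least_squares(X, y, w)  # recovers [2.0, 0.5]
```

Because the data here lie exactly on a line, any positive weighting recovers the same coefficients; the weights matter only once noise or model misspecification enters.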
- Understand how to frame prediction as minimizing the maximum possible error across a class of functions.
- Learn how to construct linear predictors from observed data and compute their worst-case risk.
- See why and when weighting observations toward more recent data can be optimal in a minimax sense.
- Gain insight into extending these ideas to higher-order Lipschitz conditions and time-series smoothing.
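As a concrete taste of minimax prediction under a Lipschitz constraint, here is a sketch of the classical optimal-recovery midpoint rule for noiseless observations of an L-Lipschitz function. The function name and the worked example are ours, assumed for illustration rather than taken from the book:

```python
import numpy as np

def minimax_lipschitz_predict(x_obs, y_obs, x0, L):
    """Minimax (central) prediction of f(x0) when f is L-Lipschitz and
    observed exactly at x_obs: predict the midpoint of the feasible interval."""
    upper = np.min(y_obs + L * np.abs(x0 - x_obs))  # tightest upper envelope
    lower = np.max(y_obs - L * np.abs(x0 - x_obs))  # tightest lower envelope
    pred = 0.5 * (upper + lower)
    worst_case_err = 0.5 * (upper - lower)          # guaranteed error bound
    return pred, worst_case_err

# f(x) = |x| is 1-Lipschitz; observe it at -1 and 1, predict at 0.
x = np.array([-1.0, 1.0])
y = np.abs(x)
pred, err = minimax_lipschitz_predict(x, y, 0.0, 1.0)  # pred = 1.0, err = 1.0
```

The midpoint halves the width of the interval of values consistent with the data and the Lipschitz bound, which is exactly the worst-case-optimal choice: both f(x) = |x| and f(x) = 2 - |x| match the observations, and the prediction 1.0 is off by at most 1 for either.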
Ideal for readers who want a solid, mathematically grounded method for robust prediction in the presence of model uncertainty, including those focused on time-series analysis and statistical forecasting.