# Smoothing spline

The smoothing spline is a method of smoothing (fitting a smooth curve to a set of noisy observations) using a spline function.

## Definition

Let $(x_{i},Y_{i});\;x_{1}<x_{2}<\dots <x_{n}$ be a sequence of observations, modeled by the relation $Y_{i}=\mu (x_{i})$ . The smoothing spline estimate ${\hat {\mu }}$ of the function $\mu$ is defined to be the minimizer (over the class of twice differentiable functions) of

$\sum _{i=1}^{n}(Y_{i}-{\hat {\mu }}(x_{i}))^{2}+\lambda \int _{x_{1}}^{x_{n}}{\hat {\mu }}''(x)^{2}\,dx.$

Remarks:

1. $\lambda \geq 0$ is a smoothing parameter, controlling the trade-off between fidelity to the data and roughness of the function estimate.
2. The integral is evaluated over the range of the $x_{i}$ .
3. As $\lambda \to 0$ (no smoothing), the smoothing spline converges to the interpolating spline.
4. As $\lambda \to \infty$ (infinite smoothing), the roughness penalty becomes paramount and the estimate converges to a linear least squares estimate.
5. The roughness penalty based on the second derivative is the most common in modern statistics literature, although the method can easily be adapted to penalties based on other derivatives.
6. In early literature, with equally-spaced $x_{i}$ , second or third-order differences were used in the penalty, rather than derivatives.
7. When the sum-of-squares term is replaced by a log-likelihood, the resulting estimate is termed penalized likelihood. The smoothing spline is the special case of penalized likelihood resulting from a Gaussian likelihood.
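The difference penalty mentioned in remark 6 lends itself to a compact numerical illustration. The sketch below (an illustrative NumPy implementation, not a library routine; the function name is ours) minimizes the penalized sum of squares with a second-order difference penalty in place of the integrated squared second derivative, assuming equally spaced $x_{i}$:

```python
import numpy as np

def difference_penalty_smoother(y, lam):
    """Penalized least squares with a second-order difference penalty,
    the discrete analogue of the curvature penalty for equally spaced x_i.

    Minimizes ||y - m||^2 + lam * ||D2 m||^2, whose minimizer solves
    (I + lam * D2'D2) m = y.
    """
    n = len(y)
    # Second-difference matrix: (D2 @ m)[i] = m[i] - 2 m[i+1] + m[i+2]
    D2 = np.diff(np.eye(n), n=2, axis=0)
    return np.linalg.solve(np.eye(n) + lam * D2.T @ D2, y)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(50)
m = difference_penalty_smoother(y, lam=5.0)
```

Note that second differences of a linear sequence vanish, so a linear fit passes through the smoother unchanged for any $\lambda$, consistent with remark 4.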

## Derivation of the smoothing spline

It is useful to think of fitting a smoothing spline in two steps:

1. First, derive the fitted values ${\hat {\mu }}(x_{i}),\;i=1,\ldots ,n$.
2. From these values, derive ${\hat {\mu }}(x)$ for all $x$.

Now, treat the second step first.

Given the vector ${\hat {m}}=({\hat {\mu }}(x_{1}),\ldots ,{\hat {\mu }}(x_{n}))^{T}$ of fitted values, the sum-of-squares part of the spline criterion is fixed. It remains only to minimize $\int {\hat {\mu }}''(x)^{2}\,dx$ , and the minimizer is a natural cubic spline that interpolates the points $(x_{i},{\hat {\mu }}(x_{i}))$ . This interpolating spline is a linear operator, and can be written in the form

${\hat {\mu }}(x)=\sum _{i=1}^{n}{\hat {\mu }}(x_{i})f_{i}(x),$

where $f_{i}(x)$ are a set of spline basis functions. As a result, the roughness penalty has the form

$\int {\hat {\mu }}''(x)^{2}\,dx={\hat {m}}^{T}A{\hat {m}}.$

Now back to the first step. The penalized sum-of-squares can be written as

$\|Y-{\hat {m}}\|^{2}+\lambda {\hat {m}}^{T}A{\hat {m}},$

and minimizing over ${\hat {m}}$ gives

${\hat {m}}=(I+\lambda A)^{-1}Y.$

## De Boor's approach

De Boor's approach exploits the same idea of balancing smoothness of the curve against closeness to the given data.

## Creating a multidimensional spline

Given the constraint $x_{1}<x_{2}<\dots <x_{n}$ from the definition formula, we can conclude that the algorithm does not work for all sets of data. To apply it to scattered points in a multidimensional space, the data must first be recast as single-valued functions of a common parameter; the smoothing is then performed on each such function separately. In a bidimensional space, a solution is to parametrize $x$ and $y$ so that they become $x(t)$ and $y(t)$ with $t_{1}<t_{2}<\dots <t_{n}$. A convenient choice for $t$ is the cumulative chordal distance $t_{i+1}=t_{i}+{\sqrt {(x_{i+1}-x_{i})^{2}+(y_{i+1}-y_{i})^{2}}}$ with $t_{1}=0$ .
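The cumulative-distance parametrization is straightforward to compute. A minimal NumPy sketch (the function name is ours; consecutive points are assumed distinct so that $t$ is strictly increasing, as the definition requires):

```python
import numpy as np

def chord_length_parameter(x, y):
    """Cumulative chordal distance t with t[0] = 0.

    Assumes consecutive points are distinct, so t is strictly
    increasing and x(t), y(t) can each be smoothed as functions of t.
    """
    return np.concatenate(
        ([0.0], np.cumsum(np.hypot(np.diff(x), np.diff(y))))
    )

# Points along a curve where x alone is not monotone:
x = np.array([0.0, 1.0, 1.5, 1.0, 0.0])
y = np.array([0.0, 0.5, 1.5, 2.5, 3.0])
t = chord_length_parameter(x, y)
```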

A more detailed analysis of parametrization is given by E. T. Y. Lee.

## Related methods

Smoothing splines are related to, but distinct from:

• Regression splines. In this method, the data are fitted to a set of spline basis functions with a reduced set of knots, typically by least squares. No roughness penalty is used.
• Penalized splines. This combines the reduced knots of regression splines with the roughness penalty of smoothing splines.
• Elastic maps method for manifold learning. This method combines the least squares penalty for approximation error with the bending and stretching penalty of the approximating manifold, and uses a coarse discretization of the optimization problem.
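As a rough illustration of the first two items, the sketch below fits a cubic regression spline on a reduced set of knots by least squares, and turns it into a simple penalized spline when the penalty weight is positive. The truncated-power basis and ridge-type penalty on the knot coefficients are one illustrative choice among several, and the function name is ours:

```python
import numpy as np

def penalized_regression_spline(x, y, knots, lam):
    """Cubic regression spline on a reduced set of interior knots.

    Basis: 1, x, x^2, x^3, (x - k)_+^3 for each knot k.
    With lam = 0 this is a plain (unpenalized) regression spline;
    lam > 0 adds a ridge penalty on the knot coefficients only,
    giving a simple penalized spline.
    """
    B = np.column_stack(
        [x**p for p in range(4)]
        + [np.clip(x - k, 0.0, None) ** 3 for k in knots]
    )
    # Penalize only the truncated-power (knot) coefficients.
    P = np.zeros(B.shape[1])
    P[4:] = 1.0
    beta = np.linalg.solve(B.T @ B + lam * np.diag(P), B.T @ y)
    return B @ beta, beta

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 80)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(80)
knots = np.linspace(0.1, 0.9, 9)
fit, beta = penalized_regression_spline(x, y, knots, lam=1e-3)
```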

## Source code

Source code for spline smoothing can be found in the examples from Carl de Boor's book A Practical Guide to Splines. The examples are in the Fortran programming language. The updated sources are also available on Carl de Boor's official site.