← Previous: ODE Solvers | Back to Index | Next: Linear Regression →
Interpolation is the process of estimating values between known data points. The Numerics library provides interpolation methods for this purpose, essential for data analysis, curve fitting, and function approximation.
| Method | Class | Continuity | Error Order | Use Case |
|---|---|---|---|---|
| Linear | Linear | $C^0$ | $O(h^2)$ | Fast, simple, no overshooting |
| Cubic Spline | CubicSpline | $C^2$ | $O(h^4)$ | Smooth curves, physical phenomena |
| Polynomial | Polynomial | $C^\infty$ | Varies | Arbitrary order fitting |
| Bilinear | Bilinear | $C^0$ | $O(h^2)$ | 2D interpolation on grids |
Linear interpolation connects adjacent data points with straight line segments. For a query point $x \in [x_i, x_{i+1}]$:

$$y(x) = y_i + \frac{y_{i+1} - y_i}{x_{i+1} - x_i}\,(x - x_i)$$

This can be rewritten in terms of the normalized coordinate $t = (x - x_i)/(x_{i+1} - x_i)$:

$$y(x) = (1 - t)\,y_i + t\,y_{i+1}$$

Error bound: For a function $f \in C^2$ with $|f''| \leq M$, the interpolation error satisfies

$$|f(x) - y(x)| \leq \frac{M h^2}{8}$$

where $h$ is the spacing between the bracketing data points.
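The formula above is easy to sketch directly. The following self-contained example (a hand-rolled illustration, not the library's Linear class) combines a bisection search for the bracketing interval with the normalized-coordinate form:

```csharp
using System;

class LerpDemo
{
    // Piecewise linear interpolation: bisection search for the bracketing
    // interval, then evaluate y = (1-t)*y_i + t*y_{i+1}.
    static double Lerp(double[] xs, double[] ys, double x)
    {
        int lo = 0, hi = xs.Length - 1;
        while (hi - lo > 1)                 // O(log n) bisection
        {
            int mid = (lo + hi) / 2;
            if (xs[mid] <= x) lo = mid; else hi = mid;
        }
        double t = (x - xs[lo]) / (xs[hi] - xs[lo]);  // normalized coordinate
        return (1 - t) * ys[lo] + t * ys[hi];
    }

    static void Main()
    {
        double[] xs = { 0, 1, 2, 3, 4, 5 };
        double[] ys = { 1, 3, 2, 5, 4, 6 };
        Console.WriteLine(Lerp(xs, ys, 2.5)); // midway between y=2 and y=5 → 3.5
    }
}
```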
```csharp
using Numerics.Data;

double[] xData = { 0, 1, 2, 3, 4, 5 };
double[] yData = { 1, 3, 2, 5, 4, 6 };

var linear = new Linear(xData, yData);

// Interpolate at a new point
double y = linear.Interpolate(2.5);
Console.WriteLine($"y(2.5) = {y:F2}");

// Multiple points
double[] xNew = { 0.5, 1.5, 2.5, 3.5 };
double[] yNew = linear.Interpolate(xNew);
Console.WriteLine("Interpolated values:");
for (int i = 0; i < xNew.Length; i++)
{
    Console.WriteLine($"  y({xNew[i]}) = {yNew[i]:F2}");
}
```

Properties:
- Fast: $O(\log n)$ per evaluation (bisection search for interval)
- $C^0$ continuous (values continuous, derivatives not)
- No overshooting: interpolant stays within bracket values
- Good for piecewise linear trends or noisy data
The Linear class also supports coordinate transforms via the XTransform and YTransform properties, enabling log-linear or probability-scale interpolation. Available transforms are Transform.None (the default), Transform.Logarithmic, and Transform.NormalZ (standard normal quantile).
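The effect of a transform can be illustrated without the library: with a logarithmic Y transform, interpolation proceeds linearly in $\log y$, so the halfway value is the geometric rather than the arithmetic mean of the endpoints (a hand-rolled sketch of the idea, not the library's implementation):

```csharp
using System;

class LogLerpDemo
{
    static void Main()
    {
        // Interpolate halfway (t = 0.5) between y0 = 10 and y1 = 1000.
        double y0 = 10, y1 = 1000, t = 0.5;

        // Plain linear interpolation: arithmetic mean of the endpoints
        double arithmetic = (1 - t) * y0 + t * y1;

        // Log-space interpolation: linear in log(y), i.e. the geometric mean
        double geometric = Math.Exp((1 - t) * Math.Log(y0) + t * Math.Log(y1));

        Console.WriteLine($"{arithmetic} {geometric:F0}"); // → 505 100
    }
}
```

The log-space result (100) is often far more appropriate for quantities that span orders of magnitude, such as discharge or concentration data.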
A cubic spline constructs a piecewise cubic polynomial $S(x)$ that passes through every data point while remaining twice continuously differentiable.

On each subinterval $[x_i, x_{i+1}]$:

$$S_i(x) = a_i + b_i (x - x_i) + c_i (x - x_i)^2 + d_i (x - x_i)^3$$

where the coefficients are determined by the interpolation conditions $S_i(x_i) = y_i$ and $S_i(x_{i+1}) = y_{i+1}$, continuity of $S'$ and $S''$ at the interior points, and two boundary conditions.

This system of equations reduces to a tridiagonal linear system for the second derivatives at the nodes, which is solved in $O(n)$ operations.

Error bound: For a function $f \in C^4$ with $|f^{(4)}| \leq M$, the cubic spline with derivative-matching (clamped) boundary conditions satisfies

$$|f(x) - S(x)| \leq \frac{5}{384} M h^4$$

The $O(h^4)$ convergence is substantially better than the $O(h^2)$ of linear interpolation; natural boundary conditions reduce the accuracy near the endpoints.
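The construction can be sketched end to end: assemble the tridiagonal system for the second derivatives, solve it with the Thomas algorithm, and evaluate the piecewise cubic. This is a minimal illustration of the standard natural-spline algorithm, not the library's code:

```csharp
using System;

class NaturalSplineDemo
{
    // Natural cubic spline: solve for the second derivatives M_i
    // (with M_0 = M_{n-1} = 0) via the Thomas algorithm, then evaluate S(x).
    static double Spline(double[] xs, double[] ys, double x)
    {
        int n = xs.Length;
        var M = new double[n];                          // second derivatives
        var c = new double[n]; var d = new double[n];   // forward-sweep storage

        // Interior rows: h_{i-1} M_{i-1} + 2(h_{i-1}+h_i) M_i + h_i M_{i+1} = rhs_i
        for (int i = 1; i < n - 1; i++)
        {
            double hPrev = xs[i] - xs[i - 1], hNext = xs[i + 1] - xs[i];
            double rhs = 6 * ((ys[i + 1] - ys[i]) / hNext - (ys[i] - ys[i - 1]) / hPrev);
            double diag = 2 * (hPrev + hNext) - hPrev * c[i - 1];
            c[i] = hNext / diag;
            d[i] = (rhs - hPrev * d[i - 1]) / diag;
        }
        for (int i = n - 2; i >= 1; i--)                // back-substitution
            M[i] = d[i] - c[i] * M[i + 1];

        // Locate the interval, then evaluate the piecewise cubic
        int k = 0;
        while (k < n - 2 && xs[k + 1] < x) k++;
        double h = xs[k + 1] - xs[k], a = xs[k + 1] - x, b = x - xs[k];
        return M[k] * a * a * a / (6 * h) + M[k + 1] * b * b * b / (6 * h)
             + (ys[k] / h - M[k] * h / 6) * a + (ys[k + 1] / h - M[k + 1] * h / 6) * b;
    }

    static void Main()
    {
        double[] xs = { 0, 1, 2, 3 };
        double[] ys = { 1, 3, 5, 7 };  // collinear data: spline reduces to the line
        Console.WriteLine(Spline(xs, ys, 1.5));  // → 4
    }
}
```

For collinear data the right-hand sides vanish, all second derivatives are zero, and the spline collapses to linear interpolation, which is a useful sanity check.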
```csharp
using Numerics.Data;

double[] xData = { 0, 1, 2, 3, 4, 5 };
double[] yData = { 1, 3, 2, 5, 4, 6 };

// Natural cubic spline (zero second derivatives at endpoints)
var spline = new CubicSpline(xData, yData);

// Interpolate at a single point
double y = spline.Interpolate(2.5);
Console.WriteLine($"Spline y(2.5) = {y:F2}");

// Interpolate at multiple points
double[] xNew = { 0.5, 1.5, 2.5, 3.5 };
double[] yNew = spline.Interpolate(xNew);
for (int i = 0; i < xNew.Length; i++)
{
    Console.WriteLine($"  y({xNew[i]}) = {yNew[i]:F2}");
}
```

Properties:
- $C^2$ continuous (smooth second derivative)
- Unique solution through all points
- Natural boundary conditions ($S'' = 0$ at endpoints)
- May overshoot between data points, especially with oscillatory data
- Excellent for smooth physical phenomena
Given $n + 1$ data points with distinct abscissas, there is a unique polynomial of degree at most $n$ that passes through all of them.
The Numerics library uses Neville's method [1] to evaluate the interpolating polynomial. Rather than computing coefficients explicitly, Neville's method builds a tableau of progressively higher-degree polynomial approximations through recursive divided differences:
$$P_{i,j}(x) = \frac{(x - x_{i+j})\,P_{i,j-1}(x) + (x_i - x)\,P_{i+1,j-1}(x)}{x_i - x_{i+j}}$$

where $P_{i,0}(x) = y_i$ and the final entry $P_{0,n}(x)$ is the interpolant evaluated at $x$. The difference between the last two tableau columns provides a convenient error estimate.
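Neville's recursion is compact enough to sketch directly. This standalone version (an illustration, not the Polynomial class itself) overwrites the tableau in place, column by column:

```csharp
using System;

class NevilleDemo
{
    // Neville's method: build the tableau P_{i,j} in place. After the loops,
    // p[0] holds the value of the unique degree-(n-1) interpolating polynomial.
    static double Neville(double[] xs, double[] ys, double x)
    {
        var p = (double[])ys.Clone();
        int n = xs.Length;
        for (int j = 1; j < n; j++)
            for (int i = 0; i < n - j; i++)   // ascending i: p[i+1] is still P_{i+1,j-1}
                p[i] = ((x - xs[i + j]) * p[i] + (xs[i] - x) * p[i + 1])
                     / (xs[i] - xs[i + j]);
        return p[0];
    }

    static void Main()
    {
        // Sample y = x^2 at four points; degree-3 interpolation is exact.
        double[] xs = { 0, 1, 2, 4 };
        double[] ys = { 0, 1, 4, 16 };
        Console.WriteLine(Neville(xs, ys, 3));  // → 9
    }
}
```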
The Polynomial class fits a polynomial of specified order using a local window of order + 1 points centered near the query point, which helps mitigate oscillation issues:
```csharp
using Numerics.Data;

double[] xData = { 0, 1, 2, 3, 4 };
double[] yData = { 1, 3, 2, 5, 4 };

// Fit 3rd order polynomial (order is the first parameter)
var poly = new Polynomial(3, xData, yData);

// Interpolate
double y = poly.Interpolate(2.5);
Console.WriteLine($"Polynomial y(2.5) = {y:F2}");

// The error estimate from the most recent interpolation
Console.WriteLine($"Error estimate: {poly.Error:F6}");
```

A critical limitation of polynomial interpolation is Runge's phenomenon: as the polynomial degree increases, the interpolation error can grow dramatically near the edges of the interval, even for smooth functions. Consider the Runge function:

$$f(x) = \frac{1}{1 + 25 x^2}, \qquad x \in [-1, 1]$$

With equally spaced nodes, the maximum interpolation error for this function grows without bound as the degree increases, with wild oscillations concentrated near the interval endpoints.
Mitigation strategies:
- Use cubic splines instead of high-degree polynomials — splines keep each piece low-degree while maintaining global smoothness
- Use Chebyshev nodes (non-uniform spacing clustered near endpoints) if you can control where data is collected
- Keep polynomial degree low ($\leq 5$) and use a local window, as the Polynomial class does
Best practice: Use splines instead of high-degree polynomials.
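Runge's phenomenon is easy to reproduce numerically. The sketch below (standalone, using the Neville recursion rather than the library) measures the maximum error of full-degree equispaced interpolation of the Runge function and shows it growing with degree:

```csharp
using System;

class RungeDemo
{
    static double Runge(double x) => 1.0 / (1 + 25 * x * x);

    // Full-degree polynomial interpolation via Neville's method
    static double Neville(double[] xs, double[] ys, double x)
    {
        var p = (double[])ys.Clone();
        for (int j = 1; j < xs.Length; j++)
            for (int i = 0; i < xs.Length - j; i++)
                p[i] = ((x - xs[i + j]) * p[i] + (xs[i] - x) * p[i + 1])
                     / (xs[i] - xs[i + j]);
        return p[0];
    }

    // Max |f - p_n| over a fine grid, using n+1 equally spaced nodes on [-1, 1]
    static double MaxError(int degree)
    {
        int n = degree + 1;
        var xs = new double[n]; var ys = new double[n];
        for (int i = 0; i < n; i++)
        {
            xs[i] = -1 + 2.0 * i / (n - 1);
            ys[i] = Runge(xs[i]);
        }
        double worst = 0;
        for (int k = 0; k <= 2000; k++)
        {
            double x = -1 + k / 1000.0;
            worst = Math.Max(worst, Math.Abs(Runge(x) - Neville(xs, ys, x)));
        }
        return worst;
    }

    static void Main()
    {
        double e4 = MaxError(4), e16 = MaxError(16);
        Console.WriteLine($"degree 4 max error:  {e4:F2}");
        Console.WriteLine($"degree 16 max error: {e16:F2}");
        Console.WriteLine(e16 > e4);  // error grows with degree for equispaced nodes
    }
}
```

Raising the degree makes the fit worse, not better: exactly the behavior the mitigation strategies above are designed to avoid.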
Bilinear interpolation extends linear interpolation to two-dimensional gridded data. For a query point $(x, y)$ in the grid cell $[x_i, x_{i+1}] \times [y_j, y_{j+1}]$:

$$z(x, y) = (1 - t)(1 - u)\,z_{i,j} + t\,(1 - u)\,z_{i+1,j} + (1 - t)\,u\,z_{i,j+1} + t\,u\,z_{i+1,j+1}$$

where $t = (x - x_i)/(x_{i+1} - x_i)$ and $u = (y - y_j)/(y_{j+1} - y_j)$ are the normalized coordinates within the cell.

Error: For a function with bounded second and mixed partial derivatives, the error is $O(h^2)$ in each grid dimension.
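The four-corner weighted sum can be sketched directly (a hand-rolled illustration of the formula, not the Bilinear class):

```csharp
using System;

class BilinearDemo
{
    // Bilinear interpolation on one grid cell: equivalent to two linear
    // interpolations in x followed by one in y.
    static double Bilerp(double z00, double z10, double z01, double z11,
                         double t, double u)   // t, u in [0, 1]
    {
        return (1 - t) * (1 - u) * z00 + t * (1 - u) * z10
             + (1 - t) * u * z01 + t * u * z11;
    }

    static void Main()
    {
        // Corner values of z(x, y) = x + y on the unit square
        double z = Bilerp(0, 1, 1, 2, 0.5, 0.5);
        Console.WriteLine(z);  // center of the cell → 1
    }
}
```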
```csharp
using Numerics.Data;

// Grid coordinates
double[] xGrid = { 0, 1, 2 };
double[] yGrid = { 0, 1, 2 };

// Grid values z[i,j] = z(xGrid[i], yGrid[j])
double[,] zGrid = {
    { 1, 2, 3 },
    { 2, 3, 4 },
    { 3, 4, 5 }
};

var bilinear = new Bilinear(xGrid, yGrid, zGrid);

// Interpolate at an arbitrary point
double z = bilinear.Interpolate(0.5, 0.5);
Console.WriteLine($"z(0.5, 0.5) = {z:F2}");

// Multiple points (loop over individual point pairs)
double[] xNew = { 0.5, 1.5 };
double[] yNew = { 0.5, 1.5 };
for (int i = 0; i < xNew.Length; i++)
{
    double zi = bilinear.Interpolate(xNew[i], yNew[i]);
    Console.WriteLine($"z({xNew[i]}, {yNew[i]}) = {zi:F2}");
}
```

Like the Linear class, Bilinear supports coordinate transforms (X1Transform, X2Transform, YTransform) for log-linear or probability-scale interpolation in 2D.
Applications:
- Image resizing
- Terrain elevation maps
- Temperature/pressure fields
- Geographic data
```csharp
// Measured stage-discharge pairs
double[] stage = { 5.0, 5.5, 6.0, 6.5, 7.0, 7.5, 8.0 };
double[] discharge = { 1000, 1500, 2200, 3100, 4200, 5500, 7000 };

// Create spline interpolator
var ratingCurve = new CubicSpline(stage, discharge);

// Interpolate discharge for observed stage
double observedStage = 6.3;
double estimatedQ = ratingCurve.Interpolate(observedStage);

Console.WriteLine("Rating Curve Interpolation:");
Console.WriteLine($"Stage: {observedStage:F1} ft → Discharge: {estimatedQ:F0} cfs");

// Extrapolation warning
if (observedStage < stage.Min() || observedStage > stage.Max())
{
    Console.WriteLine("Warning: Extrapolating beyond data range");
}
```

```csharp
// Time series with gaps
var dates = new[] { 1.0, 2.0, 3.0, /* gap */ 6.0, 7.0, 8.0 };
var values = new[] { 10.5, 11.2, 10.8, /* gap */ 12.1, 12.5, 11.9 };

var interpolator = new CubicSpline(dates, values);

// Fill gaps
var missingDates = new[] { 4.0, 5.0 };
foreach (var t in missingDates)
{
    double filled = interpolator.Interpolate(t);
    Console.WriteLine($"Day {t}: {filled:F2} (interpolated)");
}
```

```csharp
double[] x = { 0, 1, 2, 3, 4 };
double[] y = { 0, 1, 0, 1, 0 };  // Oscillating data

var linear = new Linear(x, y);
var spline = new CubicSpline(x, y);

double testPoint = 1.5;
double yLinear = linear.Interpolate(testPoint);
double ySpline = spline.Interpolate(testPoint);

Console.WriteLine($"At x = {testPoint}:");
Console.WriteLine($"  Linear: {yLinear:F3}");
Console.WriteLine($"  Spline: {ySpline:F3}");
Console.WriteLine("\nLinear connects with straight line");
Console.WriteLine("Spline creates smooth curve (may overshoot)");
```

Note the spline may produce values outside the range of the bracketing data points due to the smoothness constraint. For data that must remain monotonic, this overshoot can be problematic; in such cases, consider using linear interpolation instead.
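The overshoot is easy to quantify with a standalone natural-spline implementation (the same standard construction sketched here for illustration, not the library's CubicSpline): for oscillating data bounded by $[0, 1]$, the spline rises above the data maximum between nodes:

```csharp
using System;

class OvershootDemo
{
    // Standard natural cubic spline: Thomas-algorithm solve for the second
    // derivatives M_i (M_0 = M_{n-1} = 0), then piecewise-cubic evaluation.
    static double Spline(double[] xs, double[] ys, double x)
    {
        int n = xs.Length;
        var M = new double[n]; var c = new double[n]; var d = new double[n];
        for (int i = 1; i < n - 1; i++)
        {
            double hp = xs[i] - xs[i - 1], hn = xs[i + 1] - xs[i];
            double rhs = 6 * ((ys[i + 1] - ys[i]) / hn - (ys[i] - ys[i - 1]) / hp);
            double diag = 2 * (hp + hn) - hp * c[i - 1];
            c[i] = hn / diag;
            d[i] = (rhs - hp * d[i - 1]) / diag;
        }
        for (int i = n - 2; i >= 1; i--) M[i] = d[i] - c[i] * M[i + 1];

        int k = 0;
        while (k < n - 2 && xs[k + 1] < x) k++;
        double h = xs[k + 1] - xs[k], a = xs[k + 1] - x, b = x - xs[k];
        return M[k] * a * a * a / (6 * h) + M[k + 1] * b * b * b / (6 * h)
             + (ys[k] / h - M[k] * h / 6) * a + (ys[k + 1] / h - M[k + 1] * h / 6) * b;
    }

    static void Main()
    {
        double[] x = { 0, 1, 2, 3, 4 };
        double[] y = { 0, 1, 0, 1, 0 };   // data bounded by [0, 1]
        double s = Spline(x, y, 0.9);
        Console.WriteLine($"S(0.9) = {s:F3}, exceeds data max: {s > 1}");
    }
}
```

Here $S(0.9) \approx 1.022$, above every data value; linear interpolation can never do this.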
```csharp
// Elevation data at grid points
double[] eastings = { 0, 100, 200 };   // meters
double[] northings = { 0, 100, 200 };  // meters
double[,] elevations = {
    { 100, 105, 110 },
    { 102, 108, 115 },
    { 104, 112, 120 }
};

var terrain = new Bilinear(eastings, northings, elevations);

// Interpolate elevation at an arbitrary location
double x = 150, y = 150;
double z = terrain.Interpolate(x, y);
Console.WriteLine($"Terrain elevation at ({x}, {y}): {z:F1} m");

// Sample elevations along a transect
Console.WriteLine("\nElevation transect from (2,2) to (8,8):");
for (double t = 0; t <= 1; t += 0.2)
{
    double xi = 2 + 6 * t;
    double yi = 2 + 6 * t;
    double zi = terrain.Interpolate(xi, yi);
    Console.WriteLine($"  ({xi:F1}, {yi:F1}): {zi:F1} m");
}
```

- Data spacing: Interpolation works best with reasonably uniform spacing. Highly irregular spacing can lead to large errors in some subintervals
- Extrapolation: Avoid extrapolating beyond the data range; splines and polynomials are especially unreliable outside the data bounds. The Linear class returns boundary values; the Bilinear class falls back to 1D interpolation at edges
- Smoothness: Use splines for smooth physical phenomena ($C^2$ continuity), linear for piecewise trends or noisy data ($C^0$ continuity with no overshooting)
- Outliers: Check for data errors before interpolating; splines will faithfully pass through outliers
- Monotonicity: Cubic splines do not preserve monotonicity of the data. If your data should be monotonic (e.g., a CDF or rating curve), verify the interpolant doesn't violate this
- Periodic data: Consider Fourier or trigonometric interpolation for periodic signals
| Data Characteristics | Recommended Method | Why |
|---|---|---|
| Few points, simple trend | Linear | No overshooting, $O(\log n)$ evaluation |
| Smooth physical process | Cubic Spline | $C^2$ continuity, $O(h^4)$ accuracy |
| Need derivatives | Cubic Spline | Spline derivatives are well-defined |
| Noisy data | Linear | Splines amplify noise through curvature matching |
| 2D regular grid | Bilinear | Direct extension to 2D |
| Piecewise constant | Nearest neighbor | Preserves step structure |
| Exact polynomial | Polynomial (low degree) | Neville's method with error estimate |
- Runge's phenomenon: High-degree polynomials ($> 5$) oscillate wildly near interval boundaries; use splines instead
- Extrapolation: Results outside the data range are unreliable for all methods
- Natural boundary conditions: The zero-curvature constraint at endpoints can cause the spline to flatten near the boundaries, which may be physically inappropriate
- Monotonicity violation: Cubic splines can introduce local extrema between data points, violating monotonicity
- Edge effects: All methods lose accuracy near the boundaries of the data range
[1] W. H. Press, S. A. Teukolsky, W. T. Vetterling and B. P. Flannery, Numerical Recipes: The Art of Scientific Computing, 3rd ed., Cambridge, UK: Cambridge University Press, 2007.
[2] C. de Boor, A Practical Guide to Splines, Rev. ed., New York: Springer, 2001.
[3] R. L. Burden and J. D. Faires, Numerical Analysis, 9th ed., Boston: Brooks/Cole, 2010.
← Previous: ODE Solvers | Back to Index | Next: Linear Regression →