API Reference

Interpolators

Spline([mindist, damping, force_coords, engine]) Biharmonic spline interpolation using Green’s functions.
SplineCV([mindists, dampings, force_coords, …]) Cross-validated biharmonic spline interpolation.
VectorSpline2D([poisson, mindist, damping, …]) Elastically coupled interpolation of 2-component vector data.
ScipyGridder([method, extra_args]) A scipy.interpolate based gridder for scalar Cartesian data.
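
A minimal usage sketch for the interpolators above (the coordinates and data here are synthetic placeholders):

    import numpy as np
    import verde as vd

    # Synthetic scattered observations (placeholder values).
    coordinates = vd.scatter_points(region=(0, 10, -5, 5), size=500, random_state=0)
    data = np.sin(coordinates[0]) + np.cos(coordinates[1])

    # Fit a biharmonic spline and predict on a regular grid.
    spline = vd.Spline(damping=1e-8)
    spline.fit(coordinates, data)
    grid = spline.grid(spacing=0.5)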

Data Processing

BlockReduce(reduction[, spacing, region, …]) Apply a reduction/aggregation operation to the data in blocks/windows.
BlockMean([spacing, region, adjust, …]) Apply a (weighted) mean to the data in blocks/windows.
Trend(degree) Fit a 2D polynomial trend to spatial data.
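
A brief sketch of decimating data with a blocked median and removing a first-degree trend (placeholder data):

    import numpy as np
    import verde as vd

    coordinates = vd.scatter_points(region=(0, 10, -5, 5), size=2000, random_state=0)
    data = 5 + 0.1 * coordinates[0] + np.random.default_rng(0).normal(size=2000)

    # Decimate to one median value per 0.5-unit block.
    reducer = vd.BlockReduce(reduction=np.median, spacing=0.5)
    block_coords, block_data = reducer.filter(coordinates, data)

    # Fit and remove a first-degree polynomial trend.
    trend = vd.Trend(degree=1).fit(block_coords, block_data)
    residuals = block_data - trend.predict(block_coords)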

Composite Estimators

Chain(steps) Chain filtering operations to fit on each subsequent output.
Vector(components) Fit an estimator to each component of multi-component vector data.
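
A sketch of chaining a block reduction, a trend, and a spline into a single estimator (placeholder data):

    import numpy as np
    import verde as vd

    coordinates = vd.scatter_points(region=(0, 10, -5, 5), size=1000, random_state=0)
    data = 2 * coordinates[0] + np.sin(coordinates[1])

    # Each step is fit to the output/residuals of the previous one.
    chain = vd.Chain([
        ("reduce", vd.BlockReduce(np.mean, spacing=0.5)),
        ("trend", vd.Trend(degree=1)),
        ("spline", vd.Spline()),
    ])
    chain.fit(coordinates, data)
    grid = chain.grid(spacing=0.25)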

Model Selection

train_test_split(coordinates, data[, weights]) Split a dataset into a training and a testing set for cross-validation.
cross_val_score(estimator, coordinates, data) Score an estimator/gridder using cross-validation.
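
A sketch of holding out data and cross-validating a spline (placeholder data; test_size and random_state are assumed to pass through to the underlying scikit-learn splitter):

    import numpy as np
    import verde as vd

    coordinates = vd.scatter_points(region=(0, 10, -5, 5), size=800, random_state=0)
    data = np.cos(coordinates[0]) * coordinates[1]

    # Hold out 30% of the data to evaluate the fitted spline.
    train, test = vd.train_test_split(coordinates, data, test_size=0.3, random_state=0)
    spline = vd.Spline().fit(*train)
    print("R² on the test set:", spline.score(*test))

    # Or score the estimator on several folds.
    scores = vd.cross_val_score(vd.Spline(), coordinates, data)
    print("Mean cross-validated R²:", np.mean(scores))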

Coordinate Manipulation

grid_coordinates(region[, shape, spacing, …]) Generate the coordinates for each point on a regular grid.
scatter_points(region, size[, random_state, …]) Generate the coordinates for a random scatter of points.
profile_coordinates(point1, point2, size[, …]) Coordinates for a profile along a straight line between two points.
get_region(coordinates) Get the bounding region of the given coordinates.
pad_region(region, pad) Extend the borders of a region by the given amount.
project_region(region, projection) Calculate the bounding box of a region in projected coordinates.
inside(coordinates, region) Determine which points fall inside a given region.
block_split(coordinates[, spacing, adjust, …]) Split a region into blocks and label points according to where they fall.
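
A short sketch of how these functions combine (the region values are arbitrary):

    import verde as vd

    region = (-70, -60, -45, -35)  # (West, East, South, North)

    # Regular grid nodes and a random scatter inside the region.
    easting, northing = vd.grid_coordinates(region, spacing=0.5)
    scatter = vd.scatter_points(region, size=100, random_state=0)

    # Recover the bounding region of the scatter and pad it by 1 unit.
    bounds = vd.get_region(scatter)
    padded = vd.pad_region(bounds, pad=1)

    # Boolean mask of the grid nodes that fall inside the padded region.
    are_inside = vd.inside((easting, northing), padded)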

Utilities

test([doctest, verbose, coverage, figures]) Run the test suite.
maxabs(*args[, nan]) Calculate the maximum absolute value of the given array(s).
distance_mask(data_coordinates, maxdist[, …]) Mask grid points that are too far from the given data points.
variance_to_weights(variance[, tol, dtype]) Convert data variances to weights for gridding.
grid_to_table(grid) Convert a grid to a table with the values and coordinates of each point.
median_distance(coordinates[, k_nearest, …]) Median distance between the k nearest neighbors of each point.
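
A sketch combining a few of these helpers, assuming synthetic data with a constant 0.1 uncertainty:

    import numpy as np
    import verde as vd

    coordinates = vd.scatter_points(region=(0, 10, -5, 5), size=300, random_state=0)
    data = np.sin(coordinates[0])
    uncertainty = np.full(data.size, 0.1)

    # Convert variances (uncertainty squared) into normalized gridding weights.
    weights = vd.variance_to_weights(uncertainty**2)

    # Grid the data, then blank nodes more than 1 unit away from any observation.
    grid = vd.Spline().fit(coordinates, data, weights=weights).grid(spacing=0.5)
    grid = vd.distance_mask(coordinates, maxdist=1, grid=grid)

    # Flatten the grid back into a table of points.
    table = vd.grid_to_table(grid)
    print(vd.maxabs(data), vd.median_distance(coordinates).mean())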

Input/Output

load_surfer(fname[, dtype]) Read data from a Surfer ASCII grid file.
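
A minimal sketch (the file name is a placeholder for an actual Surfer grid file):

    import verde as vd

    # Load a Surfer ASCII grid into an xarray object for further processing.
    grid = vd.load_surfer("my_surfer_grid.grd")
    print(grid)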

Datasets

datasets.CheckerBoard([amplitude, region, …]) Generate synthetic data in a checkerboard pattern.
datasets.fetch_baja_bathymetry() Fetch sample bathymetry data from Baja California.
datasets.setup_baja_bathymetry_map(ax[, …]) Set up a Cartopy map for the Baja California bathymetry dataset.
datasets.fetch_california_gps() Fetch sample GPS velocity data from California (the U.S. West Coast).
datasets.setup_california_gps_map(ax[, …]) Set up a Cartopy map for the California GPS velocity dataset.
datasets.fetch_texas_wind() Fetch sample wind speed and air temperature data for the state of Texas, USA.
datasets.setup_texas_wind_map(ax[, region, …]) Set up a Cartopy map for the Texas wind speed and air temperature dataset.
datasets.fetch_rio_magnetic() Fetch sample total-field magnetic anomaly data from Rio de Janeiro, Brazil.
datasets.setup_rio_magnetic_map(ax[, region]) Set up a Cartopy map for the Rio de Janeiro magnetic anomaly dataset.
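
A sketch of generating synthetic checkerboard data and fetching one of the sample datasets (the fetch functions download and cache the files on first use):

    import verde as vd

    # Synthetic checkerboard function evaluated on a regular grid.
    checker = vd.datasets.CheckerBoard()
    synthetic = checker.grid(spacing=500)

    # Sample wind data as a pandas.DataFrame.
    texas_wind = vd.datasets.fetch_texas_wind()
    print(texas_wind.head())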

Base Classes and Functions

base.BaseGridder Base class for gridders.
base.n_1d_arrays(arrays, n) Get the first n elements from a tuple/list, make sure they are arrays, and ravel.
base.check_fit_input(coordinates, data, weights) Validate the inputs to the fit method of gridders.
base.least_squares(jacobian, data, weights) Solve a weighted least-squares problem with optional damping regularization.
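
A minimal sketch of a custom gridder built on these pieces; the MeanGridder below is entirely illustrative and simply predicts the mean of its training data everywhere:

    import numpy as np
    import verde as vd
    from verde.base import BaseGridder, check_fit_input

    class MeanGridder(BaseGridder):
        """Toy gridder that predicts the mean of the training data everywhere."""

        def fit(self, coordinates, data, weights=None):
            coordinates, data, weights = check_fit_input(coordinates, data, weights)
            self.mean_ = np.mean(data)
            # Store the data region so grid() can use it by default.
            self.region_ = vd.get_region(coordinates)
            return self

        def predict(self, coordinates):
            return np.full(coordinates[0].shape, self.mean_)

    grd = MeanGridder().fit(
        vd.scatter_points((0, 5, 0, 5), size=100, random_state=0),
        np.random.default_rng(0).normal(size=100),
    )
    grid = grd.grid(spacing=1)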