verde.BlockReduce
- class verde.BlockReduce(reduction, block_shape=None, block_size=None, region=None, adjust='block_size', center_coordinates=False, drop_coords=True)[source]
Apply a reduction/aggregation operation to the data in blocks/windows.
Returns the reduced data value for each block along with the associated coordinates, which can be determined through the same reduction applied to the coordinates or as the center of each block.
If a data region to be divided into blocks is not given, it will be the bounding region of the data. When using this class to decimate data before gridding, it’s best to use the desired grid spacing as the block size.
The size of the blocks can be specified by the block_size parameter. Alternatively, the number of blocks in the South-North and West-East directions can be specified using the block_shape parameter.
If the given region is not divisible by the block size, either the region or the size will have to be adjusted. By default, the block size will be rounded to the nearest multiple. Optionally, the East and North boundaries of the region can be adjusted to fit the exact block size given.
Blocks without any data are omitted from the output.
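A minimal sketch of that decimation workflow (the synthetic coordinates, data values, and the 0.5 block size are illustrative, not prescribed by the API):

```python
import numpy as np
import verde as vd

# Synthetic scattered data (illustrative values only).
rng = np.random.default_rng(0)
easting = rng.uniform(0, 10, size=1000)
northing = rng.uniform(-5, 5, size=1000)
data = np.sin(easting) + np.cos(northing)

# One median value per 0.5 x 0.5 block; match block_size to the
# spacing you intend to use when gridding the decimated data.
reducer = vd.BlockReduce(np.median, block_size=0.5)
block_coords, block_data = reducer.filter((easting, northing), data)
# block_coords holds the (easting, northing) of each non-empty block,
# block_data holds one reduced value per non-empty block.
```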
Implements the filter method so it can be used with verde.Chain. Only acts during data fitting and is ignored during prediction.
- Parameters:
  - reduction : function
    A reduction function that takes an array and returns a single value (e.g., np.mean, np.median, etc.).
  - block_shape : tuple = (n_north, n_east) or None
    The number of blocks in the South-North and West-East directions, respectively.
  - block_size : float, tuple = (s_north, s_east), or None
    The block size in the South-North and West-East directions, respectively. A single value means that the size is equal in both directions.
  - region : list = [W, E, S, N]
    The boundaries of a given region in Cartesian or geographic coordinates.
  - adjust : {'block_size', 'region'}
    Whether to adjust the block_size or the region if required. Ignored if block_shape is given instead of block_size. Defaults to adjusting the block size.
  - center_coordinates : bool
    If True, then the returned coordinates correspond to the center of each block. Otherwise, the coordinates are calculated by applying the same reduction operation to the input coordinates.
  - drop_coords : bool
    If True, only the reduced easting and northing coordinates are returned, dropping any other ones. If False, all coordinates are reduced and returned. Default True.
See also
BlockMean
Apply the mean in blocks. Will output weights.
verde.Chain
Apply filter operations successively on data.
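A hedged sketch of using BlockReduce inside verde.Chain, as referenced above (the step names, block size, and choice of gridder are illustrative):

```python
import numpy as np
import verde as vd

# Decimate with a blocked mean, then fit a spline gridder.
chain = vd.Chain(
    [
        ("reduce", vd.BlockReduce(np.mean, block_size=0.1)),
        ("spline", vd.Spline()),
    ]
)
# During fit, the BlockReduce step filters the data before it reaches
# the spline; during predict/grid, the step is ignored.
# chain.fit((easting, northing), data)
# grid = chain.grid(spacing=0.1)
```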
Methods
- filter(coordinates, data[, weights]): Apply the blocked aggregation to the given data.
- get_metadata_routing(): Get metadata routing of this object.
- get_params([deep]): Get parameters for this estimator.
- set_params(**params): Set the parameters of this estimator.
Method documentation
- BlockReduce.filter(coordinates, data, weights=None)[source]
Apply the blocked aggregation to the given data.
Returns the reduced data value for each block along with the associated coordinates, which can be determined through the same reduction applied to the coordinates or as the center of each block.
If weights are given, the reduction function must accept a weights keyword argument. The weights are passed to the reduction, but there is no generic way of aggregating the weights or reporting uncertainties. For that, look to the specialized classes like verde.BlockMean. A usage sketch with weights follows the Returns listing below.
- Parameters:
  - coordinates : tuple of arrays
    Arrays with the coordinates of each data point. Should be in the following order: (easting, northing, vertical, …). Only easting and northing will be used to create the blocks. If drop_coords is False, all other coordinates will be reduced along with the data.
  - data : array or tuple of arrays
    The data values at each point. If you want to reduce more than one data component, pass in multiple arrays as elements of a tuple. All arrays must have the same shape.
  - weights : None or array or tuple of arrays
    If not None, then the weights assigned to each data point. If more than one data component is provided, you must provide a weights array for each data component (if not None).
- Returns:
  - blocked_coordinates : tuple of arrays
    Tuple containing arrays with the coordinates of each block that contains data. If drop_coords is True, the tuple will only contain (easting, northing). If drop_coords is False, it will contain (easting, northing, vertical, …).
  - blocked_data : array
    The block reduced data values.
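For example, numpy.average accepts a weights keyword argument, so it can serve as the reduction when weights are passed (a small sketch; the coordinates, data, and block size are illustrative):

```python
import numpy as np
import verde as vd

coordinates = (
    np.array([1.0, 1.2, 3.1, 3.3]),  # easting
    np.array([0.5, 0.6, 2.5, 2.6]),  # northing
)
data = np.array([10.0, 12.0, 20.0, 22.0])
weights = np.array([1.0, 0.5, 1.0, 2.0])

# np.average takes a ``weights`` keyword, as required when weights are given.
reducer = vd.BlockReduce(np.average, block_size=1.0)
(block_east, block_north), block_data = reducer.filter(coordinates, data, weights)
```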
- BlockReduce.get_metadata_routing()
Get metadata routing of this object.
Please check the scikit-learn User Guide on how the routing mechanism works.
- Returns:
  - routing : MetadataRequest
    A MetadataRequest encapsulating routing information.
- BlockReduce.get_params(deep=True)
Get parameters for this estimator.
- BlockReduce.set_params(**params)
Set the parameters of this estimator.
The method works on simple estimators as well as on nested objects (such as Pipeline). The latter have parameters of the form <component>__<parameter> so that it's possible to update each component of a nested object.
- Parameters:
  - **params : dict
    Estimator parameters.
- Returns:
  - self : estimator instance
    Estimator instance.
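A brief sketch of this scikit-learn-style parameter interface (the parameter values are illustrative):

```python
import numpy as np
import verde as vd

reducer = vd.BlockReduce(np.median, block_size=0.5)
print(reducer.get_params()["block_size"])  # 0.5

# set_params updates parameters in place and returns the same instance.
reducer.set_params(block_size=0.25, center_coordinates=True)
```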