intake_stac.StacCatalog

class intake_stac.StacCatalog(*args, **kwargs)[source]

Intake Catalog representing a STAC Catalog: https://github.com/radiantearth/stac-spec/blob/master/catalog-spec/catalog-spec.md

A Catalog that references a STAC catalog at some URL and constructs an intake catalog from it, with opinionated choices about the drivers that will be used to load the datasets. In general, the drivers are:

  • netcdf

  • rasterio

  • xarray_image

  • textfiles

__init__(stac_obj, **kwargs)

Initialize the catalog.

Parameters
stac_obj: satstac.Thing

A satstac.Thing pointing to a STAC object

kwargs: dict, optional

Passed to intake.Catalog.__init__

Methods

__init__(stac_obj, **kwargs)

Initialize the catalog.

close()

Close open resources corresponding to this data source.

configure_new(**kwargs)

Create a new instance of this source with altered arguments

describe()

discover()

Open resource and populate the source attributes.

export(path, **kwargs)

Save this data for sharing with other people

filter(func)

Create a Catalog of a subset of entries based on a condition

force_reload()

Reload the catalog data now, unconditionally

from_dict(entries, **kwargs)

Create Catalog from the given set of entries

from_url(url, **kwargs)

Initialize the catalog from a STAC url.

get(**kwargs)

Create a new instance of this source with altered arguments

get_persisted()

items()

Get an iterator over (key, value) tuples for the catalog entries.

persist([ttl])

Save data from this source to local persistent storage

pop(key)

Remove entry from catalog and return it

read()

Load entire dataset into a container and return it

read_chunked()

Return iterator over container fragments of data source

read_partition(i)

Return a part of the data corresponding to i-th partition.

reload()

Reload catalog if sufficient time has passed

save(url[, storage_options])

Output this catalog to a file as YAML

search(text[, depth])

serialize()

Serialize the catalog to yaml.

set_cache_dir(cache_dir)

to_dask()

Return a dask container for this data source

to_spark()

Provide an equivalent data object in Apache Spark

walk([sofar, prefix, depth])

Get all entries in this catalog and sub-catalogs

yaml([with_plugin])

Return YAML representation of this data-source

Attributes

cache_dirs

classname

container

datashape

description

entry

gui

Source GUI, with parameter selection and plotting

has_been_persisted

hvplot

Returns a hvPlot object to provide a high-level plotting API.

is_persisted

kwargs

name

partition_access

plot

Returns a hvPlot object to provide a high-level plotting API.

plots

List custom associated quick-plots

version