Open Data Cube Core provides an integrated gridded data analysis environment for decades of analysis-ready Earth observation satellite data and related data from multiple satellite and other acquisition systems.
See the user guide for installation & usage of the datacube, and for documentation of the API.
Join our Slack if you need help setting up or using Data Cube Core.
- PostgreSQL 9.5+
- Python 3.5+
Clone:
git clone https://github.com/opendatacube/datacube-core.git
Install the native libraries for GDAL & NetCDF4.
- This depends on your OS.
- E.g. on a yum-based system:
yum install gdal
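On Debian/Ubuntu the development packages are named differently; something like the following should work (package names are an assumption and may vary by release):
apt-get install libgdal-dev libnetcdf-dev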
Install Python dependencies:
python setup.py develop
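If you prefer pip, an editable install is broadly equivalent (assuming the usual setuptools layout of this repository):
pip install -e .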
Note that the versions must match between GDAL's Python bindings and the native GDAL library. If you receive a gdal error when installing dependencies, you may need to install a specific version first:
e.g.
pip install gdal==2.0.1
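A common way to keep the bindings and the native library in sync is to pin the Python package to the version reported by gdal-config (this assumes gdal-config is on your PATH and a matching package release exists):
pip install "gdal==$(gdal-config --version).*"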
Run unit tests + PyLint:
./check-code.sh
(This script approximates what is run by Travis. You can alternatively run py.test yourself.)
Or run all tests, including integration tests:
./check-code.sh integration_tests
- Assumes a password-less Postgres database named agdcintegration running on localhost (a minimal way to create it is shown below).
- Otherwise, copy integration_tests/agdcintegration.conf to ~/.datacube_integration.conf and edit it to customise.
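A minimal sketch of creating that database, assuming a local Postgres install where your user is allowed to create databases:
createdb agdcintegration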
Docker for Open Data Cube is in the early stages of development; more documentation and examples of how to use it are forthcoming. For now, you can build and run the Docker image from this repository as documented below.
A number of environment variables can be used to configure the Open Data Cube. Some are built into the application itself, while others are specific to Docker and are used to create a configuration file when the container is launched.
You can build the image with a command like this:
docker build --tag opendatacube:local .
And it can then be run with this command:
docker run --rm opendatacube:local
If you don't need to build the image yourself (and in most cases you shouldn't), you can run a pre-built image with:
docker run --rm opendatacube/datacube-core
An example of starting a container with environment variables is as follows:
docker run \
--rm \
-e DATACUBE_CONFIG_PATH=/opt/custom-config.conf \
-e DB_DATABASE=mycube \
-e DB_HOSTNAME=localhost \
-e DB_USERNAME=postgres \
-e DB_PASSWORD=secretpassword \
-e DB_PORT=5432 \
opendatacube/datacube-core
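If you would rather not pass each variable on the command line, Docker's --env-file option works the same way; odc.env here is a hypothetical file containing the same KEY=value pairs:
docker run --rm --env-file odc.env opendatacube/datacube-core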
Additionally, you can run an Open Data Cube Docker container along with Postgres using the Docker Compose file.
For example, running docker-compose up will start the Postgres server with Open Data Cube alongside it.
To run commands in ODC, use docker-compose run odc datacube -v system init or docker-compose run odc datacube --version.
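As a sketch of a typical session (assuming the Compose service is named odc, as above):
docker-compose up -d
docker-compose run odc datacube system init
docker-compose run odc datacube --version
docker-compose down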
Most of the environment variables below should be self-explanatory. None are required, although it is recommended that you set them.
- DATACUBE_CONFIG_PATH - the path where the config file is written (also used by ODC for reading)
- DB_DATABASE - the name of the Postgres database
- DB_HOSTNAME - the hostname of the Postgres database
- DB_USERNAME - the username for the Postgres database
- DB_PASSWORD - the password used for the Postgres database
- DB_PORT - the port that the Postgres database is exposed on
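For reference, these variables roughly correspond to the fields of a standard datacube config file. The snippet below is an illustrative sketch only; the values are placeholders matching the docker run example above:
[datacube]
db_database: mycube
db_hostname: localhost
db_username: postgres
db_password: secretpassword
db_port: 5432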