To isolate dependencies from your global Python installation, it is important to use a tool like `virtualenv`. With `virtualenv` you can install the dev environment by doing the following.
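A minimal sketch of that setup, assuming `virtualenv` is already installed and Python 3 is on your PATH (the environment name `venv` is just an example):

```bash
# Create an isolated environment in the repo root and activate it
virtualenv venv
source venv/bin/activate
```

With the environment active, the `pip install` commands below install the CLI and its dev dependencies into it rather than into the global interpreter.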
Another option would be to use Docker directly, e.g. for Python 3 or Python 2 respectively:
```bash
docker run -it -v `echo $PWD`:/root python:3.6.8 bash
docker run -it -v `echo $PWD`:/root python:2.7.12 bash
```
Then, inside the virtualenv or container, install the package in editable mode along with the dev requirements:

```bash
pip install -e .
pip install -r dev-requirements-py3.txt
```
To verify that the installation of `databricks-cli` is the one checked out from VCS, run `python -c "import databricks_cli; print(databricks_cli.__file__)"` and check that the printed path points into your checkout.
This repo comes pre-configured with a development container for the VSCode Remote Containers extension. When you open this project in VSCode, you will be asked whether you want to open it in a dev container. Click yes, and VSCode will build a Docker container with everything needed to develop the Databricks CLI and attach VSCode to it.
Requirements:
- VSCode with the Remote Containers extension installed
- A working Docker installation
The same development container setup used for local VSCode also works with GitHub CodeSpaces. If you have CodeSpaces enabled in your GitHub account, you can simply create a CodeSpace from the repo and start coding.
In order to test the CLI against a Databricks cluster, you can define these secrets for your CodeSpace so you don't have to run `databricks configure` each time you open it:
- `DATABRICKS_HOST`: Workspace URL
- `DATABRICKS_TOKEN`: Personal access token
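Outside of CodeSpaces, exporting the same variables in your shell has an equivalent effect, since the CLI picks them up instead of prompting for configuration. A minimal sketch with placeholder values (the final `databricks workspace ls` call is just a sanity check):

```bash
# DATABRICKS_HOST / DATABRICKS_TOKEN are read by the CLI in place of ~/.databrickscfg
export DATABRICKS_HOST="https://<your-workspace-url>"
export DATABRICKS_TOKEN="<personal-access-token>"

# Sanity check: list the workspace root
databricks workspace ls /
```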
The test suite is run with `tox`:

```bash
tox
```
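If you only want to run a single tox environment, you can select it with `-e`. The environment names are defined in `tox.ini`, so the `py39` name below is only an illustrative assumption:

```bash
# Run a single environment from tox.ini (name is an assumption; check tox.ini for the real ones)
tox -e py39
```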