A tool to check up on data stored in dotStorage, ensuring it is available and retrievable. It continuously takes random samples of CIDs and exposes Prometheus metrics on their availability.
Drop a `.env` file in the project root and populate:
```sh
DATABASE_CONNECTION=<value>
IPFS_CHECK_API_URL=<value>
CLUSTER_API_URL=<value>
CLUSTER_BASIC_AUTH_TOKEN=<value>
PORT=3000 # optional, default shown
PROM_NAMESPACE=checkup # optional, default shown
SAMPLE_METHOD=universal # optional, default shown, also randomid (nft.storage only)
ELASTIC_PROVIDER_ADDR=/p2p/Qm... # optional, if set, CIDs will be checked on the elastic provider also (ALL CIDs are assumed to be available here)
ELASTIC_PROVIDER_S3_REGION=<value> # optional, required if ELASTIC_PROVIDER_ADDR is set
ELASTIC_PROVIDER_S3_BUCKET=<value> # optional, required if ELASTIC_PROVIDER_ADDR is set
ELASTIC_PROVIDER_S3_ACCESS_KEY_ID=<value> # optional, required if ELASTIC_PROVIDER_ADDR is set
ELASTIC_PROVIDER_S3_SECRET_ACCESS_KEY=<value> # optional, required if ELASTIC_PROVIDER_ADDR is set
```
Replace the following values as specified:

* `DATABASE_CONNECTION` with the connection string for the database you want to read from.
* `IPFS_CHECK_API_URL` with an ipfs-check backend API URL.
* `CLUSTER_API_URL` with the base URL of the Cluster API.
* `CLUSTER_BASIC_AUTH_TOKEN` with the base64 encoded basic auth token for the Cluster API.
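For illustration, a minimal `.env` might look like the sketch below. Every value here is a placeholder, not a real credential or endpoint:

```sh
# Placeholder values only — substitute your own.
DATABASE_CONNECTION=postgres://user:pass@localhost:5432/dotstorage
IPFS_CHECK_API_URL=https://ipfs-check.example.com
CLUSTER_API_URL=https://cluster.example.com
CLUSTER_BASIC_AUTH_TOKEN=dXNlcjpwYXNz
```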
Start the checker:

```sh
npm start
```
Metrics for reports are available at `http://localhost:3000/metrics`.
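To collect these metrics, point a Prometheus server at that endpoint. A minimal scrape config sketch (the job name and scrape interval are arbitrary choices, and the target assumes the default `PORT` of 3000):

```yaml
scrape_configs:
  - job_name: checkup              # arbitrary job label
    scrape_interval: 60s           # adjust to taste
    static_configs:
      - targets: ['localhost:3000']  # host:PORT where checkup is running
```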
There's a `Dockerfile` that runs the tool in Docker:

```sh
docker build -t checkup .
docker run -d --env-file .env -p 3000:3000 checkup
```

The `--env-file` flag passes your configuration into the container, and `-p` exposes the metrics port on the host.