Job Manager


User-facing documentation

Welcome to the Job Manager repository! If you're a developer you're in the right place.

However, if you just want to try out or deploy Job Manager, you will probably want our user- and deployment-focused documentation on ReadTheDocs: https://data-biosphere-job-manager.readthedocs.io/en/latest/

Try it out, NOW!

The easiest way to try out Job Manager is to use the getting-started script.

Welcome

See the development guide below.

The Job Manager is an API and UI for monitoring and managing jobs in a backend execution engine.

The Broad, Verily, and many other organizations in the life sciences execute enormous numbers of scientific workflows and need to manage those operations. Job Manager was born out of the experiences of producing data for some of the world’s largest sequencing projects such as The Cancer Genome Atlas, Baseline, and the Thousand Genomes Project.

The Job Manager aspires to bring ease and efficiency to developing and debugging workflows while seamlessly scaling to production operations management.

Key Features

  • Supports visualization over Cromwell or dsub backends
  • Service provider interface can be extended to support other engines
  • Rich search capabilities across current and historic workflows
  • Aborting workflows
  • Clean, intuitive UX based on material design principles

Future Features

  • Dynamic grouping, filtering, and drill-down
  • Re-launching workflows
  • Simplified troubleshooting of failed workflows
  • Improved UI design

Roadmap

The current code is a work in progress toward an alpha release and so far covers the core features: connecting to both backends, visualizing workflow and task status and metadata, quick access to log files, and simple filtering.

The near-term roadmap includes improvements to failure troubleshooting, creating a robust dashboard for grouping jobs and seeing status overviews, and improving handling of widely scattered workflows.

We envision a product with user-customizable views of jobs running, insights into workflow compute cost, the ability to re-launch jobs, and the potential to make custom reports about the jobs that have been run.

Architecture Overview

The Job Manager defines an API via OpenAPI. An Angular2 UI is provided over the autogenerated TypeScript bindings for this API. The UI is configurable at compile time to support various deployment environments (see environment.ts), including auth, cloud projects, and label columns.

The UI must be deployed along with a backend implementation of the API; two such implementations are provided here:

Cromwell

Monitors jobs launched by the Cromwell workflow engine. The Python Flask wrapper was created using Swagger Codegen and can be configured to pull data from a specific Cromwell instance. To use all Job Manager features, Cromwell v32 or newer is recommended.
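As a rough sketch of pointing the Cromwell backend at a specific instance: the environment variable name and Cromwell port below are assumptions, so check servers/cromwell and the compose files for the configuration this repository actually reads.

    # Hypothetical example only: the CROMWELL_URL variable name and port 8000 are
    # assumptions; consult servers/cromwell for the real configuration.
    export CROMWELL_URL=http://localhost:8000/api/workflows/v1
    docker-compose up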

dsub

Monitors jobs launched via the dsub CLI. This is a thin, stateless wrapper around the dsub Python library, implemented in Python Flask using Swagger Codegen models. Authorization is required when deploying the UI, since it is used to communicate with the Google Genomics Pipelines API. A Dockerfile is provided for production deployment using gunicorn.

Note that a “task” in dsub nomenclature corresponds to a “job” in the Job Manager API.
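A minimal sketch of building and running the dsub API server image on its own, assuming the gunicorn setup in the provided Dockerfile; the published port is an assumption, so check servers/dsub and the compose files for the real value.

    # Build the dsub API image (same command as in the release section below)
    docker build -t job-manager-api-dsub:dev . -f servers/dsub/Dockerfile
    # Run it standalone; 8190 is an assumed port, adjust to match your compose file
    docker run --rm -p 8190:8190 job-manager-api-dsub:dev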

Development

Prerequisites

  • Install docker and docker-compose

  • Check out the repository and navigate to the directory:

      git clone https://github.com/DataBiosphere/job-manager.git
      cd job-manager
  • Setup git-secrets on the repository:

    • On Mac:
    brew install git-secrets
    
    • On Linux:
    rm -rf git-secrets
    git clone https://github.com/awslabs/git-secrets.git
    cd git-secrets
    sudo make install && sudo chmod o+rx /usr/local/bin/git-secrets
    cd ..
    rm -rf git-secrets
    
  • Configure the git secrets hook:

      git secrets --install
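Note that git secrets --install only installs the hooks; the patterns to scan for are registered separately. A small sketch, where the specific patterns are illustrative rather than the ones this repository mandates:

    # Register the built-in AWS credential patterns that ship with git-secrets
    git secrets --register-aws
    # Add a custom prohibited pattern (example pattern only)
    git secrets --add 'private_key'
    # Scan the working tree for anything that matches
    git secrets --scan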

Server Setup

  • Choose your own adventure: cromwell (local or CaaS) or dsub!

Cromwell

  • Link your preferred backend docker compose file as docker-compose.yml:

    • Cromwell (local): ln -sf cromwell-instance-compose.yml docker-compose.yml
    • Cromwell (CaaS): ln -sf cromwell-caas-compose.yml docker-compose.yml
  • Follow servers/cromwell for Cromwell server setup then return here to continue.
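For example, the local-Cromwell path end to end uses only commands documented in this guide:

    ln -sf cromwell-instance-compose.yml docker-compose.yml
    docker-compose up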

dsub

  • Link the dsub docker compose file as docker-compose.yml:
ln -sf dsub-local-compose.yml docker-compose.yml
  • If you prefer not to create a symbolic link, pass the compose file explicitly on each invocation:
docker-compose -f dsub-google-compose.yml <command>

Run Locally

  • Run docker-compose up from the root of the repository:
    • If this is the first time running docker-compose up, this might take a few minutes.
    • Eventually you should see a compilation success message like this:
    jmui_1        | webpack: Compiled successfully.
    
  • Make sure that your backend (e.g., the Cromwell service or dsub) is ready to receive query requests.
  • Navigate to http://localhost:4200.
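Once the compilation success message appears, a couple of quick checks confirm the stack is actually serving; the UI port comes from this guide, anything else depends on your compose file:

    # All services should show as "Up"
    docker-compose ps
    # The UI should answer on the port used above
    curl -I http://localhost:4200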

Notes

  1. Websocket reload on code change does not work in docker-compose (see angular/angular-cli#6349).
  2. Changes to package.json or requirements.txt, or regenerating the API, require a rebuild with:
docker-compose up --build

Alternatively, rebuild a single component:

docker-compose build ui

Updating the API using swagger-codegen

  • We use swagger-codegen to transform the API defined in api/jobs.yaml into appropriate classes for the servers and the UI to use.
  • Whenever the API is updated, run this to trigger a rebuild:
docker-compose up rebuild-swagger

Swagger codegen notes

  • The rebuild-swagger job does nothing if the file api/jobs.yaml has not changed since the last time it was run.
  • The rebuild-swagger job will run by default during docker-compose up to generate the swagger for the other services if necessary. The other services will not start until their swagger classes exist.
  • After regenerating the model files, you'll need to test and update the server implementations to resolve any broken dependencies on old API definitions or implement additional functionality to match the new specs.
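In practice, the regeneration flow after editing api/jobs.yaml is just the two commands already documented above, run back to back:

    docker-compose up rebuild-swagger   # regenerate the swagger classes
    docker-compose up --build           # rebuild the services against the new classes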

Job Manager UI Server

For UI server documentation, see ui.

Job Manager dsub Server

For dsub server documentation, see servers/dsub.

Job Manager cromwell Server

For cromwell server documentation, see servers/cromwell.

Build docker images and releases

How to build

Starting with v0.2.0, Job Manager releases stock Docker images on DockerHub.

  • Set the Docker tag first in bash, e.g. TAG="v0.1.0"

  • To build the job-manager-ui image with $TAG from the root of this GitHub repository:

    docker build -t job-manager-ui:$TAG . -f ui/Dockerfile
    
  • Cromwell: To build the job-manager-api-cromwell image with $TAG from the root of this GitHub repository:

    docker build -t job-manager-api-cromwell:$TAG . -f servers/cromwell/Dockerfile
    
  • dsub: To build the job-manager-api-dsub image with $TAG from the root of this GitHub repository:

    docker build -t job-manager-api-dsub:$TAG . -f servers/dsub/Dockerfile
    

Add a GitHub release pointing to the DockerHub images

Starting with v0.2.0, each GitHub release also publishes three corresponding Docker images on DockerHub.

Longer term, we plan to set up a Docker build hook so the release process can be more automated.
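Until that automation exists, a manual push along these lines works; the DockerHub organization below is a placeholder, not necessarily the one this project publishes under:

    # Assumes the images were built and tagged as shown above and that you are
    # logged in to DockerHub with push access (organization name is hypothetical)
    docker tag job-manager-ui:$TAG example-org/job-manager-ui:$TAG
    docker push example-org/job-manager-ui:$TAG
    # Repeat for job-manager-api-cromwell and job-manager-api-dsub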
