Clusterjob, hereafter CJ, is an experiment management system (EMS) for data science. CJ is written mainly in Perl and allows submitting computational jobs to clusters in a hassle-free and reproducible manner. CJ produces 'reproducible' computational packages for academic publications at no cost. The CJ project was started in 2013 at Stanford University by Hatef Monajemi and his PhD advisor, David L. Donoho, with the goal of encouraging a more efficient and reproducible research paradigm. CJ is currently under active development. The current implementation allows submission of MATLAB, Python, and R jobs; support for R is partial and limited to serial jobs. Future versions are planned to support other data science programming languages such as Julia.
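As a quick illustration of the submission workflow, the sketch below shows how a MATLAB script might be run on a remote cluster with the cj command-line tool. The script name (myScript.m), the cluster alias (sherlock), and the exact flags shown are illustrative assumptions; consult http://clusterjob.org for the authoritative syntax.

```
# Submit a MATLAB script to a cluster configured under the alias 'sherlock';
# the -m flag attaches a reminder message to the package
# (names and flags here are illustrative; see clusterjob.org for exact usage)
cj run myScript.m sherlock -m "first reproducible run"

# Inspect and retrieve the results of a submitted package using the
# package ID (PID) that CJ reports at submission time
cj state <PID>
cj get <PID>
```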
You can read more about CJ at http://clusterjob.org
You can find the CJ book project at https://github.com/monajemi/CJ-book
- Hatef Monajemi
- Bekk Blando
- David Donoho
- Vardan Papyan
@misc{clusterjob,
author = {H.~Monajemi and D.~L.~Donoho},
title = {ClusterJob: An automated system for painless and reproducible massive computational experiments},
url = {https://github.com/monajemi/clusterjob},
month = {March},
year = {2015}
}
@inproceedings{MMCEP17,
author = {H.~Monajemi and D.~L.~Donoho and V.~Stodden},
title = {Making massive computational experiments painless},
booktitle = {2016 IEEE International Conference on Big Data (Big Data)},
month = {February},
year = {2017}
}
@article{Monajemi19,
author = {H.~Monajemi and R.~Murri and E.~Jonas and P.~Liang and V.~Stodden and D.~L.~Donoho},
title = {Ambitious data science can be painless},
note = {arXiv:1901.08705},
year = {2019}
}
Copyright 2015 Hatef Monajemi (monajemi@stanford.edu)