Kullback-Leibler projections for Bayesian model selection in Python.
Kulprit (pronounced kuːl.prɪt) is a package for variable selection for Bambi models. Kulprit is under active development, so use it with care. If you find any bugs or have feature requests, please open an issue.
Kulprit requires a working Python interpreter (3.9+). We recommend installing Python and key numerical libraries using the Anaconda Distribution, which has one-click installers available on all major platforms.
Assuming a standard Python environment is installed on your machine (including pip), Kulprit itself can be installed in one line using pip:
pip install kulprit
Alternatively, if you want the bleeding-edge version of the package, you can install it from GitHub:
pip install git+https://github.com/bambinos/kulprit.git
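As a quick orientation, the sketch below shows how a Kulprit workflow might look: fit a Bambi reference model, then pass it to Kulprit for the projection predictive search. The class name ProjectionPredictive, its project() method, and the simulated data are assumptions made for illustration and may differ between versions; consult the official docs for the current API.

import bambi as bmb
import kulprit as kpt
import numpy as np
import pandas as pd

# Simulate a small dataset: two informative predictors and one noise predictor
rng = np.random.default_rng(0)
data = pd.DataFrame({
    "x1": rng.normal(size=100),
    "x2": rng.normal(size=100),
    "x3": rng.normal(size=100),
})
data["y"] = 2 * data["x1"] - data["x2"] + rng.normal(size=100)

# Fit the reference model with Bambi; the pointwise log-likelihood is needed
# to evaluate the projected submodels
model = bmb.Model("y ~ x1 + x2 + x3", data)
idata = model.fit(idata_kwargs={"log_likelihood": True})

# Project the reference model onto candidate submodels and inspect the result
# (ProjectionPredictive and project() are assumed names; check the docs)
ppi = kpt.ProjectionPredictive(model, idata)
ppi.project()
print(ppi)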
The Kulprit documentation can be found in the official docs. If you are not familiar with the theory behind Kulprit, or need practical advice on how to use it or interpret its results, we recommend reading the paper Robust and efficient projection predictive inference. You may also find this guide on Cross-Validation and model selection useful.
Kulprit is a community project and welcomes contributions. Additional information and our development guide can be found in CONTRIBUTING.md.
For a list of contributors, see the GitHub contributor page.
If you use Kulprit and want to cite it, please use
@misc{mclatchie2023,
      title={Robust and efficient projection predictive inference},
      author={Yann McLatchie and Sölvi Rögnvaldsson and Frank Weber and Aki Vehtari},
      year={2023},
      eprint={2306.15581},
      archivePrefix={arXiv},
      primaryClass={stat.ME}
}
If you want to support Kulprit financially, you can make a donation to our sister project PyMC.
Kulprit wishes to maintain a positive community. Additional details can be found in the Code of Conduct.