
A low-code interpretable machine learning toolbox in Python

PiML (or π·ML, /ˈpaɪ·ˈem·ˈel/) is a new Python toolbox for Interpretable Machine Learning model development and validation. Through low-code automation and high-code programming, PiML supports various machine learning models in the following two categories:

  • Inherently interpretable models:
  1. EBM: Explainable Boosting Machine (Nori, et al. 2019; Lou, et al. 2013)
  2. GAMI-Net: Generalized Additive Model with Structured Interactions (Yang, Zhang and Sudjianto, 2021)
  3. ReLU-DNN: Deep ReLU Networks using Aletheia Unwrapper (Sudjianto, et al. 2020)
  • Arbitrary black-box models, e.g.
  1. LightGBM or XGBoost of varying depth
  2. RandomForest of varying depth
  3. Residual Deep Neural Networks

Installation

Run the following script to download and install PiML v0.1.0 in Google Colab:

!pip install wget  # helper library used to fetch the release wheel
import wget
url = "https://github.com/SelfExplainML/PiML-Toolbox/releases/download/V0.1.0/PiML-0.1.0-cp37-cp37m-linux_x86_64.whl"
wget.download(url, 'PiML-0.1.0-cp37-cp37m-linux_x86_64.whl')  # download the pre-built wheel
!pip install PiML-0.1.0-cp37-cp37m-linux_x86_64.whl  # install PiML from the downloaded wheel
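
If the wheel installs successfully, a quick import check (a minimal sanity test, not part of the official instructions) confirms the package is available before starting an experiment:

import piml                  # should succeed once the wheel is installed
from piml import Experiment  # entry point used in the examples below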

Low-code Usage on Google Colab

Stage 1: Initialize an experiment, load and prepare data

from piml import Experiment
exp = Experiment(platform="colab")
exp.data_loader()

exp.data_summary()

exp.data_prepare()

exp.eda()

Stage 2: Train interpretable models

exp.model_train()

Stage 3: Explain and interpret

exp.model_explain()

exp.model_interpret() 

Stage 4: Diagnose and compare

exp.model_diagnose()

exp.model_compare()
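
For reference, the full low-code workflow simply chains the calls shown above; in practice each call is typically run in its own Colab cell so its panel renders separately:

from piml import Experiment

exp = Experiment(platform="colab")

# Stage 1: load and prepare data
exp.data_loader()
exp.data_summary()
exp.data_prepare()
exp.eda()

# Stage 2: train interpretable models
exp.model_train()

# Stage 3: explain and interpret
exp.model_explain()
exp.model_interpret()

# Stage 4: diagnose and compare
exp.model_diagnose()
exp.model_compare()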

Arbitrary Black-Box Modeling

For example, train a complex LightGBM with depth 7 and register it to the experiment:

from lightgbm import LGBMRegressor
pipeline = exp.make_pipeline(LGBMRegressor(max_depth=7))
pipeline.fit() 
exp.register(pipeline=pipeline, name='LGBM')

Then, compare it to inherently interpretable models (e.g. EBM and GAMI-Net):

exp.model_compare()
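
Other black-box models from the list above (e.g. XGBoost or RandomForest of varying depth) can be registered the same way. A minimal sketch, assuming xgboost is installed and following the same make_pipeline/fit/register pattern (the model name 'XGB-depth5' is illustrative):

from xgboost import XGBRegressor

# Build and register a depth-5 XGBoost model alongside 'LGBM'
pipeline_xgb = exp.make_pipeline(XGBRegressor(max_depth=5))
pipeline_xgb.fit()
exp.register(pipeline=pipeline_xgb, name='XGB-depth5')

# Re-run the comparison to include the newly registered model
exp.model_compare()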

Citations

"PiML: A Python Toolbox for Interpretable Machine Learning Model Development and Validation" (Sudjianto, Zhang*, Yang, Su, Zeng and Nair, 2022)

@article{sudjianto2022piml,
  title={PiML: A Python Toolbox for Interpretable Machine Learning Model Development and Validation},
  author={Sudjianto, Agus and Zhang, Aijun and Yang, Zebin and Su, Yu and Zeng, Ningzhou and Nair, Vijay},
  journal={To appear},
  year={2022}
}
