DarkSight is a dimension reduction technique for visualizing the predictions of any black-box classifier.
Please visit here for more information (the related paper, main results, example visualizations and demos).
03 July 2018: DarkSight is now compatible with PyTorch 0.4.0!
This repository contains a PyTorch implementation of DarkSight.
Download or clone this repository and put the darksight folder
in the same directory as the Python file that needs to import it.
- Python 2.7.12
- Python 3.x is not supported
- PyTorch 0.4.0
- PyTorch 0.2.x and 0.3.x are not supported
- NumPy 1.13.3
- Matplotlib 2.0.2
A demo is provided in this repository to illustrate the basic use of DarkSight, using the provided output of a LeNet with 98.2% accuracy trained on the MNIST dataset.
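For a quick impression before the API reference below, a minimal sketch of the workflow looks like this (the output file name and learning rates are placeholders, not values taken from the demo):

```python
import numpy as np
import darksight

# Load the classifier's output; "lenet_mnist_output.npy" is a placeholder name.
output = np.load("lenet_mnist_output.npy")

klg = darksight.Knowledge(output, T=1)   # wrap the classifier's output
ds = darksight.DarkSight(klg)            # build the DarkSight model

# lrs = [lr_cond, lr_y, lr_prior]; the values here are placeholders.
ds.train(num_epoch=1000, lrs=[1e-3, 1e-2, 1e-3])
```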
darksight.Knowledge(output, T=1)
is the class that wraps a classifier's output.
Parameters
output
: the classifier's output; should be a numpy.array
- It can be either the predictive probabilities or the logits before the softmax
Optional
T
: the temperature used when normalizing the provided output
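For example, wrapping raw logits with a softened temperature might look like the following (the logits are made up for illustration):

```python
import numpy as np
import darksight

# Made-up logits for 3 data points over 4 classes.
logits = np.array([[ 2.0, 0.5, -1.0, 0.1],
                   [ 0.3, 3.1, -0.2, 0.0],
                   [-0.5, 0.2,  1.8, 0.4]])

# A temperature T > 1 softens the distribution during normalization.
klg = darksight.Knowledge(logits, T=4)
```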
darksight.DarkSight(klg)
is the class for the DarkSight proposed in the paper.
It defines a Naive Bayes model on 2D and performs non-parametric dimension reduction and model compression jointly based on a symmetric KL divergence objective.
Parameters
klg
is the knowledge object created by darksight.Knowledge
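To make the model concrete, here is a minimal sketch of the kind of 2D naive Bayes predictive distribution DarkSight fits; the Gaussian conditionals and all names below are illustrative assumptions based on the description above, not this library's internals:

```python
import numpy as np

def student_predictive(y, mu, H, log_prior):
    """q(c | y) for a 2D naive Bayes model with Gaussian conditionals (assumed).

    y:         (2,)      low-dimensional embedding of one data point
    mu:        (C, 2)    per-class means
    H:         (C, 2, 2) per-class precision matrices
    log_prior: (C,)      log P(c)
    """
    diff = mu - y                                   # (C, 2)
    # log N(y; mu_c, H_c^{-1}) up to a constant shared across classes
    quad = -0.5 * np.einsum("ci,cij,cj->c", diff, H, diff)
    logdet = 0.5 * np.linalg.slogdet(H)[1]          # (C,)
    logits = log_prior + quad + logdet
    logits -= logits.max()                          # numerical stability
    q = np.exp(logits)
    return q / q.sum()
```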
darksight.DarkSight.train(num_epoch, lrs, batch_size=1000, verbose_skip=100, do_annealing=False, annealing_length=1000, highest_T=10, annealing_stepsize=100)
Parameters
num_epoch
: number of epochs for training
lrs
: learning rates for each component (the conditional distribution, the low-dimensional embedding and the prior distribution), in a list, i.e. [lr_cond, lr_y, lr_prior]
Optional
batch_size
: batch size for training
verbose_skip
: number of epochs between printed training logs
do_annealing
: whether to perform temperature annealing
annealing_length
: number of epochs over which annealing takes place
highest_T
: starting temperature of the annealing
annealing_stepsize
: step size (in epochs) for computing the new temperature
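A training call with annealing enabled might look like this (the epoch count and learning rates are placeholders, not recommended settings):

```python
ds.train(num_epoch=2000,
         lrs=[1e-3, 1e-2, 1e-3],   # [lr_cond, lr_y, lr_prior], placeholder values
         batch_size=1000,
         verbose_skip=100,
         do_annealing=True,
         annealing_length=1000,
         highest_T=10,
         annealing_stepsize=100)
```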
Helper function to plot the loss trace.
Helper function to plot the learnt low-dimensional embeddings as a scatter plot.
Optional
color_on
: use color in the scatter plot (otherwise monochrome)?
mu_on
: plot the mean of each cluster?
labels
: a Python list of strings for the labels used in the plot
contour_on
: plot contours based on P(y)?
- Note that generating the contours requires running the model, so the options below specify whether and how to use a GPU for this purpose
use_cuda
: use a GPU?
gpu_id
: which GPU to use?
contour_slices
: how finely the contours are computed
contour_num
: how many contour levels to draw
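Assuming this helper is exposed as a method on the DarkSight object, a call for MNIST might look like the sketch below; the method name plot is a hypothetical stand-in (this section omits the actual name), so check the demo for the real one:

```python
# NOTE: `plot` is a hypothetical method name; see the demo for the actual one.
ds.plot(color_on=True,
        mu_on=True,
        labels=[str(d) for d in range(10)],  # e.g. MNIST digit labels
        contour_on=True,
        use_cuda=True,   # contours require running the model, hence the GPU flags
        gpu_id=0)
```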
The learnt low-dimensional embedding.
The learnt mean of each conditional distribution.
The learnt precision matrix of each conditional distribution.
Parameters of the prior distribution.
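Once training finishes, these learnt quantities can be pulled out for custom analysis; the attribute names below are hypothetical stand-ins for the four quantities listed above:

```python
# All attribute names here are hypothetical; check the demo for the real ones.
embedding = ds.y    # learnt low-dimensional embedding
means = ds.mu       # learnt mean of each conditional distribution
precisions = ds.H   # learnt precision matrix of each conditional distribution
prior = ds.w        # parameters of the prior distribution
```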
Helper function to output the training results.
Parameters
output_file_path
: output file path
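Saving the results might then look like the one-liner below; the method name output is a guess based on the helper's description, and the file name is a placeholder:

```python
ds.output("darksight_results.csv")  # hypothetical method name and file name
```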
Kai Xu is the first author of the corresponding paper and the maintainer of this library; feel free to contact him about the paper or the library by email: kai.xu@ed.ac.uk.