
CS692 Seminar: Systems for Machine Learning, Machine Learning for Systems

Course website: https://guanh01.github.io/teaching/2020-fall-mlsys

This is the (evolving) reading list for the seminar. The papers are drawn from top ML venues (ICML, ICLR, etc.) and systems venues (ASPLOS, PLDI, etc.). The selection criterion is whether certain keywords appear in the paper title.

Topics of interest include, but are not limited to (copied from MLSys website):

  • Efficient model training, inference, and serving
  • Distributed and parallel learning algorithms
  • Privacy and security for ML applications
  • Testing, debugging, and monitoring of ML applications
  • Fairness, interpretability and explainability for ML applications
  • Data preparation, feature selection, and feature extraction
  • ML programming models and abstractions
  • Programming languages for machine learning
  • Visualization of data, models, and predictions
  • Specialized hardware for machine learning
  • Hardware-efficient ML methods
  • Machine Learning for Systems

Table of Contents

Systems for Machine Learning

Distributed and Parallel Learning

Efficient Training

DNN Training

GNN Training

Neural Architecture Search

Continual Learning

Efficient Inference

Compiler

Resource Management

Compression

Pruning

Quantization

Model Serving

Testing and Debugging

  • [MLSys'20] Model Assertions for Monitoring and Improving ML Models The paper monitors and improves ML models using model assertions at all stages of ML system deployment, including runtime monitoring and label validation. At runtime, model assertions can surface high-confidence errors. For training, the authors propose a bandit-based active learning algorithm that samples from data flagged by assertions to reduce labeling cost.
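As a concrete illustration of the idea, the sketch below shows what a model assertion might look like for a video object detector: a boolean predicate over model outputs that flags likely errors at runtime. All names and the flickering heuristic here are illustrative assumptions, not the paper's actual API.

```python
# Hypothetical sketch of a "model assertion": a predicate over model
# outputs that flags likely high-confidence errors at runtime.

def flickering_assertion(predictions, window=3):
    """Flag frames where a detector's label for the same track flickers
    on and off within a short window (present, absent, present again),
    a classic error pattern for video models."""
    flagged = []
    for i in range(len(predictions) - window + 1):
        w = predictions[i:i + window]
        # Present -> absent -> present within the window => flicker.
        if w[0] is not None and w[1] is None and w[2] is not None:
            flagged.append(i + 1)  # index of the suspicious (absent) frame
    return flagged

# Per-frame detections for one track; None means "not detected".
frames = ["car", None, "car", "car", None, None, "car"]
print(flickering_assertion(frames))  # -> [1]
```

Frames flagged this way could then be routed to labeling, which is where the paper's bandit-based active learning comes in.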

Robustness

Other Metrics (Interpretability, Privacy, etc.)

Data Preparation

ML programming models

Machine Learning for Systems

ML for ML Systems

ML for Compilers

ML for Programming Languages

  • [PLDI'20] Learning Nonlinear Loop Invariants with Gated Continuous Logic Networks The paper proposes a new neural architecture, the Gated Continuous Logic Network (G-CLN), to learn nonlinear loop invariants. It is an example of using DNNs to solve a program-analysis problem.
  • [PLDI'20] Blended, Precise Semantic Program Embeddings This paper uses DNNs to learn program embeddings: vector representations of program semantics. Existing approaches predominantly learn to embed programs from their source code and, as a result, do not capture deep, precise program semantics; models learned from runtime information, on the other hand, depend critically on the quality of program executions, leading to trained models of highly variable quality. Their system, LiGer, instead learns program representations from a blend of symbolic and concrete execution traces.
  • [ICLR'18]Learning to Represent Programs with Graphs
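To make the loop-invariant setting above concrete, here is a minimal sketch of the data side of the problem: collect variable valuations from executions of a toy loop, then check whether a candidate nonlinear invariant holds on every sampled state. All names are invented for illustration; G-CLN itself learns such invariants with a neural architecture rather than checking hand-written candidates.

```python
# Hypothetical sketch: sample loop states, then test a candidate
# nonlinear invariant against them.

def run_square_loop(n):
    """Toy loop computing n^2 by summing odd numbers; records the
    variable valuation (i, s) at each loop head and on exit."""
    states = []
    i, s = 0, 0
    while i < n:
        states.append((i, s))
        s += 2 * i + 1
        i += 1
    states.append((i, s))
    return states

def check_invariant(states, inv):
    """True iff the candidate invariant holds on every sampled state."""
    return all(inv(i, s) for i, s in states)

# Pool states from several runs with different loop bounds.
traces = [st for n in range(1, 6) for st in run_square_loop(n)]
print(check_invariant(traces, lambda i, s: s == i * i))  # -> True
```

A learned invariant like s == i * i is nonlinear in the loop variables, which is exactly the class of invariants that linear template-based tools miss and G-CLN targets.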

ML for Memory Management

General Reports

Other Resources