
Tuning Language Models by Proxy

This repository contains code for the paper Tuning Language Models by Proxy (2024). If you have any questions, please feel free to open a GitHub issue or reach out to the first author at alisaliu@cs.washington.edu.
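For orientation, the paper's core idea is to steer a large pretrained model at decoding time by adding the logit offset between a small tuned "expert" and its untuned "anti-expert" counterpart to the base model's logits. The sketch below illustrates that decoding rule only; it is not the code in this repository, and the Llama-2 checkpoint names are placeholders chosen because the base and proxy models must share a vocabulary.

# Minimal sketch of the proxy-tuning decoding rule from the paper (greedy decoding).
# Checkpoint names are illustrative, not the repository's exact configuration.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

base = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-13b-hf")
expert = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-chat-hf")
anti_expert = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-13b-hf")

@torch.no_grad()
def proxy_tuned_generate(prompt: str, max_new_tokens: int = 64) -> str:
    input_ids = tokenizer(prompt, return_tensors="pt").input_ids
    for _ in range(max_new_tokens):
        # Next-token logits from the base model and the two proxies.
        base_logits = base(input_ids).logits[:, -1, :]
        expert_logits = expert(input_ids).logits[:, -1, :]
        anti_logits = anti_expert(input_ids).logits[:, -1, :]
        # Proxy-tuned distribution: softmax(base + (expert - anti-expert)).
        steered = base_logits + (expert_logits - anti_logits)
        next_token = torch.softmax(steered, dim=-1).argmax(dim=-1, keepdim=True)
        input_ids = torch.cat([input_ids, next_token], dim=-1)
        if next_token.item() == tokenizer.eos_token_id:
            break
    return tokenizer.decode(input_ids[0], skip_special_tokens=True)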

Evaluation

You can download our evaluation data as a zip file. Our evaluation setup is largely borrowed from Tülu 2 (codebase at https://github.com/allenai/open-instruct), with slight modifications. For examples of how the evaluation scripts are run, see scripts/eval.

Citation

@misc{liu-etal-2024-tuning,
  title={Tuning Language Models by Proxy}, 
  author={Alisa Liu and Xiaochuang Han and Yizhong Wang and Yulia Tsvetkov and Yejin Choi and Noah A. Smith},
  year={2024},
  eprint={2401.08565},
  archivePrefix={arXiv},
  url={https://arxiv.org/abs/2401.08565}
}
