The latest version of this book lives at https://junchaoiu.github.io/NLP-Learning-Notes — you are welcome to visit!
NLP notes: introductory concepts, fundamentals, research methods, and readings of top-conference papers.
Author's research focus:
- Machine Translation
- Knowledge Graphs
Research, for me, is a lonely road.
Along the way I will meet many, many people.
I hope I will have the strength to pull you up,
and that you will be willing to walk a stretch of it with me.
- Knowledge Graph Techniques
- Machine Translation

[Unsupervised Machine Translation]
- Word Translation Without Parallel Data (Alexis Conneau, ICLR, 2018)
- Unsupervised Machine Translation Using Monolingual Corpora Only (Guillaume Lample, ICLR, 2018)
- Phrase-Based & Neural Unsupervised Machine Translation (Guillaume Lample, EMNLP, 2018)
- Adapting High-resource NMT Models to Translate Low-resource Related Languages without Parallel Data (Wei-Jen Ko, ACL, 2021)
- A Retrieve-and-Rewrite Initialization Method for Unsupervised Machine Translation (Shuo Ren, ACL, 2020)
- Multilingual Unsupervised Neural Machine Translation with Denoising Adapters (Ahmet Üstün, EMNLP, 2021)
- Reusing a Pretrained Language Model on Languages with Limited Corpora for Unsupervised NMT (Alexandra, EMNLP, 2020)
- Cross-lingual Language Model Pretraining (Guillaume Lample, NeurIPS, 2019)
[Low-resource Machine Translation]
- Low-resource Neural Machine Translation with Cross-modal Alignment (Zhe Yang, EMNLP, 2022)
- SMaLL-100: Introducing Shallow Multilingual Machine Translation Model for Low-Resource Languages (Alireza Mohammadshahi, EMNLP, 2022)
- ConsistTL: Modeling Consistency in Transfer Learning for Low-Resource Neural Machine Translation (Zhaocong Li, EMNLP, 2022)
[Uncategorized]
- Model Compression
Given the author's limited expertise, these notes inevitably contain typos and even conceptual errors. Corrections are very welcome; please raise them as an issue.
If you run into any problems, feel free to email: wujunchaoIU@outlook.com