This repository has moved. Please use the new link: https://github.com/westlake-repl/Recommendation-Systems-without-Explicit-ID-Features-A-Literature-Review
Papers on universal (lifelong) user representation learning for item recommendation. The content below draws on the Zhihu article https://zhuanlan.zhihu.com/p/437671278 (推荐系统通用用户表征预训练与迁移学习研究进展, "Research progress on pre-training and transfer learning of universal user representations for recommender systems").
Four large-scale recommendation datasets for evaluating cross-domain or transferable recommendation models:
(1) PixelRec: https://github.com/westlake-repl/PixelRec
(2) NineRec: https://github.com/westlake-repl/NineRec
(3) MicroLens: https://github.com/westlake-repl/MicroLens
(4) Tenrec: https://github.com/yuangh-x/2022-NIPS-Tenrec
Our research papers that apply pre-training and transfer learning to learn universal user representations for recommender systems:
1 Parameter-Efficient Transfer from Sequential Behaviors for User Modeling and Recommendation SIGIR2020 https://arxiv.org/abs/2001.04253
Code and datasets on GitHub: https://github.com/fajieyuan/SIGIR2020_peterrec
Keywords: self-supervised learning, user sequential behaviors, pretraining, transfer learning, universal user representation, user profile prediction, cold-start problem
(1) We are the first to show that self-supervised pre-training on user behavior sequences benefits many downstream tasks.
(2) We are also the first to use user profile prediction to assess how universal (generic) a user representation is.
(3) We release a large-scale public dataset and source code for user representation transfer learning. A minimal code sketch of the pretrain-then-finetune workflow is given below.
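To make the pretrain-then-transfer idea concrete, here is a minimal, hedged PyTorch sketch: a toy behavior-sequence encoder is pre-trained with next-item prediction, then frozen, and only a small residual adapter (a stand-in for the paper's "model patch" modules) plus a new head are tuned for a downstream task such as user profile prediction. All module names, sizes, and data here are illustrative placeholders, not the paper's exact architecture.

```python
# Illustrative sketch only (not the paper's exact model): pre-train a small
# sequential encoder on next-item prediction, then freeze it and fine-tune
# only a tiny adapter plus a new head for a downstream task.
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Small residual bottleneck module; the only backbone part tuned downstream."""
    def __init__(self, dim, bottleneck=16):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)
        self.up = nn.Linear(bottleneck, dim)

    def forward(self, x):
        return x + self.up(torch.relu(self.down(x)))

class SeqEncoder(nn.Module):
    """Toy behavior-sequence encoder: item embeddings + GRU."""
    def __init__(self, n_items, dim=64):
        super().__init__()
        self.emb = nn.Embedding(n_items, dim)
        self.rnn = nn.GRU(dim, dim, batch_first=True)
        self.adapter = Adapter(dim)
        self.next_item = nn.Linear(dim, n_items)  # pre-training head

    def forward(self, seq):
        h, _ = self.rnn(self.emb(seq))
        return self.adapter(h[:, -1])             # user representation

n_items, n_labels = 1000, 5
model = SeqEncoder(n_items)

# --- Stage 1: self-supervised pre-training (next-item prediction) ---
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
seqs = torch.randint(0, n_items, (32, 20))        # fake behavior sequences
logits = model.next_item(model(seqs[:, :-1]))
loss = nn.functional.cross_entropy(logits, seqs[:, -1])
loss.backward(); opt.step()

# --- Stage 2: parameter-efficient transfer to a downstream task ---
for p in model.parameters():
    p.requires_grad = False                        # freeze the backbone
for p in model.adapter.parameters():
    p.requires_grad = True                         # ...except the adapter
head = nn.Linear(64, n_labels)                     # e.g. user-profile classes
opt_ft = torch.optim.Adam(list(model.adapter.parameters()) + list(head.parameters()), lr=1e-3)
labels = torch.randint(0, n_labels, (32,))
loss_ft = nn.functional.cross_entropy(head(model(seqs)), labels)
loss_ft.backward(); opt_ft.step()
```

The point of the adapter is that the number of task-specific parameters stays tiny relative to the frozen backbone, which is what makes the transfer parameter-efficient.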
2 One Person, One Model, One World: Learning Continual User Representation without Forgetting SIGIR2021 https://arxiv.org/abs/2009.13724
Code and datasets on GitHub: https://github.com/fajieyuan/SIGIR2021_Conure
Keywords: self-supervised learning, lifelong learning, pre-training, transfer learning, fine-tuning, general-purpose user representation, user profile prediction, cold-start recommendation
(1) We are the first to propose a lifelong (continual) learning mechanism for universal user representations in recommender systems.
(2) We are the first to clearly demonstrate the catastrophic forgetting and over-parameterization issues in recommender systems.
(3) We release a dataset and source code for lifelong user representation learning. A generic code sketch of learning several tasks with one model without forgetting is given below.
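The sketch below illustrates, in generic form, one way to learn several tasks with a single model without forgetting: after each task, the most important still-free weights are frozen so later tasks cannot overwrite them, while the remaining capacity is reused. This only illustrates the parameter-isolation idea; it is not the paper's exact pruning-and-retraining procedure, and the tasks and shapes are toy placeholders.

```python
# Generic parameter-isolation sketch of "one model, many tasks" continual
# learning: weights reserved by earlier tasks are never updated again.
import torch
import torch.nn as nn

torch.manual_seed(0)
layer = nn.Linear(64, 64)                                  # one shared layer of a user encoder
frozen = torch.zeros_like(layer.weight, dtype=torch.bool)  # weights reserved by earlier tasks

def freeze_top_fraction(weight, frozen, frac=0.5):
    """Reserve the largest-magnitude still-free weights for the task just learned."""
    rows, cols = (~frozen).nonzero(as_tuple=True)
    free_vals = weight.detach().abs()[rows, cols]
    k = max(1, int(frac * free_vals.numel()))
    top = torch.topk(free_vals, k).indices
    frozen[rows[top], cols[top]] = True
    return frozen

def train_task(loss_fn, steps=200):
    opt = torch.optim.SGD(layer.parameters(), lr=0.05)
    for _ in range(steps):
        opt.zero_grad()
        loss_fn(layer).backward()
        layer.weight.grad[frozen] = 0.0    # never update weights owned by earlier tasks
        opt.step()

x = torch.randn(128, 64)
task1 = lambda m: ((m(x) - 1.0) ** 2).mean()   # toy stand-ins for two downstream tasks
task2 = lambda m: ((m(x) + 1.0) ** 2).mean()

train_task(task1)
frozen = freeze_top_fraction(layer.weight, frozen, frac=0.5)
train_task(task2)                               # task 1's reserved weights stay intact
print("fraction of weights reserved by task 1:", frozen.float().mean().item())
```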
3 Learning Transferable User Representations with Sequential Behaviors via Contrastive Pre-training ICDM2021 https://fajieyuan.github.io/papers/ICDM2021.pdf
Keywords: contrastive learning, self-supervised learning, transfer learning, pre-training, fine-tuning, general-purpose user representation, user profile prediction, cold-start problem
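For readers unfamiliar with contrastive pre-training of user representations, here is a minimal sketch: two randomly masked views of each user's behavior sequence are encoded, and an InfoNCE loss pulls the two views of the same user together while pushing other users in the batch apart. The augmentation, encoder, and hyperparameters are generic placeholders, not the exact design of the ICDM 2021 paper.

```python
# Minimal contrastive pre-training sketch (InfoNCE over two masked views
# of each user's behavior sequence); all components are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class UserEncoder(nn.Module):
    def __init__(self, n_items, dim=64):
        super().__init__()
        self.emb = nn.Embedding(n_items + 1, dim, padding_idx=0)  # 0 = mask/pad
        self.rnn = nn.GRU(dim, dim, batch_first=True)

    def forward(self, seq):
        h, _ = self.rnn(self.emb(seq))
        return F.normalize(h[:, -1], dim=-1)        # unit-norm user vector

def mask_view(seq, p=0.3):
    """Augmentation: randomly mask items in the behavior sequence."""
    keep = (torch.rand_like(seq, dtype=torch.float) > p).long()
    return seq * keep                                # masked positions become 0

def info_nce(z1, z2, temperature=0.1):
    """Positive pair = two views of the same user; negatives = other users in batch."""
    logits = z1 @ z2.t() / temperature               # (batch, batch) similarities
    targets = torch.arange(z1.size(0))
    return F.cross_entropy(logits, targets)

n_items = 1000
enc = UserEncoder(n_items)
opt = torch.optim.Adam(enc.parameters(), lr=1e-3)

seqs = torch.randint(1, n_items + 1, (32, 20))       # fake behavior sequences
loss = info_nce(enc(mask_view(seqs)), enc(mask_view(seqs)))
loss.backward(); opt.step()
```

The pre-trained encoder can then be fine-tuned for downstream tasks in the same pretrain-then-transfer spirit as the papers above.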
4 User-specific Adaptive Fine-tuning for Cross-domain Recommendations TKDE2021 https://arxiv.org/pdf/2106.07864.pdf
Keywords: adaptive fine-tuning, pre-training, cold-start problem, cross-domain recommendation, general-purpose user representation
5 TransRec: Learning Transferable Recommendation from Mixture-of-Modality Feedback https://arxiv.org/pdf/2206.06190.pdf
Keywords: foundation recommendation models, pre-training, transfer learning, mixture-of-modality, content-based recommendation
Pre-training datasets: https://github.com/fajieyuan/recommendation_dataset_pretraining
Related papers from other groups:
1 Scaling Law for Recommendation Models: Towards General-purpose User Representations NAVER CLOVA 2021
2 One4all User Representation for Recommender Systems in E-commerce NAVER CLOVA 2021
3 Knowledge Transfer via Pre-training for Recommendation Tsinghua University Frontiers in Big Data 2021
4 Self-supervised Learning for Large-scale Item Recommendations Google 2021
5 UserBERT: Self-supervised User Representation Learning (rejected by ICLR 2021)
6 UPRec: User-Aware Pre-training for Recommender Systems AAAI2021
7 Personalized Transfer of User Preferences for Cross-domain Recommendation WSDM2022
8 Perceive Your Users in Depth: Learning Universal User Representations from Multiple E-commerce Tasks Alibaba KDD2018
9 Pre-Training Graph Neural Networks for Cold-Start Users and Items Representation
10 Recommendation as Language Processing (RLP): A Unified Pretrain, Personalized Prompt & Predict Paradigm (P5)