Findings of the Association for Computational Linguistics: EMNLP 2020
DOI: 10.18653/v1/2020.findings-emnlp.174

PTUM: Pre-training User Model from Unlabeled User Behaviors via Self-supervision

Abstract: User modeling is critical for many personalized web services. Many existing methods model users based on their behaviors and the labeled data of target tasks. However, these methods cannot exploit useful information in unlabeled user behavior data, and their performance may not be optimal when labeled data is scarce. Motivated by pre-trained language models which are pre-trained on large-scale unlabeled corpus to empower many downstream tasks, in this paper we propose to pretrain user models from large-scale u…

Cited by 21 publications (19 citation statements) | References 15 publications

“…Personalized news recommendation is an important research problem and has been widely studied in recent years (Konstan et al., 1997; Wang and Blei, 2011; Liu et al., 2010; Bansal et al., 2015; Wu et al., 2020b; Qi et al., 2021c; Wu et al., 2020d, 2021d; Wang et al., 2020; Ge et al., 2020; …).…”
Section: Personalized News Recommendation (mentioning)
confidence: 99%
“…In recent years, there have been a few works on pre-training user models using unlabeled user behavior data via self-supervision (Wu et al., 2020a; Yuan et al., 2020; Xie et al., 2020). For example, Wu et al. (2020a) proposed a PTUM approach that uses a masked behavior prediction task and a next-K behaviors prediction task to pre-train user models. Yuan et al. (2020) proposed a PeterRec approach that pre-trains user models with a masked behavior prediction task and an auto-regressive task that successively predicts user behaviors based on past ones.…”
Section: User Model Pre-training (mentioning)
confidence: 99%
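The two self-supervised pre-training tasks quoted above (masked behavior prediction and next-K behaviors prediction) can be illustrated with a minimal PyTorch-style sketch. This is an illustrative reconstruction, not the authors' released code: the Transformer behavior encoder, mean pooling into a user vector, the shared scoring of all K future behaviors against one user vector, and all dimensions are assumptions made for brevity.

# Minimal sketch of the two PTUM-style self-supervision tasks described in the
# statement above: (1) masked behavior prediction, (2) next-K behaviors
# prediction. Illustrative only; encoder, pooling, and shapes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class UserModel(nn.Module):
    """Encodes a user's behavior sequence into contextual behavior
    representations and a single user representation."""
    def __init__(self, num_behaviors=10000, dim=64, heads=4, layers=2):
        super().__init__()
        self.embed = nn.Embedding(num_behaviors, dim)
        layer = nn.TransformerEncoderLayer(dim, heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, layers)
        self.mask_token = nn.Parameter(torch.zeros(dim))

    def forward(self, behavior_ids, mask=None):
        x = self.embed(behavior_ids)                       # [B, L, D]
        if mask is not None:                               # hide masked behaviors
            x = torch.where(mask.unsqueeze(-1), self.mask_token.expand_as(x), x)
        h = self.encoder(x)                                # contextual behaviors
        return h, h.mean(dim=1)                            # per-behavior, user vector

def masked_behavior_prediction_loss(model, behavior_ids, mask_ratio=0.15):
    """Task 1: mask random behaviors and recover their ids from context."""
    mask = torch.rand(behavior_ids.shape) < mask_ratio
    h, _ = model(behavior_ids, mask=mask)
    logits = h @ model.embed.weight.T                      # score all behavior ids
    return F.cross_entropy(logits[mask], behavior_ids[mask])

def next_k_behaviors_prediction_loss(model, past_ids, future_ids):
    """Task 2: predict each of the next K behaviors from the user vector
    built on past behaviors (a simplified, non-positional variant)."""
    _, user_vec = model(past_ids)
    logits = user_vec @ model.embed.weight.T               # [B, num_behaviors]
    losses = [F.cross_entropy(logits, future_ids[:, k])
              for k in range(future_ids.size(1))]
    return torch.stack(losses).mean()

# Toy usage: 8 users, 20 past behaviors each, K = 3 future behaviors.
model = UserModel()
past = torch.randint(0, 10000, (8, 20))
future = torch.randint(0, 10000, (8, 3))
loss = (masked_behavior_prediction_loss(model, past)
        + next_k_behaviors_prediction_loss(model, past, future))
loss.backward()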
“…User interest modeling is a critical step for personalized news recommendation (Wu et al., 2021; Zheng et al., 2018; Wu et al., 2020c). Existing methods usually learn a single representation vector to model overall user interests from users' clicked news (Okura et al., 2017; Wu et al., 2020b; …).…”
Section: Introduction (mentioning)
confidence: 99%