Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, 2021
DOI: 10.18653/v1/2021.acl-long.285
MPC-BERT: A Pre-Trained Language Model for Multi-Party Conversation Understanding

Abstract: Recently, various neural models for multi-party conversation (MPC) have achieved impressive improvements on a variety of tasks such as addressee recognition, speaker identification and response prediction. However, these existing methods on MPC usually represent interlocutors and utterances individually and ignore the inherent complicated structure in MPC which may provide crucial interlocutor and utterance semantics and would enhance the conversation understanding process. To this end, we present MPC-BERT, a p…

Cited by 21 publications (28 citation statements). References 15 publications.
“…Wang et al. (2020b) proposed to track the dynamic topic in a conversation. Gu et al. (2021) proposed jointly learning "who says what to whom" in a unified framework by designing self-supervised tasks during pre-training. On the other hand, explored generation-based approaches by proposing a graph-structured network, the core of which was an utterance-level graph-structured encoder.…”
Section: Related Work
Confidence: 99%
“…At first, researchers mostly focused on dialogues between two participants (Shang et al., 2015; Serban et al., 2016; Wen et al., 2017; Young et al., 2018). Recently, researchers have paid more attention to a more practical and challenging scenario involving more than two participants, which is well known as multi-party conversations (MPCs) (Ouchi and Tsuboi, 2016; Zhang et al., 2018; Le et al., 2019; Wang et al., 2020b; Gu et al., 2021). Utterances in a two-party conversation are posted one by one between two interlocutors, constituting a sequential information flow.…”
Section: Introduction
Confidence: 99%
“…In a multi-party chat stream (Traum, 2004; Uthus and Aha, 2013; Ouchi and Tsuboi, 2016; Gu et al., 2021), messages related to different topics are entangled with each other, which makes it difficult for a new user to understand the context of the discussion in the chat room. Dialogue disentanglement (Kummerfeld et al., 2019; Gu et al., 2020b; Yu and Joty, 2020; Liu et al., 2021a,b) aims at disentangling a whole conversation into several threads from a data stream so that each thread is about a specific topic.…”
Section: Introduction
Confidence: 99%
“…Recently, semantic representations from pre-trained language models have achieved remarkable success on a spectrum of dialogue tasks (Wen et al., 2015; Gu et al., 2021; Zeng et al., 2021; Zhang and Zhao, 2021; Cui et al., 2021), where knowledge learned in pre-training over large-scale dialogue corpora can be transferred to downstream applications. Current pre-training techniques typically focus on the surface text.…”
Section: Introduction
Confidence: 99%