2023
DOI: 10.48550/arxiv.2303.03103
Preprint

Towards Zero-Shot Functional Compositionality of Language Models

Abstract: Large pre-trained language models (PLMs) have become the most desirable starting point in the field of NLP, as they have become remarkably good at solving many individual tasks. Despite such success, in this paper we argue that current paradigms of working with PLMs neglect a critical aspect of modeling human intelligence: functional compositionality. Functional compositionality, the ability to compose learned tasks, has been a long-standing challenge in the field of AI (and many other fields), as it is …
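To make the notion of functional compositionality concrete, below is a minimal illustrative sketch (not the paper's evaluation protocol): it probes whether an instruction-tuned PLM can handle two tasks it performs well individually (translation and summarization) when they are composed in a single zero-shot query. The model choice, prompts, and example text are all assumptions made for illustration.

```python
# Illustrative sketch only: zero-shot composition of two learned tasks
# via instruction prompting. Model and prompts are assumptions, not
# taken from the paper.
from transformers import pipeline

generate = pipeline("text2text-generation", model="google/flan-t5-small")

text = "Der schnelle braune Fuchs springt ueber den faulen Hund."

# Task A and Task B, which the model is assumed to solve individually.
task_a = f"Translate German to English: {text}"
task_b = "Summarize: The quick brown fox jumps over the lazy dog."

# Functional composition: both tasks chained in one zero-shot query.
composed = f"Translate German to English, then summarize the result: {text}"

for prompt in (task_a, task_b, composed):
    print(generate(prompt, max_new_tokens=64)[0]["generated_text"])
```

Whether the composed prompt succeeds where the individual prompts do is exactly the kind of gap the abstract points to.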

Cited by 0 publications
References 43 publications (63 reference statements)