2023
DOI: 10.48550/arxiv.2302.02080
Preprint

Improving Prediction Backward-Compatibility in NLP Model Upgrade with Gated Fusion

Abstract: When upgrading neural models to a newer version, new errors that were not encountered in the legacy version can be introduced, known as regression errors. This inconsistent behavior during model upgrade often outweighs the benefits of accuracy gains and hinders the adoption of new models. To mitigate regression errors from model upgrade, distillation and ensembling have proven to be viable solutions without significant compromise in performance. Despite this progress, these approaches attained an incremental red…
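The gated fusion the title refers to can be understood as a learned convex combination of the legacy and upgraded models' logits: a gate decides, per example, how much to trust the new model versus falling back to the old one. A minimal sketch in pure Python, where the function names and the scalar gating signal are ours for illustration and not the paper's actual architecture:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gated_fusion(logits_old, logits_new, gate):
    """Convex combination of per-class logits.

    A gate near 0 reproduces the legacy model's prediction (preserving
    backward compatibility); a gate near 1 adopts the upgraded model's.
    """
    return [gate * new + (1.0 - gate) * old
            for old, new in zip(logits_old, logits_new)]

# Toy 3-class example; in practice the gate would be produced by a
# small learned network, not a fixed scalar as here.
logits_old = [2.0, 0.5, -1.0]   # legacy model output
logits_new = [0.1, 2.5, -0.5]   # upgraded model output
gate = sigmoid(1.0)             # stand-in for a learned gate value

fused = gated_fusion(logits_old, logits_new, gate)
```

At the extremes the fusion degenerates to one model or the other, which is what lets such a scheme trade accuracy gains against prediction consistency per input.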

Cited by 0 publications
References 36 publications