2021 IEEE/ACM 18th International Conference on Mining Software Repositories (MSR)
DOI: 10.1109/msr52588.2021.00045

Fast and Memory-Efficient Neural Code Completion

Cited by 47 publications (17 citation statements)
References 24 publications
“…Svyatkovskiy et al [59] proposed a different perspective on neural code completion, shifting from a generative task to a learning-to-rank task. Their model is used to rerank the recommendations provided via static analysis, being cheaper in terms of memory footprint as compared to generative models.…”
Section: Related Work
confidence: 99%
“…Svyatkovskiy et al [69] proposed a different perspective on neural code completion, shifting from a generative task to a learning-to-rank task. Their model is used to rerank the recommendations provided via static analysis, being cheaper in terms of memory footprint than generative models.…”
Section: Code Completion Approaches
confidence: 99%
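The excerpt above describes the learning-to-rank formulation: static analysis supplies the set of valid completion candidates, and a lightweight model only reorders that list rather than generating tokens. The following is a minimal sketch of that idea, not the authors' implementation; all names and the toy scorer are illustrative assumptions.

```python
# Sketch of reranking statically-derived completion candidates
# (hypothetical names; the real model and features differ).
from typing import Callable, List, Tuple


def rerank_candidates(
    context_tokens: List[str],
    candidates: List[str],
    score: Callable[[List[str], str], float],
) -> List[Tuple[str, float]]:
    """Score each candidate produced by static analysis against the code
    context and return them ordered from most to least likely."""
    scored = [(cand, score(context_tokens, cand)) for cand in candidates]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)


if __name__ == "__main__":
    # Toy usage: a placeholder scorer stands in for the learned ranking model.
    context = ["reader", "=", "open", "(", "path", ")", ";", "reader", "."]
    static_candidates = ["read", "readline", "close", "seek"]
    toy_score = lambda ctx, cand: float(len(cand))  # placeholder, not a real model
    print(rerank_candidates(context, static_candidates, toy_score))
```

Because the model never has to produce tokens over the whole vocabulary, its memory footprint can stay small compared with a generative completion model, which is the trade-off the citing papers highlight.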
“…To evaluate quantitatively our model, we use two commonly used metrics in very similar tasks such as code completion [16,37,39], namely the Recall@k and the Mean Reciprocal Rank (MRR). We consider a recommendation as…”
Section: Effectiveness Metrics
confidence: 99%
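For reference, the two metrics named in the excerpt above, Recall@k and Mean Reciprocal Rank (MRR), are commonly computed as sketched below; this is a generic illustration with illustrative variable names, not code from the cited papers.

```python
# Minimal sketch of Recall@k and MRR for ranked completion recommendations.
from typing import List


def recall_at_k(ranked: List[str], expected: str, k: int) -> float:
    """1.0 if the expected completion appears in the top-k results, else 0.0."""
    return 1.0 if expected in ranked[:k] else 0.0


def reciprocal_rank(ranked: List[str], expected: str) -> float:
    """1 / rank of the expected completion, or 0.0 if it is absent."""
    try:
        return 1.0 / (ranked.index(expected) + 1)
    except ValueError:
        return 0.0


def mrr(all_ranked: List[List[str]], all_expected: List[str]) -> float:
    """Mean Reciprocal Rank averaged over all queries."""
    rr = [reciprocal_rank(r, e) for r, e in zip(all_ranked, all_expected)]
    return sum(rr) / len(rr) if rr else 0.0


if __name__ == "__main__":
    # Toy usage: expected items ranked 1st and 3rd -> MRR = (1 + 1/3) / 2.
    ranked_lists = [["close", "read", "seek"], ["read", "seek", "close"]]
    expected = ["close", "close"]
    print(recall_at_k(ranked_lists[1], expected[1], 2))  # 0.0
    print(mrr(ranked_lists, expected))                   # ~0.667
```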