Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence 2018
DOI: 10.24963/ijcai.2018/578

Code Completion with Neural Attention and Pointer Networks

Abstract: Intelligent code completion has become an essential research task to accelerate modern software development. To facilitate effective code completion for dynamically-typed programming languages, we apply neural language models by learning from large codebases, and develop a tailored attention mechanism for code completion. However, standard neural language models, even with an attention mechanism, cannot correctly predict out-of-vocabulary (OoV) words, which restricts code completion performance. In this paper, …
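As a concrete illustration of the pointer idea the abstract alludes to, the sketch below mixes a standard vocabulary distribution with a copy distribution over tokens already seen in the local context, so that an OoV identifier can still be predicted by copying. This is a minimal NumPy sketch, not the authors' implementation; the sizes, the dot-product scoring, and the sigmoid gate are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Illustrative sizes (assumptions, not taken from the paper).
V, H, T = 1000, 64, 12                # vocabulary size, hidden size, context length
rng = np.random.default_rng(0)

h_t = rng.normal(size=H)              # current RNN hidden state
memory = rng.normal(size=(T, H))      # hidden states of the T previous context tokens
context_ids = rng.integers(0, V, T)   # vocabulary ids of those context tokens

# 1) Standard LM head: distribution over the fixed vocabulary.
W_out = rng.normal(size=(V, H)) * 0.1
p_vocab = softmax(W_out @ h_t)

# 2) Pointer head: attention scores over context positions act as a copy distribution.
p_copy = softmax(memory @ h_t)

# 3) A gate (sigmoid of a learned projection) balances generating vs. copying.
w_gate = rng.normal(size=H) * 0.1
g = 1.0 / (1.0 + np.exp(-(w_gate @ h_t)))

# 4) Mix the two: copy mass is scattered onto the vocabulary ids seen in context.
#    An identifier outside the vocabulary can only gain probability via the copy path.
p_final = g * p_vocab
np.add.at(p_final, context_ids, (1.0 - g) * p_copy)

print(round(p_final.sum(), 6))        # 1.0: a valid probability distribution
```

The gate g decides how much probability mass goes to generating from the fixed vocabulary versus copying a context token; when an identifier falls outside the vocabulary, the copy path is the only way it can receive probability.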

Cited by 141 publications (168 citation statements). References 20 publications.

“…The attention mechanism allows a neural language model to retrieve and make use of pertinent information in all previous hidden states, improving network retention. The mathematical details of the attention mechanism are described in previous work [38]. For attention, we use external memory M for previous hidden states, which is denoted as…”
Section: Proposed LSTM-AttM Model Architecture
confidence: 99%
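As a rough sketch of the mechanism this statement describes (attention over an external memory M of previous hidden states), the snippet below scores each memory slot against the current hidden state and returns a context vector. The bilinear scoring and tensor sizes are assumptions for illustration, not the cited formulation.

```python
import numpy as np

def attend(h_t, M, W_a):
    """Attention over an external memory M of previous hidden states.

    h_t : (H,)   current hidden state
    M   : (T, H) previous hidden states kept in memory
    W_a : (H, H) scoring weights (a bilinear score; an illustrative choice)
    """
    scores = M @ (W_a @ h_t)          # one relevance score per memory slot
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()              # softmax: attention weights over the past
    return alpha @ M, alpha           # context vector and the weights

rng = np.random.default_rng(1)
H, T = 64, 10
ctx, alpha = attend(rng.normal(size=H),
                    rng.normal(size=(T, H)),
                    rng.normal(size=(H, H)) * 0.1)
print(ctx.shape, round(alpha.sum(), 6))   # (64,) 1.0
```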
“…[90]–[102]: paradigms of program synthesis. [84], [103]–[115]: code completion and suggestion.…”
Section: [49]
confidence: 99%
“…Recent trends feature a boom in applying deep neural networks (DNNs) instead of LMs. However, while many studies [4], [13], [17], [22] have adopted DNNs in their code suggestion systems, a study [9] by Hellendoorn and Devanbu showed that carefully adapting n-gram models to source code can yield better performance than deep-learning models.…”
Section: Related Work
confidence: 99%
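The n-gram adaptation this statement refers to exploits source-code locality. Below is a minimal sketch of that idea: a global bigram model interpolated with a per-file cache, so locally repeated identifiers get high probability even if rare globally. The bigram order, counts, and interpolation weight are illustrative assumptions, not Hellendoorn and Devanbu's exact cache model.

```python
from collections import Counter, defaultdict

class CachedBigramLM:
    """Global bigram model interpolated with a per-file cache (illustrative)."""

    def __init__(self, lam=0.5):
        self.lam = lam                              # weight of the global model
        self.global_counts = defaultdict(Counter)   # corpus-level bigram counts
        self.cache_counts = defaultdict(Counter)    # counts from the file being edited

    def train_global(self, tokens):
        for a, b in zip(tokens, tokens[1:]):
            self.global_counts[a][b] += 1

    def observe_local(self, a, b):
        self.cache_counts[a][b] += 1                # updated as the user edits

    def prob(self, a, b):
        def mle(counts):
            total = sum(counts[a].values())
            return counts[a][b] / total if total else 0.0
        return self.lam * mle(self.global_counts) + (1 - self.lam) * mle(self.cache_counts)

lm = CachedBigramLM()
lm.train_global("for i in range ( n ) :".split())
lm.observe_local("my_var", "=")       # a local identifier never seen in the corpus
print(lm.prob("my_var", "="))         # 0.5: the cache supplies the local evidence
```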
“…Existing solutions to code completion focus on using neural language models [4], [9], [13], [17], [22] or statistical language models [3], [16], [18], [19] learned from a large code base, framing completion as a natural language processing problem. However, they fail to utilize the user's input to narrow down the candidate list through proper relevance ranking.…”
Section: Introduction
confidence: 99%
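To make "utilizing the user's input to narrow down the candidate list" concrete, the snippet below filters language-model candidates by the typed prefix and re-ranks the survivors by model probability. The candidate tokens and scores are invented for illustration.

```python
def rank_candidates(prefix, scored_candidates, k=3):
    """Filter completion candidates by the typed prefix, then rank by model score.

    scored_candidates maps token -> language-model probability; both the tokens
    and the scores here are invented for illustration.
    """
    matches = [(tok, p) for tok, p in scored_candidates.items()
               if tok.startswith(prefix)]
    return sorted(matches, key=lambda tp: tp[1], reverse=True)[:k]

scores = {"range": 0.31, "randint": 0.12, "raise": 0.28, "return": 0.25}
print(rank_candidates("ra", scores))
# [('range', 0.31), ('raise', 0.28), ('randint', 0.12)]
```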