Title: Bidirectional Transformer Language Models for Smart Autocompletion of Source Code
Authors: Binder, Felix; Villmow, Johannes; Ulges, Adrian
Editors: Reussner, Ralf H.; Koziolek, Anne; Heinrich, Robert
Date issued: 2021; Date available: 2021-01-27
ISBN: 978-3-88579-701-2
ISSN: 1617-5468
DOI: 10.18420/inf2020_83
URI: https://dl.gi.de/handle/20.500.12116/34796
Language: en
Keywords: smart autocompletion; deep learning; transformer networks

Abstract: This paper investigates the use of transformer networks – which have recently become ubiquitous in natural language processing – for smart autocompletion of source code. Our model, JavaBERT, is based on a RoBERTa network, which we pretrain on 250 million lines of code and then adapt for method ranking, i.e. ranking an object's methods based on the code context. We suggest two alternative approaches, namely unsupervised probabilistic reasoning and supervised fine-tuning. The supervised variant proves more accurate, with a top-3 accuracy of up to 98%. We also show that the model – though trained on method calls' full contexts – is quite robust to reductions of that context.
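
The unsupervised variant can be illustrated with a short sketch. The code below is not the authors' implementation: the checkpoint name (roberta-base stands in for JavaBERT, which is not assumed to be publicly available), the [METHOD] placeholder convention, the rank_methods helper, and the candidate list are all assumptions made for illustration. It shows the general idea of ranking an object's candidate methods by the masked-language-model probability a RoBERTa-style model assigns to each method name at the call site.

import torch
from transformers import RobertaTokenizer, RobertaForMaskedLM

MODEL = "roberta-base"  # placeholder; JavaBERT in the paper is pretrained on Java code
tokenizer = RobertaTokenizer.from_pretrained(MODEL)
model = RobertaForMaskedLM.from_pretrained(MODEL).eval()

def rank_methods(context, candidates):
    """Rank candidate method names for the [METHOD] slot in `context`
    by the masked-LM log-probability of their first sub-token."""
    text = context.replace("[METHOD]", tokenizer.mask_token)
    inputs = tokenizer(text, return_tensors="pt")
    # Position of the <mask> token in the encoded sequence.
    mask_pos = (inputs.input_ids[0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0][0]
    with torch.no_grad():
        logits = model(**inputs).logits[0, mask_pos]
    log_probs = logits.log_softmax(dim=-1)
    scores = []
    for name in candidates:
        # Single-token approximation: score only the first sub-token of the name.
        tok_id = tokenizer(" " + name, add_special_tokens=False).input_ids[0]
        scores.append((name, log_probs[tok_id].item()))
    return sorted(scores, key=lambda s: s[1], reverse=True)

# Hypothetical usage: rank a StringBuilder's methods at a call site.
print(rank_methods("StringBuilder sb = new StringBuilder(); sb.[METHOD]();",
                   ["append", "reverse", "charAt"]))

The supervised variant described in the abstract would instead fine-tune the pretrained encoder with a classification or ranking head over the candidate methods; the sketch above only covers the probabilistic, fine-tuning-free reading.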