Download: https://icml.cc/media/icml-2020/Slides/6645.pdf


Learning and Evaluating Contextual Embedding of Source Code



PRE-TRAINED CONTEXTUAL EMBEDDING OF SOURCE CODE

It is further shown that pre-trained contextual embeddings can be extremely powerful for source code: a contextual embedding of source code is derived by training a BERT model on source code.
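As an illustrative sketch of this workflow, the snippet below produces contextual embeddings for a small piece of Python code with the Hugging Face transformers library. It uses the publicly available CodeBERT checkpoint as a stand-in, since CuBERT's own checkpoints are distributed separately (via the google-research GitHub repository); the model name, the example function, and the mean-pooling step are illustrative assumptions, not the paper's exact setup.

import torch
from transformers import AutoModel, AutoTokenizer

# Stand-in checkpoint; not CuBERT itself.
MODEL_NAME = "microsoft/codebert-base"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.eval()

code = "def add(a, b):\n    return a + b"
inputs = tokenizer(code, return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = model(**inputs)

# Per-token contextual embeddings: shape (1, num_tokens, hidden_size).
token_embeddings = outputs.last_hidden_state
# One simple whole-snippet vector: mean-pool over the token dimension.
snippet_embedding = token_embeddings.mean(dim=1)
print(snippet_embedding.shape)  # e.g. torch.Size([1, 768])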




Q1: How do contextual embeddings compare against word embeddings? CuBERT outperforms BiLSTM models initialized with pre-trained source-code-specific word embeddings.
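A minimal sketch of the kind of baseline referred to here, assuming PyTorch: a BiLSTM classifier whose embedding layer is initialized from pre-trained, source-code-specific word vectors. The vocabulary size, dimensions, and the random placeholder vectors are illustrative, not values from the paper.

import torch
import torch.nn as nn

class BiLSTMClassifier(nn.Module):
    def __init__(self, pretrained_vectors, hidden_size, num_classes):
        super().__init__()
        # Embedding table initialized from pre-trained word vectors,
        # shape (vocab_size, embed_dim); set freeze=True to keep it fixed.
        self.embedding = nn.Embedding.from_pretrained(pretrained_vectors, freeze=False)
        self.bilstm = nn.LSTM(
            input_size=pretrained_vectors.size(1),
            hidden_size=hidden_size,
            batch_first=True,
            bidirectional=True,
        )
        self.classifier = nn.Linear(2 * hidden_size, num_classes)

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)   # (batch, seq, embed_dim)
        outputs, _ = self.bilstm(embedded)     # (batch, seq, 2 * hidden)
        pooled = outputs.mean(dim=1)           # simple mean pooling over time
        return self.classifier(pooled)         # (batch, num_classes)

# Toy usage with random vectors standing in for pre-trained embeddings.
vocab_size, embed_dim = 1000, 128
pretrained = torch.randn(vocab_size, embed_dim)
model = BiLSTMClassifier(pretrained, hidden_size=64, num_classes=2)
logits = model(torch.randint(0, vocab_size, (4, 32)))
print(logits.shape)  # torch.Size([4, 2])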




A significant advancement in natural-language understanding has come with the development of pre-trained contextual embeddings such as BERT.



What do pre-trained code models know about code?

25 Aug 2021  In order to determine whether the pre-trained vector embeddings of source-code transformer models reflect code understanding in terms of ...
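Analyses of this kind commonly probe frozen embeddings with a simple supervised classifier. The sketch below assumes scikit-learn, randomly generated placeholder embeddings, and a hypothetical binary code property; it only illustrates the probing setup, not any cited study's data or results.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 768))    # frozen snippet embeddings (placeholder)
y = rng.integers(0, 2, size=500)   # hypothetical binary code property

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
probe = LogisticRegression(max_iter=1000).fit(X_train, y_train)
# High probe accuracy would suggest the property is linearly recoverable
# from the embeddings; with random placeholder data it stays near chance.
print("probe accuracy:", probe.score(X_test, y_test))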



What Do They Capture? - A Structural Analysis of Pre-Trained Language Models for Source Code

14 Feb 2022  Recently many pre-trained language models for source code have ... is embedded in the linear-transformed contextual word embeddings.



Contextual Embeddings for Arabic-English Code-Switched Data

12 Dec 2020  Open-source trained bilingual contextual word embedding models of FLAIR ... (2017) implemented Arabic pre-trained word embedding models.



Multi-task Learning based Pre-trained Language Model for Code Completion

29 Dec 2020  [24] extended this idea to programming-language understanding tasks. They derived a contextual embedding of source code by training a BERT model ...