RefBERT: A Two-Stage Pre-trained Framework for Automatic Rename Refactoring
Refactoring is an indispensable practice for improving the quality and maintainability of source code during software evolution. Rename refactoring, the most frequently performed kind of refactoring, suggests a new name for a poorly named identifier to enhance readability. However, most existing works only identify renaming activities between two versions of source code, and few address how to suggest a new name. In this paper, we study automatic rename refactoring of variable names, which is considered more challenging than other rename refactoring activities. We first point out the connections between rename refactoring and several prevalent learning paradigms, as well as the differences between rename refactoring and general text generation in natural language processing. Based on these observations, we propose RefBERT, a two-stage pre-trained framework for rename refactoring of variable names. RefBERT first predicts the number of sub-tokens in the new name and then generates the sub-tokens accordingly. Several techniques, including constrained masked language modeling, contrastive learning, and a bag-of-tokens loss, are incorporated into RefBERT to tailor it for automatic rename refactoring of variable names. Through extensive experiments on our constructed refactoring datasets, we show that the variable names generated by RefBERT are more accurate and meaningful than those produced by the existing method. Our implementation and data are available at https://github.com/KDEGroup/RefBERT.
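The two-stage decoding described in the abstract can be illustrated with a minimal sketch: stage 1 picks the most likely sub-token count for the new name, and stage 2 generates that many sub-tokens. All probability tables and the vocabulary below are toy stand-ins invented for illustration, not outputs of the actual pre-trained RefBERT model.

```python
# Hypothetical sketch of the two-stage generation in the abstract.
# Stage 1: predict the number of sub-tokens in the new variable name.
# Stage 2: generate one sub-token per position.
# The probability tables are fabricated placeholders, not real model output.

def predict_length(length_probs):
    """Stage 1: choose the most likely sub-token count (1-indexed)."""
    return max(range(len(length_probs)), key=lambda i: length_probs[i]) + 1

def generate_subtokens(position_probs, k, vocab):
    """Stage 2: for each of the k positions, pick the most likely sub-token."""
    name = []
    for pos in range(k):
        probs = position_probs[pos]
        best = max(range(len(probs)), key=lambda i: probs[i])
        name.append(vocab[best])
    return name

# Toy vocabulary and fake distributions for one poorly named variable `x`.
vocab = ["user", "count", "total", "index"]
length_probs = [0.1, 0.7, 0.2]      # P(k=1), P(k=2), P(k=3)
position_probs = [
    [0.80, 0.10, 0.05, 0.05],       # position 0 favors "user"
    [0.10, 0.70, 0.10, 0.10],       # position 1 favors "count"
    [0.25, 0.25, 0.25, 0.25],       # unused once k=2 is predicted
]

k = predict_length(length_probs)                    # -> 2
subtokens = generate_subtokens(position_probs, k, vocab)
new_name = "_".join(subtokens)                      # snake_case assembly
print(k, new_name)                                  # 2 user_count
```

Splitting the task this way turns open-ended name generation into a fixed-length prediction problem, which is one reason the paper treats length prediction as its own stage.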
Tue 18 Jul (displayed time zone: Pacific Time, US & Canada)
15:30 - 17:00 | ISSTA Online 1: SE and Deep Learning (Technical Papers) at Smith Classroom (Gates G10). Chair(s): Myra Cohen (Iowa State University)

15:30, 10m Talk | COME: Commit Message Generation with Modification Embedding. Yichen He, Liran Wang, Kaiyi Wang, Yupeng Zhang, Hang Zhang, Zhoujun Li (Beihang University). DOI

15:40, 10m Talk | CODEP: Grammatical Seq2Seq Model for General-Purpose Code Generation. DOI, Pre-print

15:50, 10m Talk | Towards More Realistic Evaluation for Neural Test Oracle Generation. DOI, Pre-print

16:00, 10m Talk | Detecting Condition-Related Bugs with Control Flow Graph Neural Network. Jian Zhang (Beihang University), Xu Wang (Beihang University), Hongyu Zhang (Chongqing University), Hailong Sun (Beihang University), Xudong Liu (Beihang University), Chunming Hu (Beihang University), Yang Liu (Nanyang Technological University). DOI

16:10, 10m Talk | RefBERT: A Two-Stage Pre-trained Framework for Automatic Rename Refactoring. Hao Liu (Xiamen University), Yanlin Wang (Sun Yat-sen University), Zhao Wei (Tencent), Yong Xu (Tencent), Juhong Wang (Tencent), Hui Li (Xiamen University), Rongrong Ji (Xiamen University). DOI, Pre-print

16:20, 10m Talk | Interpreters for GNN-Based Vulnerability Detection: Are We There Yet? Yutao Hu (Huazhong University of Science and Technology), Suyuan Wang (Huazhong University of Science and Technology), Wenke Li (Huazhong University of Science and Technology), Junru Peng (Wuhan University), Yueming Wu (Nanyang Technological University), Deqing Zou (Huazhong University of Science and Technology), Hai Jin (Huazhong University of Science and Technology). DOI

16:30, 10m Talk | Towards Efficient Fine-Tuning of Pre-trained Code Models: An Experimental Study and Beyond. Ensheng Shi (Xi'an Jiaotong University), Yanlin Wang (Sun Yat-sen University), Hongyu Zhang (Chongqing University), Lun Du (Microsoft Research), Shi Han (Microsoft Research), Dongmei Zhang (Microsoft Research), Hongbin Sun (Xi'an Jiaotong University). DOI, Pre-print