My research interests span software engineering, machine learning, and program analysis. My primary research interests are:
- Large Language Models of Code
- Software Text Analytics for Software Automation
- Understanding and Assisting Software Practice
March, 2023: I will serve on the Program Committee of the ICSE 2024 Demonstrations Track. Please consider submitting your work!
March, 2023: I delivered a guest lecture on code embedding to graduate students at Dalhousie University.
March, 2023: One work titled Generation-based Code Review Automation: How Far Are We? has been accepted to the Research track of the 31st IEEE/ACM International Conference on Program Comprehension (ICPC 2023).
January, 2023: One work titled TECHSUMBOT: A Stack Overflow Answer Summarization Tool for Technical Query has been accepted to the Demonstrations track of the 45th IEEE/ACM International Conference on Software Engineering (ICSE 2023).
December, 2022: One work titled Curiosity-Driven and Victim-Aware Adversarial Policies has been accepted by the Annual Computer Security Applications Conference and won an Honorable Mention Award.
November, 2022: One work titled Duplicate Bug Report Detection: How Far Are We? has been accepted by ACM Transactions on Software Engineering and Methodology.
July, 2022: Two works titled Compressing Pre-trained Models of Code into 3 MB and Answer Summarization for Technical Queries: Benchmark and New Approach (Technical track) have been accepted by ASE'22.
June, 2022: One work titled How to Better Utilize Code Graphs in Semantic Code Search? (Technical track) has been accepted by ESEC/FSE'22.
June, 2022: [Call For Papers] I am one of the organizers of the 6th edition of MaLTeSQuE, the ESEC/FSE workshop on machine learning techniques for software quality evolution. Please submit your great work!
March, 2022: One work titled PTM4Tag: Sharpening Tag Recommendation of Stack Overflow with Pre-trained Models (Technical track) has been accepted by ICPC'22.
December, 2021: Two works titled Aspect-Based API Review Classification: How Far Can Pre-Trained Transformer Model Go? (Technical track) and Can Identifier Splitting Improve Open-Vocabulary Language Model of Code? (ERA track) have been accepted by SANER'22.
August, 2021: I graduated from SMU, a beautiful university that carries lots of great memories!
June, 2021: One work titled Post2Vec: Learning Distributed Representations of Stack Overflow Posts has been accepted by IEEE Transactions on Software Engineering.
Bowen Xu is a Research Scientist (Post-Doc) in the School of Computing and Information Systems (SCIS) at Singapore Management University (SMU). He received his PhD in 2022 from SCIS at SMU, supervised by Prof. David Lo (IEEE Fellow). His research area is software engineering; in particular, his interests lie at the intersection of machine learning and software engineering. His work aims to improve software quality and developers' productivity. He has published in top-tier software engineering conferences (ICSE, FSE, ASE) and journals (TSE, TOSEM, EMSE). One of his works was nominated for the ACM SIGSOFT Distinguished Paper Award (top 10% of all accepted submissions) at ASE 2022 (CORE A*). In addition, another of his works, published in ESEM 2018 (CORE A), won the Highly Commended Full Paper Award (the second best among all accepted submissions). Two of his works, published in ICPC 2022 (CORE A) and MSR 2016 (CORE A) with high scores, have been invited for extension to the journal EMSE.