Zixuan Ke

I am a research scientist at Salesforce AI Research. I earned my Ph.D. at the University of Illinois Chicago, where I was fortunate to be advised by Bing Liu (we continue to work closely together). Prior to that, I received my M.Sc. in Computer Science from the University of Texas at Dallas, under the guidance of Vincent Ng. During the summers, I was a research intern at Google DeepMind, Meta AI, and Amazon Science.

Email  /  Google Scholar  /  Github  /  Twitter  /  LinkedIn  /  Blog

If you'd like to chat about research or anything else, feel free to reach out via email or schedule a chat here. I'd be happy to connect!


πŸ“° News

πŸ“… June 2025
New preprint on multi-agent systems (MAS) and reasoning: MAS-Zero: Designing Multi-Agent Systems with Zero Supervision. Explore 1,000+ discovered MAS designs in the collection!
πŸ“… May 2025
Tutorial at NAACL 2025: Adaptation of Large Language Models. See our webpage for recording, agenda, slides, and more!
πŸ“… January 2025
Preprint on domain-adaptive post-training: Demystifying Domain-Adaptive Post-Training for Financial LLMs.

πŸ“š Selected Publications & Preprints

Full list on Google Scholar. (* indicates equal contribution)

Large Language Models

Demystifying Domain-adaptive Post-training for Financial LLMs
Zixuan Ke, Yifei Ming, Xuan-Phi Nguyen, Caiming Xiong, Shafiq Joty
arXiv, 2025
arxiv | data (FinEval)

Bridging the Preference Gap between Retrievers and LLMs
Zixuan Ke, Weize Kong, Cheng Li, Mingyang Zhang, Qiaozhu Mei, Michael Bendersky
ACL, 2024
arxiv | talk | poster

Continual Pre-training of Language Models
Zixuan Ke*, Yijia Shao*, Haowei Lin*, Tatsuya Konishi, Gyuhak Kim, Bing Liu
ICLR, 2023
arxiv | poster | model | code

Adapting a Language Model While Preserving its General Knowledge
Zixuan Ke, Yijia Shao, Haowei Lin, Hu Xu, Lei Shu, Bing Liu
EMNLP, 2022a
arxiv | poster | code

Continual Training of Language Models for Few-Shot Learning
Zixuan Ke, Haowei Lin, Yijia Shao, Hu Xu, Lei Shu, Bing Liu
EMNLP, 2022b
arxiv | poster | model | code

Continual Learning

Sub-network Discovery and Soft-masking for Continual Learning of Mixed Tasks
Zixuan Ke, Bing Liu, Wenhan Xiong, Asli Celikyilmaz, Haoran Li
EMNLP, 2023
arxiv | code

A Theoretical Study on Solving Continual Learning
Gyuhak Kim, Changnan Xiao, Zixuan Ke, Bing Liu
NeurIPS, 2022
arxiv

Achieving Forgetting Prevention and Knowledge Transfer in Continual Learning
Zixuan Ke, Bing Liu, Nianzu Ma, Hu Xu, Lei Shu
NeurIPS, 2021
arxiv | talk | poster | code

Continual Learning of A Mixed Sequence of Similar and Dissimilar Tasks
Zixuan Ke, Bing Liu, Xingchang Huang
NeurIPS, 2020
arxiv | talk | poster | code

Recent Talks & Classes

  • Adaptation of Large Language Models, Tutorial at NAACL25 (recording), New Mexico, May 3, 2025.
  • Continual Learning in NLP (slides), Tutorial at DEIM23, Remote, March 6, 2023.
  • Lifelong and Continual Learning (Part 1, Part 2). A Short PhD Course (8 hours), Aalborg University, June 14-16, 2022.
  • Conference talks (see the Selected Publications section; more are available here)

Research Services

  • Area Chair/Action Editor (2024-):
    • ARR

  • Program Committee/Reviewer (2021-):
    • ICLR, NeurIPS, ICML, ACL, EMNLP, NAACL, IJCAI, ARR, COLING, CoLLAs, NLPCC

  • Journal Reviewer (2021-):
    • TPAMI, TKDE, Neural Networks, Neurocomputing, Artificial Intelligence, TALLIP

Awards

  • Exceptional Research Promise (the highest honor for CoE PhD students at UIC), 2023

Collaborators
I have had the privilege of working with and learning from great mentors and mentees, including:

  • Mentors:
    • Bing Liu, distinguished professor at UIC
    • Hu Xu, research scientist at Facebook AI Research (FAIR)
    • Lei Shu, research scientist at Google Research

  • Mentees:
    (They are doing great work, and I couldn't be prouder of them)
    • Yijia Shao, BS at Peking University -> PhD at Stanford

Template modified from here.