Pingzhi Li

pingzhi@cs.unc.edu

I'm a doctoral candidate in computer science at UNC-Chapel Hill, advised by Prof. Tianlong Chen. I work with the Apple Foundation Models team as a research intern. I earned my bachelor's degree in computer science from USTC.

I like simple and useful things. I am broadly interested in efficient AI computing and its applications to scientific discovery. I also enjoy building systems and turning ideas into reality.

To find out more, you can check out my Twitter, Last.fm, Instagram, or blog.

Selected Publications

(* = equal contribution)

  • P. Li, B. Hou, Y. Zhu, Y. Feng, K. Ye, T. Lei, Z. Chen, T. Chen and X. Du, "Adaptive Thinking: Large Language Models Know When to Think in Latent Space."
    ICLR 2026
  • S. Luo*, Y. Han*, P. Li*, J. Qin*, J. Peng, Y.K. Zhao, Y. Cao and T. Chen, "Mozart: Modularized and Efficient MoE Training on 3.5D Wafer-Scale Chiplet Architectures."
    NeurIPS 2025 (Spotlight) / Code
  • S. Luo, P. Li, J. Peng, H. Wang, Y. Cheng and T. Chen, "Occult: Optimizing Collaborative Communication across Experts for Accelerated Parallel MoE Training and Inference."
    ICML 2025 / Code
  • M. Zhang*, P. Li*, J. Peng, M. Qiu and T. Chen, "Advancing MoE Efficiency: A Collaboration-Constrained Routing Strategy for Better Expert Parallelism Design."
    NAACL 2025 (SAC Award) / Code
  • Y. Zhang*, P. Li*, J. Hong*, J. Li*, Y. Zhang, W. Zheng, P.Y. Chen, J.D. Lee, W. Yin, M. Hong, Z. Wang, S. Liu and T. Chen, "Revisiting Zeroth-Order Optimization for Memory-Efficient LLM Fine-Tuning: A Benchmark."
    ICML 2024 / Code / Tutorial
  • P. Li, Z. Zhang, P. Yadav, Y.L. Sung, Y. Cheng, M. Bansal and T. Chen, "Merge, Then Compress: Demystify Efficient SMoE with Hints from Its Routing Policy."
    ICLR 2024 (Spotlight) / Code

Awards

Resume

Here is my Resume (last updated Mar. 2026).