Gongfan Fang
Ph.D. Candidate | xML Lab | National University of Singapore.

Hi there! I'm Gongfan Fang, currently a final-year Ph.D. candidate at the xML Lab @ National University of Singapore, under the supervision of Prof. Xinchao Wang (Presidential Young Professor). Before joining the xML Lab, I earned my Bachelor's degree in 2019 and my Master's degree in 2022 at the Visual Intelligence and Pattern Analysis (VIPA) Lab @ Zhejiang University, advised by Prof. Mingli Song.
My research focuses on LLMs, diffusion models, and efficient generative models. My previous work includes hybrid reasoning LLMs, efficient LLMs, and fast diffusion transformers. I also actively contribute to several open-source projects, such as Torch-Pruning, a widely used framework for compressing foundation models.
News
- Feb 2025: 🍺 One first-author paper, TinyFusion, and three co-authored papers were accepted by CVPR'25.
- Dec 2024: 🎵 I'm deeply honored to be awarded the 2024 ByteDance Scholarship (10~15 recipients per year).
- Sep 2024: 🚀 Two first-author papers, MaskLLM (Spotlight) and Remix-DiT, were accepted by NeurIPS'24.
Selected Publications
- NeurIPS'24: MaskLLM: Learnable Semi-structured Sparsity for Large Language Models. Advances in Neural Information Processing Systems, 2024. NVIDIA Research, National University of Singapore. NeurIPS'24 Spotlight (2%) | Pre-training of sparse LLMs | The first scalable algorithm for N:M sparsity in LLMs | 1.4× faster with 30%+ memory saving.
Education
- 2022.07 - 2026.06: Ph.D. in Electrical and Computer Engineering, National University of Singapore.
- 2019.09 - 2022.04: M.Eng. in Computer Science, College of Computer Science and Technology, Zhejiang University.
- 2015.09 - 2019.06: B.S. in Computer Science, College of Computer Science and Technology, Zhejiang University.