Gongfan Fang

Ph.D. Student | Learning and Vision Lab | National University of Singapore.


Hello there! I'm Gongfan Fang, currently a second-year Ph.D. student at the Learning and Vision (LV) Lab @ National University of Singapore, advised by Professor Xinchao Wang. Before joining the LV Lab, I received my Bachelor's degree in 2019 and my Master's degree in 2022 from Zhejiang University, where I worked at the Visual Intelligence and Pattern Analysis (VIPA) Lab under the supervision of Professor Mingli Song.

My research interests revolve around practical algorithms for efficient network training and inference. I'm also developing and maintaining several open-source projects on GitHub, including Torch-Pruning, Pytorch-MSSSIM, and DeepLabV3Plus-Pytorch.
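For a sense of how the pruning work translates into the tooling above, here is a minimal usage sketch of Torch-Pruning in the spirit of DepGraph, adapted from the project's public examples; the exact API may differ across releases, and the layer and channel indices below are chosen purely for illustration.

    # Minimal sketch: trace a ResNet-18 once to build a dependency graph, then
    # prune a few output channels of the first conv layer together with every
    # parameter coupled to them (BatchNorm, downstream convs, etc.).
    import torch
    import torchvision
    import torch_pruning as tp

    model = torchvision.models.resnet18(weights=None)
    example_inputs = torch.randn(1, 3, 224, 224)

    # Build the dependency graph by tracing the model on example inputs.
    DG = tp.DependencyGraph().build_dependency(model, example_inputs=example_inputs)

    # Collect the group of coupled pruning operations triggered by removing
    # output channels 2, 6, and 9 of model.conv1 (indices are illustrative).
    group = DG.get_pruning_group(model.conv1, tp.prune_conv_out_channels, idxs=[2, 6, 9])

    if DG.check_pruning_group(group):  # avoid pruning a layer down to zero channels
        group.prune()

    print(model.conv1)  # conv1 now has 61 output channels instead of 64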

News

Feb, 2024 DeepCache was accepted by CVPR’24: a training-free method for diffusion model acceleration.
Dec, 2023 New project SlimSAM: 0.1% Data Makes Segment Anything Slim.
Sep, 2023 Two papers, LLM-Pruner & Diff-Pruning, were accepted by NeurIPS’23.
Feb, 2023 One paper, “DepGraph: Towards Any Structural Pruning”, was accepted by CVPR’23.

Selected Publications

  1. CVPR’23
    DepGraph: Towards Any Structural Pruning
    Gongfan Fang, Xinyin Ma, Mingli Song, Michael Bi Mi, and Xinchao Wang
    Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2023
  2. CVPR’24
    DeepCache: Accelerating Diffusion Models for Free
    Xinyin Ma, Gongfan Fang, and Xinchao Wang
    Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2024
  3. NeurIPS’23
    LLM-Pruner: On the Structural Pruning of Large Language Models
    Xinyin Ma, Gongfan Fang, and Xinchao Wang
    Advances in Neural Information Processing Systems, 2023
  4. NeurIPS’23
    Structural Pruning for Diffusion Models
    Gongfan Fang, Xinyin Ma, and Xinchao Wang
    Advances in Neural Information Processing Systems, 2023
  5. AAAI’22
    Up to 100x Faster Data-free Knowledge Distillation
    Gongfan Fang, Kanya Mo, Xinchao Wang, Jie Song, Shitao Bei, Haofei Zhang, and Mingli Song
    Proceedings of the AAAI Conference on Artificial Intelligence, 2022
  6. NeurIPS’21
    Mosaicking to Distill: Knowledge Distillation from Out-of-Domain Data
    Gongfan Fang, Yifan Bao, Jie Song, Xinchao Wang, Donglin Xie, Chengchao Shen, and Mingli Song
    Advances in Neural Information Processing Systems, 2021
  7. IJCAI’21
    Contrastive Model Inversion for Data-free Knowledge Distillation
    Gongfan Fang, Jie Song, Xinchao Wang, Chengchao Shen, Xingen Wang, and Mingli Song
    Proceedings of the International Joint Conference on Artificial Intelligence, 2021
  8. Preprint’19
    Data-free Adversarial Distillation
    Gongfan Fang, Jie Song, Chengchao Shen, Xinchao Wang, Da Chen, and Mingli Song
    arXiv preprint arXiv:1912.11006, 2019

Education

2022.07 - Present - Ph.D. in Electrical and Computer Engineering, National University of Singapore.

2019.09 - 2022.04 - M.Eng. in Computer Science, College of Computer Science and Technology, Zhejiang University.

2015.09 - 2019.06 - B.S. in Computer Science, College of Computer Science and Technology, Zhejiang University.