People
To Prospective Students
There remains a significant gap between what modern AI systems can do and what we truly understand about them. My research group aims to help close this gap from two directions:
- Bottom-up: building theoretical foundations and first principles for AI.
- Top-down: analyzing state-of-the-art methods and proposing theory-inspired new algorithms.
To me, achieving concrete understanding takes precedence over tuning black-box models for performance gains. Coming from a statistics background, I believe statistics — along with other theoretical disciplines — has much to contribute to this effort. A key step in this process is conducting solid observational studies and formulating novel problems and hypotheses. Well-formulated toy settings can serve as powerful bridges between theory and practice.
I am actively seeking highly motivated PhD/MPhil students and research assistants to join my group. Ideal candidates will have a strong theoretical foundation (in statistics, computer science, math, etc.) and hands-on experience in AI. If you want to contribute to this effort, please send me an email with your CV.
Former Members:
Research Interns at Huawei Noah’s Ark Lab:
- Yimeng Chen, worked on model ensemble for domain generalization (ICML 2023), while pursuing a PhD at the University of Chinese Academy of Sciences.
- Paweł Piwek, worked on boundary complexity for ReLU classifiers (UAI 2023), while pursuing a PhD at the University of Oxford.
- Yi Liu, worked on optimal latent space for generative modeling, while pursuing a BS at CUHK-Shenzhen.
- Weijian Luo, worked on diffusion distillation (NeurIPS 2023), while pursuing a PhD at Peking University.
- Jiajun Ma, worked on diffusion sampling acceleration (ICLR 2024) and classifier guidance (ICML 2024), while pursuing a PhD at HKUST-Guangzhou.
- Xuantong Liu, worked on conditional generation via model inversion (ICML 2024), while pursuing a PhD at HKUST.
- Jakub Wornbard, worked on the duality between prompts and model weights in Transformers, while pursuing an MS at the University of Cambridge.
- Shuchen Xue, worked on diffusion sampling acceleration (ICML 2024, CVPR 2024) and discrete diffusion models, while pursuing a PhD at the University of Chinese Academy of Sciences.
- Yihong Luo, worked on diffusion distillation (ICLR 2025, arXiv 2025) and controllable generation (arXiv 2025, arXiv 2025), while pursuing a PhD at HKUST.