Short Bio

Qi Lei is an associate research scholar in the ECE department at Princeton University. She received her Ph.D. from the Oden Institute for Computational Engineering and Sciences at UT Austin. She visited the Institute for Advanced Study (IAS) at Princeton for the Theoretical Machine Learning Program, and before that she was a research fellow at the Simons Institute for the Foundations of Deep Learning program. Her research aims to develop sample- and computationally efficient machine learning algorithms and to bridge the gap between theory and practice in machine learning. Qi has received several awards, including the Outstanding Dissertation Award, the National Initiative for Modeling and Simulation Graduate Research Fellowship, the Computing Innovation Fellowship, and the Simons-Berkeley Research Fellowship.

Research Interests:

Modern machine learning (ML) models are transforming applications across many domains. Pushing the limits of their potential relies on training ever more complex models, using larger data sets, and persistent hyper-parameter tuning. This process demands sophisticated expertise, expensive equipment such as GPU machines, and extensive label-annotation costs, leaving machine learning exclusive to specialized researchers and institutions. I aim to make machine learning more accessible to the general populace by developing efficient and easily trainable algorithms with low computational cost, fewer security concerns, and minimal reliance on labeled data. Over the past seven years, I have focused on bringing theoretical ideas and principles to algorithm design toward efficient, robust, and few-shot machine learning algorithms.

Curriculum Vitae:

(Curriculum Vitae, GitHub, Google Scholar)


University of Texas at Austin, Austin, TX

Ph.D. student, Institute for Computational Engineering and Sciences       August 2014 - May 2020

Institute for Advanced Study, Princeton, NJ

Visiting Graduate Student for the Special Year on Optimization, Statistics and Theoretical Machine Learning        September 2019 - May 2020

Simons Institute, Berkeley, CA

Research Fellow for the Foundations of Deep Learning Program       May 2019 - August 2019

Zhejiang University, Hangzhou, China

B.S. in Mathematics        August 2010 - May 2014

Awards and Recognitions

  • Computing Innovation Fellowship, CRA, 2020-2022

  • Simons-Berkeley Research Fellowship, Simons Institute, 2019

  • The National Initiative for Modeling and Simulation Research Fellowship ($225K for four years), UT Austin, 2014-2018

  • Young Investigators Lecturer Award, Caltech, 2021

  • Outstanding Dissertation Award, Oden Institute, 2021

  • Rising Star for EECS (An Academic Career Workshop for Women in EECS), UIUC, 2019; MIT, 2021

  • Rising Star for Machine Learning, University of Maryland, 2021

  • Rising Star for Computational and Data Science, UT Austin, 2020

  • Meritorious Winner (First Prize) in the Mathematical Contest in Modeling (MCM), COMAP, 2014

  • Gold medal (5th place) in China Girls Math Olympiad (CGMO, an international competition with a proof-based format similar to the International Math Olympiad), 2009

  • First Prize in the Chinese Mathematics Competition for College Students (CMC), China, 2012

  • First Prize in the National Olympiad in Informatics in Provinces (NOIP), China, 2007 (perfect score) and 2008


Industry Experience

Facebook Visual Understanding Team, Menlo Park, CA

Software Engineering Intern        June 2018 - September 2018

Amazon/A9 Product Search Lab, Palo Alto, CA

Software Development Intern, Search Technologies       May 2017 - August 2017

Amazon Web Services (AWS) Deep Learning Team, Palo Alto, CA

Applied Scientist Intern        January 2017 - April 2017

IBM Thomas J. Watson Research Center, Yorktown Heights, NY

Research Summer Intern        May 2016 - October 2016

UCLA Biomath Department, Los Angeles, CA

Visiting Student        July 2013 - September 2013

Invited Talks

‘‘Optimal Gradient-based Algorithms for Non-concave Bandit Optimization.’’

‘‘Few-Shot Learning via Learning the Representation, Provably.’’

‘‘Predicting What You Already Know Helps: Provable Self-Supervised Learning.’’

‘‘Provable Representation Learning.’’

‘‘SGD Learns One-Layer Networks in WGANs.’’

‘‘Deep Generative Models and Inverse Problems.’’

‘‘Similarity Preserving Representation Learning for Time Series Analysis.’’

‘‘Discrete Adversarial Attacks and Submodular Optimization with Applications to Text Classification.’’

‘‘Recent Advances in Primal-Dual Coordinate Methods for ERM.’’

‘‘Coordinate Descent Methods for Matrix Factorization.’’


Professional Service

Conference Reviewer: MLSys (19, 20, Meta-reviewer '21, TPC '22), COLT (21, 22), STOC (20), NeurIPS (16-21), ICML (18-21), ICLR (18-21), AISTATS (18-21), AAAI (20, 21), ACML (19), and more

Journal Reviewer: JSAIT (20), MOR (18-20), TNNLS (19, 20), TKDE (19), ISIT (17, 18), TIIS (17), IT (16, 17), and more


Teaching

Theory of Deep Learning: Representation and Weakly Supervised Learning, Teaching Assistant, Fall 2020

Scalable Machine Learning, Teaching Assistant, Fall 2019

Mathematical Methods in Applied Engineering and Sciences, Instructor Intern, Spring 2016