Zhize Li (李志泽)

I have been a Research Scientist at the King Abdullah University of Science and Technology (KAUST) since September 2020. Before that, I was a postdoc at KAUST, hosted by Prof. Peter Richtárik, from September 2019 to September 2020.

I received my PhD in Computer Science from the Institute for Interdisciplinary Information Sciences at Tsinghua University in July 2019. My advisor was Prof. Jian Li.

I was also a visiting scholar at the Computer Science Department of Duke University (hosted by Prof. Rong Ge) and at the Industrial and Systems Engineering department of the Georgia Institute of Technology (hosted by Prof. Guanghui (George) Lan).

My research interests lie in theoretical computer science and machine learning, in particular convex/nonconvex/distributed optimization, federated learning, and algorithms and data structures.

PhD thesis: Simple and Fast Optimization Methods for Machine Learning. June 2019.
2019 Tsinghua Outstanding Doctoral Dissertation Award.

Publications (by date) [by topic] [google scholar] [dblp]

  1. A Unified Analysis of Stochastic Gradient Methods for Nonconvex Federated Optimization

    Zhize Li, Peter Richtárik

    NeurIPS Workshop on Scalability, Privacy, and Security in Federated Learning (NeurIPS-SpicyFL 2020). (spotlight)

  2. PAGE: A Simple and Optimal Probabilistic Gradient Estimator for Nonconvex Optimization

    Zhize Li, Hongyan Bao, Xiangliang Zhang, Peter Richtárik

    NeurIPS Workshop on Optimization for Machine Learning (NeurIPS-OPT 2020). (oral)

  3. Acceleration for Compressed Gradient Descent in Distributed and Federated Optimization

    Zhize Li, Dmitry Kovalev, Xun Qian, Peter Richtárik

    37th International Conference on Machine Learning (ICML 2020) [slides]

  4. A Fast Anderson-Chebyshev Acceleration for Nonlinear Optimization

    Zhize Li, Jian Li

    23rd International Conference on Artificial Intelligence and Statistics (AISTATS 2020)

  5. A Unified Variance-Reduced Accelerated Gradient Method for Convex Optimization

    [alphabetical order] Guanghui Lan, Zhize Li, Yi Zhou

    33rd Conference on Neural Information Processing Systems (NeurIPS 2019) [slides]

  6. SSRGD: Simple Stochastic Recursive Gradient Descent for Escaping Saddle Points

    Zhize Li

    33rd Conference on Neural Information Processing Systems (NeurIPS 2019)

  7. Stochastic Gradient Hamiltonian Monte Carlo with Variance Reduction for Bayesian Inference

    Zhize Li, Tianyi Zhang, Shuyu Cheng, Jun Zhu, Jian Li

    Machine Learning (journal), 2019.

  8. Stabilized SVRG: Simple Variance Reduction for Nonconvex Optimization

    [alphabetical order] Rong Ge, Zhize Li, Weiyao Wang, Xiang Wang

    32nd Conference on Learning Theory (COLT 2019)

  9. Learning Two-layer Neural Networks with Symmetric Inputs

    [alphabetical order] Rong Ge, Rohith Kuditipudi, Zhize Li, Xiang Wang

    7th International Conference on Learning Representations (ICLR 2019)

  10. Gradient Boosting With Piece-Wise Linear Regression Trees

    Yu Shi, Jian Li, Zhize Li

    28th International Joint Conference on Artificial Intelligence (IJCAI 2019)

  11. A Simple Proximal Stochastic Gradient Method for Nonsmooth Nonconvex Optimization

    Zhize Li, Jian Li

    32nd Conference on Neural Information Processing Systems (NeurIPS 2018). (spotlight)

  12. Optimal In-Place Suffix Sorting

    Zhize Li, Jian Li, Hongwei Huo

    25th International Symposium on String Processing and Information Retrieval (SPIRE 2018). (invited)
    [long paper][slides]. (The full version on arXiv has been submitted to a journal.)
    A previous one-page [summary paper] appeared in the 28th IEEE Data Compression Conference (DCC 2018).

    The suffix array (a space-efficient alternative to the suffix tree) is a fundamental data structure for many applications involving string searching and data compression. Designing time- and space-efficient suffix array construction algorithms has attracted significant attention, and considerable advances have been made over the past 20 years. We give the first linear-time in-place suffix array construction algorithm, which is optimal in both time and space. In particular, our algorithm solves the important open problem posed by Franceschini and Muthukrishnan at ICALP 2007. Unfortunately, our paper has been prevented from publication since 2017 due to a malicious conflict of interest [a disclaimer].
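    For readers unfamiliar with the structure: a suffix array is simply the list of starting indices of all suffixes of a string, sorted lexicographically. The following naive Python sketch is for illustration only; it runs in O(n^2 log n) time and is neither linear-time nor in-place, unlike the algorithm in our paper.

    ```python
    def naive_suffix_array(s):
        """Return the suffix array of s: the starting indices of all
        suffixes of s in lexicographic order. Naive O(n^2 log n) version,
        for illustration only."""
        return sorted(range(len(s)), key=lambda i: s[i:])

    # Suffixes of "banana" in sorted order:
    # "a" (5), "ana" (3), "anana" (1), "banana" (0), "na" (4), "nana" (2)
    print(naive_suffix_array("banana"))  # [5, 3, 1, 0, 4, 2]
    ```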

  13. A Fast Polynomial-time Primal-Dual Projection Algorithm for Linear Programming

    Zhize Li, Wei Zhang, Kees Roos

    23rd International Symposium on Mathematical Programming (ISMP 2018). [full version] in submission.

  14. A Two-Stage Mechanism for Ordinal Peer Assessment

    Zhize Li, Le Zhang, Zhixuan Fang, Jian Li

    11th International Symposium on Algorithmic Game Theory (SAGT 2018)

  15. On Top-k Selection in Multi-Armed Bandits and Hidden Bipartite Graphs

    Wei Cao, Jian Li, Yufei Tao, Zhize Li

    29th Conference on Neural Information Processing Systems (NIPS 2015)


Invited Talks/Posters

  1. China Theory Week 2018 (founded by Prof. Andrew Chi-Chih Yao in 2007) [slides]
  2. The 16th China Symposium on Machine Learning and Applications (MLA'18) [slides][poster]

Conference Talks/Posters

NeurIPS 2020, ICML 2020, AISTATS 2020, NeurIPS 2019, ECML 2019, COLT 2019, ICLR 2019, IJCAI 2019, NeurIPS 2018, SPIRE 2018, SAGT 2018, ISMP 2018, DCC 2018, NIPS 2015.

Zhize Li

Email: zhizeli DOT thu AT gmail DOT com

zz-li14 AT mails DOT tsinghua DOT edu DOT cn

Address: Room 4-609, FIT Building, Tsinghua University, Beijing 100084, China