Zhize Li (李志泽)

I have been a Research Scientist at the King Abdullah University of Science and Technology (KAUST) since September 2020. Before that, I was a Postdoc at KAUST, hosted by Prof. Peter Richtárik, from September 2019 to September 2020.

I received my PhD in Computer Science from the Institute for Interdisciplinary Information Sciences at Tsinghua University in July 2019. My advisor was Prof. Jian Li.

From January to September 2018, I was a visiting scholar at the Computer Science Department of Duke University (hosted by Prof. Rong Ge) and at the Industrial and Systems Engineering department of the Georgia Institute of Technology (hosted by Prof. Guanghui (George) Lan).

My research interests lie in theoretical computer science and machine learning, in particular convex/nonconvex/distributed optimization, federated learning, and algorithms and data structures.

PhD thesis: Simple and Fast Optimization Methods for Machine Learning. June 2019.
2019 Tsinghua Outstanding Doctoral Dissertation Award.


Remote Interns/Students under my supervision

  1. Haoyu Zhao (first-year PhD student at Princeton University)


Publications (by year) [google scholar] [dblp]

  1. EF21 with Bells & Whistles: Practical Algorithmic Extensions of Modern Error Feedback

    Ilyas Fatkhullin, Igor Sokolov, Eduard Gorbunov, Zhize Li, Peter Richtárik

    arXiv:2110.03294, 2021.

  2. DESTRESS: Computation-Optimal and Communication-Efficient Decentralized Nonconvex Finite-Sum Optimization

    Boyue Li, Zhize Li, Yuejie Chi

    arXiv:2110.01165, 2021.

  3. ANITA: An Optimal Loopless Accelerated Variance-Reduced Gradient Method

    Zhize Li

    arXiv:2103.11333, 2021.

  4. ZeroSARAH: Efficient Nonconvex Finite-Sum Optimization with Zero Full Gradient Computation

    Zhize Li, Slavomír Hanzely, Peter Richtárik

    arXiv:2103.01447, 2021.

  5. CANITA: Faster Rates for Distributed Convex Optimization with Communication Compression

    Zhize Li, Peter Richtárik

    35th Conference on Neural Information Processing Systems (NeurIPS 2021)

  6. Optimal In-Place Suffix Sorting

    Zhize Li, Jian Li, Hongwei Huo

    Information and Computation, 2021. (CCF rank A journal in theory)

  7. PAGE: A Simple and Optimal Probabilistic Gradient Estimator for Nonconvex Optimization

    Zhize Li, Hongyan Bao, Xiangliang Zhang, Peter Richtárik

    38th International Conference on Machine Learning (ICML 2021). (long talk, 166/5513 ≈ 3%)

    Previously appeared in NeurIPS Workshop on Optimization for Machine Learning (NeurIPS-OPT 2020). (oral)

  8. MARINA: Faster Non-Convex Distributed Learning with Compression

    Eduard Gorbunov, Konstantin Burlachenko, Zhize Li, Peter Richtárik

    38th International Conference on Machine Learning (ICML 2021)

  9. A Unified Analysis of Stochastic Gradient Methods for Nonconvex Federated Optimization

    Zhize Li, Peter Richtárik

    Submitted to Journal of Machine Learning Research (JMLR), under revision.

    NeurIPS Workshop on Scalability, Privacy, and Security in Federated Learning (NeurIPS-SpicyFL 2020). (spotlight)

  10. Acceleration for Compressed Gradient Descent in Distributed and Federated Optimization

    Zhize Li, Dmitry Kovalev, Xun Qian, Peter Richtárik

    37th International Conference on Machine Learning (ICML 2020) [slides]

  11. A Fast Anderson-Chebyshev Acceleration for Nonlinear Optimization

    Zhize Li, Jian Li

    23rd International Conference on Artificial Intelligence and Statistics (AISTATS 2020)

  12. A Unified Variance-Reduced Accelerated Gradient Method for Convex Optimization

    [alphabetical order] Guanghui Lan, Zhize Li, Yi Zhou

    33rd Conference on Neural Information Processing Systems (NeurIPS 2019) [slides]

  13. SSRGD: Simple Stochastic Recursive Gradient Descent for Escaping Saddle Points

    Zhize Li

    33rd Conference on Neural Information Processing Systems (NeurIPS 2019)

  14. Stochastic Gradient Hamiltonian Monte Carlo with Variance Reduction for Bayesian Inference

    Zhize Li, Tianyi Zhang, Shuyu Cheng, Jun Zhu, Jian Li

    Machine Learning (journal), 2019.

  15. Stabilized SVRG: Simple Variance Reduction for Nonconvex Optimization

    [alphabetical order] Rong Ge, Zhize Li, Weiyao Wang, Xiang Wang

    32nd Conference on Learning Theory (COLT 2019)

  16. Learning Two-layer Neural Networks with Symmetric Inputs

    [alphabetical order] Rong Ge, Rohith Kuditipudi, Zhize Li, Xiang Wang

    7th International Conference on Learning Representations (ICLR 2019)

  17. Gradient Boosting With Piece-Wise Linear Regression Trees

    Yu Shi, Jian Li, Zhize Li

    28th International Joint Conference on Artificial Intelligence (IJCAI 2019)

  18. A Simple Proximal Stochastic Gradient Method for Nonsmooth Nonconvex Optimization

    Zhize Li, Jian Li

    32nd Conference on Neural Information Processing Systems (NeurIPS 2018). (spotlight)

  19. A Fast Polynomial-time Primal-Dual Projection Algorithm for Linear Programming

    Zhize Li, Wei Zhang, Kees Roos

    23rd International Symposium on Mathematical Programming (ISMP 2018); [full version] in submission

  20. A Two-Stage Mechanism for Ordinal Peer Assessment

    Zhize Li, Le Zhang, Zhixuan Fang, Jian Li

    11th International Symposium on Algorithmic Game Theory (SAGT 2018)

  21. Optimal In-Place Suffix Sorting

    Zhize Li, Jian Li, Hongwei Huo

    25th International Symposium on String Processing and Information Retrieval (SPIRE 2018) [slides] (invited)
    A previous one-page [summary paper] appeared in the 28th IEEE Data Compression Conference (DCC 2018)

    The suffix array (a space-efficient alternative to the suffix tree) is a fundamental data structure for many applications involving string searching and data compression. Designing time- and space-efficient suffix array construction algorithms has attracted significant attention, and considerable advances have been made over the past 20 years. We give the first linear-time in-place suffix array construction algorithm, which is optimal in both time and space. In particular, our algorithm solves the important open problem posed by Franceschini and Muthukrishnan at ICALP 2007. Unfortunately, our paper was prevented from being published since 2017 due to a malicious conflict of interest [a disclaimer].
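    To illustrate what a suffix array is (this is NOT the linear-time in-place algorithm from the paper, just a naive comparison-based sketch using extra space and O(n^2 log n) worst-case time):

    ```python
    # Naive suffix array construction, for illustration only.
    # The suffix array lists the starting indices of all suffixes of s
    # in lexicographic order; the paper's algorithm achieves the same
    # result in O(n) time and O(1) extra workspace.
    def naive_suffix_array(s: str) -> list[int]:
        # Sort suffix start positions by comparing the suffixes themselves.
        return sorted(range(len(s)), key=lambda i: s[i:])

    # Suffixes of "banana" in sorted order:
    # "a" (5), "ana" (3), "anana" (1), "banana" (0), "na" (4), "nana" (2)
    print(naive_suffix_array("banana"))  # [5, 3, 1, 0, 4, 2]
    ```
    
    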

  22. On Top-k Selection in Multi-Armed Bandits and Hidden Bipartite Graphs

    Wei Cao, Jian Li, Yufei Tao, Zhize Li

    29th Conference on Neural Information Processing Systems (NIPS 2015)


Experience/Services


Conference Talks/Posters

ICML 2021, NeurIPS 2020, ICML 2020, AISTATS 2020, NeurIPS 2019, ECML 2019, COLT 2019, ICLR 2019, IJCAI 2019, NeurIPS 2018, SPIRE 2018, SAGT 2018, ISMP 2018, DCC 2018


Other Talks/Posters

  1. China Theory Week 2018 (founded by Prof. Andrew Chi-Chih Yao in 2007), 2018
  2. The 16th China Symposium on Machine Learning and Applications (MLA'18), 2018
  3. KAUST-Tsinghua-Industry Workshop on AI, 2019
  4. SIAM Conference on Optimization (OP20), 2020 (cancelled due to COVID-19)
  5. The University of Manchester, 2021
  6. Harvard University, 2021

Selected Awards


Contact

Zhize Li

Email: zhizeli DOT thu AT gmail DOT com

zz-li14 AT mails DOT tsinghua DOT edu DOT cn

Address: Room 4-609, FIT Building, Tsinghua University, Beijing 100084, China