Zhize Li (李志泽)         [Google Scholar] [dblp]


I have been a tenure-track Assistant Professor in the School of Computing and Information Systems at Singapore Management University since 2023. Before that, I was a Research Scientist at Carnegie Mellon University, working with Prof. Yuejie Chi.

Previously, I was a Research Scientist and, before that, a Postdoctoral Research Fellow at King Abdullah University of Science and Technology (KAUST), hosted by Prof. Peter Richtárik; a Visiting Scholar at Duke University, hosted by Prof. Rong Ge; and a Visiting Scholar at the Georgia Institute of Technology, hosted by Prof. Guanghui (George) Lan.

I received my PhD in Computer Science from the Institute for Interdisciplinary Information Sciences at Tsinghua University in 2019, advised by Prof. Jian Li.

My research interests lie in optimization, federated learning, AI privacy, and machine learning, in particular large-scale, distributed, and decentralized optimization, and private, efficient, and resilient federated learning.

I am looking for fully funded PhD students, (CSC) visiting students, and research assistants/research engineers. Please feel free to drop me an email if you are interested!


PhD Students


News


Publications (by year) [google scholar] [dblp]

  1. Faster Rates for Compressed Federated Learning with Client-Variance Reduction

    Haoyu Zhao, Konstantin Burlachenko, Zhize Li\(^\dagger\), Peter Richtárik.   (\(^\dagger\)corresponding author)

    SIAM Journal on Mathematics of Data Science (SIMODS), 2023.

  2. BEER: Fast O(1/T) Rate for Decentralized Nonconvex Optimization with Communication Compression

    Haoyu Zhao, Boyue Li, Zhize Li\(^\dagger\), Peter Richtárik, Yuejie Chi.

    36th Conference on Neural Information Processing Systems (NeurIPS 2022)

  3. SoteriaFL: A Unified Framework for Private Federated Learning with Communication Compression

    Zhize Li, Haoyu Zhao, Boyue Li, Yuejie Chi

    36th Conference on Neural Information Processing Systems (NeurIPS 2022)

  4. Coresets for Vertical Federated Learning: Regularized Linear Regression and K-Means Clustering

    [\(^*\)alphabetical order] Lingxiao Huang\(^*\), Zhize Li\(^*\), Jialin Sun\(^*\), Haoyu Zhao\(^*\)

    36th Conference on Neural Information Processing Systems (NeurIPS 2022)

  5. Simple and Optimal Stochastic Gradient Methods for Nonsmooth Nonconvex Optimization

    Zhize Li, Jian Li

    Journal of Machine Learning Research (JMLR), 2022.

  6. DESTRESS: Computation-Optimal and Communication-Efficient Decentralized Nonconvex Finite-Sum Optimization

    Boyue Li, Zhize Li, Yuejie Chi

    SIAM Journal on Mathematics of Data Science (SIMODS), 2022.

    Previously appeared in NeurIPS Workshop on Optimization for Machine Learning (NeurIPS-OPT 2021)

  7. 3PC: Three Point Compressors for Communication-Efficient Distributed Training and a Better Theory for Lazy Aggregation

    Peter Richtárik, Igor Sokolov, Ilyas Fatkhullin, Elnur Gasanov, Zhize Li, Eduard Gorbunov

    39th International Conference on Machine Learning (ICML 2022)

  8. Optimal In-Place Suffix Sorting

    Zhize Li, Jian Li, Hongwei Huo

Information and Computation, 2022. (a CCF Rank-A theory journal)

  9. CANITA: Faster Rates for Distributed Convex Optimization with Communication Compression

    Zhize Li, Peter Richtárik

    35th Conference on Neural Information Processing Systems (NeurIPS 2021) [slides]

  10. EF21 with Bells & Whistles: Practical Algorithmic Extensions of Modern Error Feedback

    Ilyas Fatkhullin, Igor Sokolov, Eduard Gorbunov, Zhize Li\(^\dagger\), Peter Richtárik

    NeurIPS 2021 Workshop on Optimization for Machine Learning (NeurIPS-OPT 2021)

  11. PAGE: A Simple and Optimal Probabilistic Gradient Estimator for Nonconvex Optimization

    Zhize Li, Hongyan Bao, Xiangliang Zhang, Peter Richtárik

    38th International Conference on Machine Learning (ICML 2021). (long talk 166/5513 = 3%)

    Previously appeared in NeurIPS Workshop on Optimization for Machine Learning (NeurIPS-OPT 2020). (oral)

  12. MARINA: Faster Non-Convex Distributed Learning with Compression

    Eduard Gorbunov, Konstantin Burlachenko, Zhize Li, Peter Richtárik

    38th International Conference on Machine Learning (ICML 2021)

  13. A Unified Analysis of Stochastic Gradient Methods for Nonconvex Federated Optimization

    Zhize Li, Peter Richtárik

    Submitted to Journal of Machine Learning Research (JMLR), under revision.

    NeurIPS Workshop on Scalability, Privacy, and Security in Federated Learning (NeurIPS-SpicyFL 2020). (spotlight)

  14. Acceleration for Compressed Gradient Descent in Distributed and Federated Optimization

    Zhize Li, Dmitry Kovalev, Xun Qian, Peter Richtárik

    37th International Conference on Machine Learning (ICML 2020) [slides]

  15. A Fast Anderson-Chebyshev Acceleration for Nonlinear Optimization

    Zhize Li, Jian Li

    23rd International Conference on Artificial Intelligence and Statistics (AISTATS 2020)

  16. A Unified Variance-Reduced Accelerated Gradient Method for Convex Optimization

    [\(^*\)alphabetical order] Guanghui Lan\(^*\), Zhize Li\(^*\), Yi Zhou\(^*\)

    33rd Conference on Neural Information Processing Systems (NeurIPS 2019)

  17. SSRGD: Simple Stochastic Recursive Gradient Descent for Escaping Saddle Points

    Zhize Li

    33rd Conference on Neural Information Processing Systems (NeurIPS 2019)

  18. Stochastic Gradient Hamiltonian Monte Carlo with Variance Reduction for Bayesian Inference

    Zhize Li, Tianyi Zhang, Shuyu Cheng, Jun Zhu, Jian Li

Machine Learning (journal), 2019.

  19. Stabilized SVRG: Simple Variance Reduction for Nonconvex Optimization

    [\(^*\)alphabetical order] Rong Ge\(^*\), Zhize Li\(^*\), Weiyao Wang\(^*\), Xiang Wang\(^*\)

    32nd Conference on Learning Theory (COLT 2019)

  20. Learning Two-layer Neural Networks with Symmetric Inputs

    [\(^*\)alphabetical order] Rong Ge\(^*\), Rohith Kuditipudi\(^*\), Zhize Li\(^*\), Xiang Wang\(^*\)

    7th International Conference on Learning Representations (ICLR 2019)

  21. Gradient Boosting With Piece-Wise Linear Regression Trees

    Yu Shi, Jian Li, Zhize Li

    28th International Joint Conference on Artificial Intelligence (IJCAI 2019)

  22. A Simple Proximal Stochastic Gradient Method for Nonsmooth Nonconvex Optimization

    Zhize Li, Jian Li

    32nd Conference on Neural Information Processing Systems (NeurIPS 2018). (spotlight 198/4856 = 4%)

  23. A Fast Polynomial-time Primal-Dual Projection Algorithm for Linear Programming

    Zhize Li, Wei Zhang, Kees Roos

arXiv:1810.04517, 2018. Presented as a talk at the 23rd International Symposium on Mathematical Programming.

  24. A Two-Stage Mechanism for Ordinal Peer Assessment

    Zhize Li, Le Zhang, Zhixuan Fang, Jian Li

    11th International Symposium on Algorithmic Game Theory (SAGT 2018)

  25. Optimal In-Place Suffix Sorting

    Zhize Li, Jian Li, Hongwei Huo

    25th International Symposium on String Processing and Information Retrieval (SPIRE 2018) (invited) [slides]
A previous one-page [summary] appeared in the 28th IEEE Data Compression Conference (DCC 2018)

The suffix array (a space-efficient alternative to the suffix tree) is a fundamental data structure for many applications that involve string searching and data compression. Designing time- and space-efficient suffix array construction algorithms has attracted significant attention, and considerable advances have been made over the past 20 years. We give the first linear-time in-place suffix array construction algorithm, which is optimal in both time and space. In particular, our algorithm solves the important open problem posed by Franceschini and Muthukrishnan at ICALP 2007. Unfortunately, our paper has been prevented from being published since 2017 due to a malicious conflict of interest [a disclaimer].
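For readers unfamiliar with the data structure, a suffix array is simply the list of a string's suffix start positions in lexicographic order of the suffixes. The naive Python sketch below illustrates the definition only; it runs in roughly O(n² log n) time and is not the linear-time in-place algorithm from the paper.

```python
def suffix_array(s: str) -> list[int]:
    """Return the suffix array of s: suffix start positions sorted
    lexicographically by the suffix text. Naive O(n^2 log n) version,
    for illustration only (not the paper's linear-time in-place algorithm)."""
    return sorted(range(len(s)), key=lambda i: s[i:])

# Suffixes of "banana" in sorted order: "a", "ana", "anana", "banana", "na", "nana"
print(suffix_array("banana"))  # [5, 3, 1, 0, 4, 2]
```

Optimal algorithms achieve the same output in O(n) time using only O(1) extra workspace beyond the input string and the output array.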

  26. On Top-k Selection in Multi-Armed Bandits and Hidden Bipartite Graphs

    Wei Cao, Jian Li, Yufei Tao, Zhize Li

    29th Conference on Neural Information Processing Systems (NIPS 2015)


Experience/Services


Teaching


Selected Awards


Contact

Zhize Li

Email: zhizeli DOT thu AT gmail DOT com

zhizeli AT smu DOT edu DOT sg

Address: Room 4054, 80 Stamford Road, Singapore 178902