
Jingzhao Zhang

PI (September 2022 to present)
Special-term Research Fellow, Assistant Professor

Biography

Assistant Professor (2022-) @ Tsinghua, IIIS

Optimization, learning theory

Ph.D. (2017-2022) @ MIT EECS, LIDS 

Co-advised by Prof. Ali Jadbabaie and Prof. Suvrit Sra.

B.S. (2013-2016) @ Berkeley, EECS 

Advised by Prof. Laura Waller.

Jingzhao Zhang is an assistant professor at Tsinghua, IIIS. He graduated from MIT EECS under the supervision of Prof. Ali Jadbabaie and Prof. Suvrit Sra. His research focuses on providing theoretical justifications and analyses for practical large-scale optimization algorithms. He is also interested in machine learning applications, especially those involving dynamical system formulations.


Honors

IIIS Young Scholar Fellowship

MIT Best Master's Thesis in AI and Decision Making

MIT Best PhD Thesis in AI and Decision Making

Berkeley Graduate Fellowship

MIT Lim Graduate Fellowship


Research Directions

Optimization algorithms and complexity

Large-scale neural network training

Learning theory

Machine learning in dynamical systems


Research Topics

Machine Learning in Dynamical Systems

Control theory in continuous-time dynamical systems

Neural network training from a dynamical system perspective

Applying neural networks to EV battery systems


Members


Open positions

News

Publications

Wen, Kaiyue, Jiaye Teng, and Jingzhao Zhang. “Benign Overfitting in Classification: Provably Counter Label Noise with Larger Models.” International Conference on Learning Representations (2023).

Cheng, Xiang, Jingzhao Zhang, and Suvrit Sra. “Efficient Sampling on Riemannian Manifolds via Langevin MCMC.” Advances in Neural Information Processing Systems 35 (2022): 5995-6006.

Ahn, Kwangjun, Jingzhao Zhang, and Suvrit Sra. “Understanding the unstable convergence of gradient descent.” In International Conference on Machine Learning, pp. 247-257. PMLR, 2022.

Zhang, Jingzhao, Haochuan Li, Suvrit Sra, and Ali Jadbabaie. “Neural Network Weights Do Not Converge to Stationary Points: An Invariant Measure Perspective.” In International Conference on Machine Learning, pp. 26330-26346. PMLR, 2022.

Gu, Xinran, Kaixuan Huang, Jingzhao Zhang, and Longbo Huang. “Fast federated learning in the presence of arbitrary device unavailability.” Advances in Neural Information Processing Systems 34 (2021): 12052-12064.

Li, Haochuan, Yi Tian, Jingzhao Zhang, and Ali Jadbabaie. “Complexity lower bounds for nonconvex-strongly-concave min-max optimization.” Advances in Neural Information Processing Systems 34 (2021): 1792-1804.

Zhang, Jingzhao, Hongzhou Lin, Stefanie Jegelka, Suvrit Sra, and Ali Jadbabaie. “Complexity of finding stationary points of nonconvex nonsmooth functions.” In International Conference on Machine Learning, pp. 11173-11182. PMLR, 2020.

Yu, Tiancheng, Yi Tian, Jingzhao Zhang, and Suvrit Sra. “Provably efficient algorithms for multi-objective competitive RL.” In International Conference on Machine Learning, pp. 12167-12176. PMLR, 2021.

Zhang, Jingzhao, Aditya Menon, Andreas Veit, Srinadh Bhojanapalli, Sanjiv Kumar, and Suvrit Sra. “Coping with label shift via distributionally robust optimization.” International Conference on Learning Representations (2020).

Zhang, Jingzhao, Hongzhou Lin, Subhro Das, Suvrit Sra, and Ali Jadbabaie. “Beyond Worst-Case Analysis in Stochastic Approximation: Moment Estimation Improves Instance Complexity.” In International Conference on Machine Learning, pp. 26347-26361. PMLR, 2022.

Zhang, Jingzhao, Suvrit Sra, and Ali Jadbabaie. “Acceleration in First Order Quasi-strongly Convex Optimization by ODE Discretization.” Conference on Decision and Control (2019).

Zhang, Jingzhao, Sai Praneeth Karimireddy, Andreas Veit, Seungyeon Kim, Sashank Reddi, Sanjiv Kumar, and Suvrit Sra. “Why are adaptive methods good for attention models?.” Advances in Neural Information Processing Systems 33 (2020): 15383-15393.

Zhang, Jingzhao, César A. Uribe, Aryan Mokhtari, and Ali Jadbabaie. “Achieving Acceleration in Distributed Optimization via Direct Discretization of the Heavy-Ball ODE.” American Control Conference (2019).

Zhang, Jingzhao, Aryan Mokhtari, Suvrit Sra, and Ali Jadbabaie. “Direct Runge-Kutta Discretization Achieves Acceleration.” Advances in Neural Information Processing Systems (2018), spotlight.

Zhang, Jingzhao, Hongyi Zhang, and Suvrit Sra. “R-SPIDER: A Fast Riemannian Stochastic Optimization Algorithm with Curvature Independent Rate.” arXiv (2018).

Zhang, Jingzhao, Nicolas C. Pégard, Jingshan Zhong, Hillel Adesnik, and Laura Waller. “3D computer-generated holography by non-convex optimization.” Optica 4, no. 10 (2017).

Pégard, Nicolas C., Alan R. Mardinly, Jingzhao Zhang, Savitha Sridharan, Laura Waller, and Hillel Adesnik. “Holographic temporal focusing for 3d photo-activation with single neuron resolution.” In Optics and the Brain. Optical Society of America, 2017.

Zhang, Jingzhao, Jingshan Zhong, and Laura Waller. “Nonlinear optimization for partially coherent phase recovery with Abbe’s method.” In Digital Holography and Three-Dimensional Imaging. Optical Society of America, 2016.