Abstract: In this talk, I will discuss first-order methods for two problem classes: convex optimization with locally Lipschitz continuous gradient and monotone inclusions with locally Lipschitz continuous point-valued operators. For convex optimization, we will propose a first-order method to find an epsilon-KKT solution, while for monotone inclusions, a primal-dual extrapolation method will be presented to find an epsilon-residual solution. These problem classes extend beyond the well-studied ones in the literature. The proposed methods are parameter-free, have verifiable termination criteria, and achieve nearly optimal complexity. I will also share some preliminary numerical results to demonstrate their performance.
Biography: Zhaosong Lu is a Full Professor in the Department of Industrial and Systems Engineering at the University of Minnesota. He received his PhD from Georgia Tech in 2005. His research interests include the theory and algorithms of continuous optimization, with applications in machine learning. His research has been supported by NSF and NSERC Canada, and he has published approximately 40 papers in top-tier journals such as SIAM Journal on Optimization, SIAM Journal on Numerical Analysis, SIAM Journal on Scientific Computing, SIAM Journal on Matrix Analysis and Applications, Mathematical Programming, and Mathematics of Operations Research. Currently, he serves as an Associate Editor for SIAM Journal on Optimization, Computational Optimization and Applications, and Big Data and Information Analytics.