Academic Seminar, School of Mathematical Sciences
A novel accelerated mirror descent algorithm for constrained nonconvex optimization problem
without Lipschitz assumptions
Time: Friday, July 12, 2024, 10:10-10:50
Speaker: Gao Xue (高雪), Hebei University of Technology
Venue: Room E404, Guoshi (国实) Building, Shahe Campus
Abstract: This talk considers the composite optimization problem in which the sum of a smooth nonconvex function and a proper lower semicontinuous function is minimized over a closed convex set. The Bregman proximal gradient algorithm and its accelerated variants are popular methods for solving such problems numerically. Some recent contributions attempt to relax the global Lipschitz condition on the smooth part of the objective by replacing it with a local one in order to establish global convergence. To the best of our knowledge, however, these results still require extra conditions, including but not limited to the assumption that the abstract constraint set is the whole space. The aim of this work is to design an accelerated mirror descent algorithm that uses a convex combination of essentially the whole iteration trajectory, and to show that a local Lipschitz assumption on the associated Legendre function, together with the Kurdyka-Łojasiewicz property, is sufficient to recover its convergence results. Beyond the theoretical improvement in the convergence analysis, the algorithm also exhibits possible computational advantages, providing an interesting option for practical problems.
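To make the mirror descent framework concrete, the following is a minimal sketch of one classical instance: minimizing a smooth function over the probability simplex with the negative-entropy Legendre kernel, for which the Bregman step has the closed form of exponentiated gradient. The objective, step size, and iteration count are illustrative assumptions only; the talk's actual scheme is accelerated and additionally averages over the whole iteration trajectory.

```python
import numpy as np

# Illustrative smooth objective f(x) = 0.5 * ||x - b||^2; its minimizer b
# happens to lie inside the simplex, so the iterates should converge to it.
b = np.array([0.2, 0.5, 0.3])

def grad_f(x):
    return x - b

def mirror_step(x, t):
    # Bregman/mirror step with the entropy kernel h(x) = sum_i x_i log x_i:
    # x+ = argmin { <grad f(x), u> + (1/t) D_h(u, x) : u in simplex },
    # which reduces to x+ ∝ x * exp(-t * grad f(x)) and stays in the simplex.
    y = x * np.exp(-t * grad_f(x))
    return y / y.sum()

x = np.ones(3) / 3.0          # start at the simplex barycenter
for _ in range(200):
    x = mirror_step(x, t=1.0)
```

Here f is 1-smooth relative to the entropy kernel on the simplex, so the unit step size is admissible and the iterates approach b.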
Speaker bio: Gao Xue (高雪) is a lecturer at the School of Science, Hebei University of Technology, and a university-level "Yuanguang Scholar". Her research focuses on numerical algorithms for nonconvex, nonsmooth optimization problems and their applications. She has led one Youth Program grant of the National Natural Science Foundation of China and one Science and Technology Project of the Higher Education Institutions of Hebei Province.
Host: Cui Chunfeng (崔春风)