
Applied Mathematics and Statistics


[Seminar] Accelerated First-order Methods for Large-scale Optimization

Author: Applied Mathematics & Statistics | Date: 2019.04.05

Speaker: Donghwan Kim, KAIST
Place: B203
Time: 

Abstract
Many modern applications, such as machine learning and statistics, require solving high-dimensional optimization problems. First-order methods, such as the gradient method and the proximal point method, are widely used for such large-scale problems because their per-iteration computational cost depends only mildly on the problem dimension. However, they suffer from slow convergence compared to second-order methods such as Newton's method. Accelerating first-order methods has therefore received great interest, leading to the development of the conjugate gradient method, the heavy-ball method, and Nesterov's fast gradient method, which this talk will review. The talk will then present recently proposed accelerated first-order methods, the optimized gradient method (OGM) and OGM-G.
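To make the acceleration concrete, below is a minimal Python sketch (not taken from the talk) contrasting the plain gradient method with Nesterov's fast gradient method on a synthetic least-squares problem. The problem data, the step size 1/L, and the momentum schedule t_{k+1} = (1 + sqrt(1 + 4 t_k^2)) / 2 follow the standard textbook form of the method; all names and iteration counts here are illustrative.

import numpy as np

# Synthetic least-squares problem: minimize f(x) = ||Ax - b||^2 / 2.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 50))
b = rng.standard_normal(200)
L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient: sigma_max(A)^2

def grad(x):
    return A.T @ (A @ x - b)

def gradient_method(x0, n_iter):
    # Plain gradient descent: O(1/k) rate on the function-value gap.
    x = x0.copy()
    for _ in range(n_iter):
        x -= grad(x) / L
    return x

def nesterov_fgm(x0, n_iter):
    # Nesterov's fast gradient method: O(1/k^2) rate via momentum extrapolation.
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(n_iter):
        y_next = x - grad(x) / L                       # gradient step at the extrapolated point
        t_next = (1 + np.sqrt(1 + 4 * t**2)) / 2       # momentum parameter update
        x = y_next + (t - 1) / t_next * (y_next - y)   # extrapolation using the previous iterate
        y, t = y_next, t_next
    return y

x0 = np.zeros(50)
x_star = np.linalg.lstsq(A, b, rcond=None)[0]
f = lambda x: 0.5 * np.linalg.norm(A @ x - b) ** 2
print("gradient method gap:", f(gradient_method(x0, 100)) - f(x_star))
print("fast gradient gap:  ", f(nesterov_fgm(x0, 100)) - f(x_star))

Both methods use the same per-iteration cost (one gradient evaluation), yet the momentum term improves the worst-case rate from O(1/k) to O(1/k^2); OGM refines the extrapolation coefficients further to tighten the constant in that bound.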