Refereed journal article or data article (A1)
Aggregate subgradient method for nonsmooth DC optimization
List of Authors: Bagirov Adil M., Taheri Sona, Joki Kaisa, Karmitsa Napsu, Mäkelä Marko M.
Publisher: SPRINGER HEIDELBERG
Publication year: 2021
Journal: Optimization Letters
Journal name in source: OPTIMIZATION LETTERS
Journal acronym: OPTIM LETT
Volume number: 15
Start page: 83
End page: 96
Number of pages: 14
ISSN: 1862-4472
eISSN: 1862-4480
DOI: https://doi.org/10.1007/s11590-020-01586-z
URL: https://link.springer.com/article/10.1007/s11590-020-01586-z
Self-archived copy’s web address: https://research.utu.fi/converis/portal/detail/Publication/48590187
The aggregate subgradient method is developed for solving unconstrained nonsmooth difference-of-convex (DC) optimization problems. The proposed method shares some similarities with both subgradient and bundle methods. Aggregate subgradients are defined as a convex combination of subgradients computed at null steps between two serious steps. At each iteration, the search direction is found using only two subgradients: the aggregate subgradient and a subgradient computed at the current null step. It is proved that the proposed method converges to a critical point of the DC optimization problem and that the number of null steps between two serious steps is finite. The new method is tested on some academic test problems and compared with several other nonsmooth DC optimization solvers.
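The abstract above outlines the algorithmic idea; below is a minimal Python sketch of one way such a scheme can be realized, not the authors' actual algorithm or code. It assumes a toy DC decomposition f = f1 - f2 with f1(x) = ||x||_1 and f2(x) = ||x||_inf; the direction finding (minimum-norm point of the segment between the aggregate subgradient and the current subgradient), the sufficient-decrease parameter m, the step-halving rule, and all names (aggregate_subgradient_sketch, min_norm_combination) are illustrative assumptions.

```python
import numpy as np

# Toy DC decomposition (illustrative, not from the paper):
# f(x) = f1(x) - f2(x) = ||x||_1 - ||x||_inf = min(|x1|, |x2|) in 2-D.

def f1(x):
    return np.sum(np.abs(x))

def g1(x):  # a subgradient of the l1 norm
    return np.sign(x)

def f2(x):
    return np.max(np.abs(x))

def g2(x):  # a subgradient of the l-infinity norm
    i = np.argmax(np.abs(x))
    g = np.zeros_like(x)
    g[i] = np.sign(x[i]) if x[i] != 0 else 1.0
    return g

def f(x):
    return f1(x) - f2(x)

def min_norm_combination(g_agg, g_new):
    """Minimum-norm point of the segment between two subgradients:
    argmin_{t in [0,1]} ||t*g_agg + (1-t)*g_new||^2, solved in closed form."""
    diff = g_agg - g_new
    denom = diff @ diff
    t = 0.0 if denom == 0.0 else float(np.clip(-(g_new @ diff) / denom, 0.0, 1.0))
    return t * g_agg + (1.0 - t) * g_new

def aggregate_subgradient_sketch(x, max_iter=200, tol=1e-6, m=0.2):
    """Hypothetical aggregate-subgradient loop; the descent parameter m and
    the step-halving rule are illustrative choices."""
    y = x.copy()                       # current trial (null-step) point
    g_agg = g1(y) - g2(y)              # aggregate starts as one DC subgradient
    step = 1.0
    for _ in range(max_iter):
        # Search direction from only two vectors: the aggregate subgradient
        # and a DC subgradient at the current trial point.
        v = min_norm_combination(g_agg, g1(y) - g2(y))
        nv = np.linalg.norm(v)
        if nv < tol:                   # approximate criticality test
            break
        d = -v / nv                    # normalized search direction
        y_new = x + step * d
        if f(y_new) <= f(x) - m * step * nv:   # serious step: accept the point
            x, y = y_new, y_new.copy()
            g_agg = g1(x) - g2(x)      # reset the aggregate
            step = 1.0
        else:                          # null step: keep x, enrich the aggregate
            y = y_new
            g_agg = v                  # convex combination of past subgradients
            step *= 0.5
    return x

x_star = aggregate_subgradient_sketch(np.array([2.0, -3.0]))
print(x_star, f(x_star))   # reaches a point on an axis, where f = 0
```

The closed-form minimizer over a two-point segment is what makes the two-subgradient choice cheap: no quadratic programming subproblem over a full bundle of stored subgradients is needed to find the search direction.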
Downloadable publication: This is an electronic reprint of the original article.