Refereed journal article or data article (A1)

Aggregate subgradient method for nonsmooth DC optimization




List of Authors: Bagirov Adil M., Taheri Sona, Joki Kaisa, Karmitsa Napsu, Mäkelä Marko M.

Publisher: SPRINGER HEIDELBERG

Publication year: 2021

Journal: Optimization Letters

Journal name in source: OPTIMIZATION LETTERS

Journal acronym: OPTIM LETT

Volume number: 15

Start page: 83

End page: 96

Number of pages: 14

ISSN: 1862-4472

eISSN: 1862-4480

DOI: http://dx.doi.org/10.1007/s11590-020-01586-z

URL: https://link.springer.com/article/10.1007/s11590-020-01586-z

Self-archived copy’s web address: https://research.utu.fi/converis/portal/detail/Publication/48590187


Abstract
The aggregate subgradient method is developed for solving unconstrained nonsmooth difference of convex (DC) optimization problems. The proposed method shares some similarities with both the subgradient and the bundle methods. Aggregate subgradients are defined as a convex combination of subgradients computed at null steps between two serious steps. At each iteration, search directions are found using only two subgradients: the aggregate subgradient and a subgradient computed at the current null step. It is proved that the proposed method converges to a critical point of the DC optimization problem and that the number of null steps between two serious steps is finite. The new method is tested on academic test problems and compared with several other nonsmooth DC optimization solvers.
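The abstract's two-subgradient direction finding can be illustrated with a standard bundle-style building block: the minimum-norm convex combination of the aggregate subgradient and the newest subgradient, which has a closed form for two vectors. This is a hedged sketch of that generic step only, not the authors' exact update rule; the function name `min_norm_direction` and its interface are assumptions for illustration.

```python
import numpy as np

def min_norm_direction(g_agg, g_new):
    """Sketch of a two-subgradient direction step (generic bundle-style
    rule, not necessarily the paper's exact update).

    Minimizes ||lam*g_agg + (1-lam)*g_new||^2 over lam in [0, 1] and
    returns the search direction (the negated minimizer), the new
    aggregate subgradient, and lam.
    """
    diff = g_agg - g_new
    denom = diff @ diff
    if denom == 0.0:
        lam = 0.0                      # identical subgradients: any lam works
    else:
        # Unconstrained minimizer of the quadratic in lam, clipped to [0, 1]
        lam = float(np.clip(-(g_new @ diff) / denom, 0.0, 1.0))
    g = lam * g_agg + (1.0 - lam) * g_new   # new aggregate subgradient
    return -g, g, lam
```

For example, with subgradients (1, 0) and (0, 1) the minimizer is lam = 0.5, giving the aggregate (0.5, 0.5) and direction (-0.5, -0.5). In the method described above, this aggregation is repeated over the null steps between two serious steps.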

Downloadable publication

This is an electronic reprint of the original article.
This reprint may differ from the original in pagination and typographic detail. Please cite the original version.




Last updated on 2022-07-04 at 17:55