Aggregate subgradient method for nonsmooth DC optimization




Adil M. Bagirov, Sona Taheri, Kaisa Joki, Napsu Karmitsa, Marko M. Mäkelä

Publisher: Springer Heidelberg

Year: 2021

Journal: Optimization Letters (OPTIM LETT)

Volume: 15

Pages: 83–96 (14 pages)

ISSN: 1862-4472

eISSN: 1862-4480

DOI: https://doi.org/10.1007/s11590-020-01586-z

Article: https://link.springer.com/article/10.1007/s11590-020-01586-z

Portal record: https://research.utu.fi/converis/portal/detail/Publication/48590187



The aggregate subgradient method is developed for solving unconstrained nonsmooth difference of convex (DC) optimization problems. The proposed method shares some similarities with both the subgradient and the bundle methods. Aggregate subgradients are defined as convex combinations of subgradients computed at null steps between two serious steps. At each iteration, the search direction is found using only two subgradients: the aggregate subgradient and a subgradient computed at the current null step. It is proved that the proposed method converges to a critical point of the DC optimization problem and that the number of null steps between two serious steps is finite. The new method is tested on a set of academic test problems and compared with several other nonsmooth DC optimization solvers.
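To make the idea of aggregation concrete, below is a minimal Python sketch of an aggregate-subgradient-style iteration on the toy DC function f(x) = ‖x‖₁ − ‖x‖₂. This is an illustration only, not the authors' implementation: the oracles `f1`, `sub_f1`, `f2`, `sub_f2`, the descent test constant, and the step-size halving rule are assumptions chosen for readability, and the paper's actual parameters, proximity control, and convergence safeguards are not reproduced.

```python
import numpy as np

# Illustrative sketch only (not the method from the paper): minimize
# f = f1 - f2 with f1 convex, f2 convex, using an aggregate subgradient
# built as a convex combination of subgradients collected at null steps.

def f1(x):            # convex component: f1(x) = ||x||_1
    return np.abs(x).sum()

def sub_f1(x):        # a subgradient of f1 (sign vector; 0 mapped to +1)
    return np.where(x >= 0, 1.0, -1.0)

def f2(x):            # convex component: f2(x) = ||x||_2
    return np.linalg.norm(x)

def sub_f2(x):        # a subgradient of f2 (0 is valid at the origin)
    n = np.linalg.norm(x)
    return x / n if n > 0 else np.zeros_like(x)

def f(x):
    return f1(x) - f2(x)

def aggregate_subgradient_sketch(x, t=1.0, eps=1e-6,
                                 max_outer=100, max_null=50):
    """Toy aggregate-subgradient loop; parameters are assumptions."""
    for _ in range(max_outer):
        xi2 = sub_f2(x)               # f2-subgradient, frozen between serious steps
        agg = sub_f1(x) - xi2         # initial aggregate DC subgradient
        serious = False
        for _ in range(max_null):     # null steps between two serious steps
            norm = np.linalg.norm(agg)
            if norm < eps:            # approximate criticality
                return x
            y = x - t * agg / norm    # trial point along -aggregate
            if f(y) <= f(x) - 0.2 * t * norm:   # sufficient decrease
                x, serious = y, True            # serious step
                break
            # Null step: combine ONLY two subgradients, the aggregate and
            # the new one, minimizing ||lam*agg + (1-lam)*xi_new||^2 over [0,1].
            xi_new = sub_f1(y) - xi2
            diff = agg - xi_new
            den = float(diff @ diff)
            lam = min(max(-(diff @ xi_new) / den, 0.0), 1.0) if den > 0 else 0.0
            agg = lam * agg + (1.0 - lam) * xi_new
        if not serious:
            t *= 0.5                  # shrink step size if inner loop stalls

    return x

x_star = aggregate_subgradient_sketch(np.array([3.0, -2.0]))
print(x_star, f(x_star))   # expect f near 0, the minimum of ||x||_1 - ||x||_2
```

The closed-form choice of `lam` is the exact minimizer of the two-subgradient quadratic clipped to [0, 1], which is what keeps each direction-finding step cheap compared with a full bundle subproblem.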

Last updated on 2024-11-26 at 14:38