A New Subgradient Based Method for Nonsmooth DC Programming




Adil M. Bagirov, Sona Taheri, Kaisa Joki, Napsu Karmitsa, Marko M. Mäkelä

Publisher: Turku Centre for Computer Science

Turku

2019

TUCS Technical Report 1201

ISBN: 978-952-12-3791-1

ISSN: 1239-1891

https://research.utu.fi/converis/portal/detail/Publication/44006803



An aggregate subgradient method is developed for solving unconstrained nonsmooth difference of convex (DC) optimization problems. The proposed method shares some similarities with both subgradient and bundle methods. Aggregate subgradients are defined as a convex combination of subgradients computed at null steps between two serious steps. At each iteration, the search direction is found using only two subgradients: the aggregate subgradient and a subgradient computed at the current null step. It is proved that the proposed method converges to a critical point of the DC optimization problem and that the number of null steps between two serious steps is finite. The new method is tested on some academic test problems and compared with several other nonsmooth DC optimization solvers.
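The two-subgradient direction-finding step mentioned in the abstract admits a compact illustration. The Python sketch below is not the authors' implementation; the function names, the weighting scheme, and the closed-form one-dimensional quadratic solve are illustrative assumptions. It shows how an aggregate subgradient can be formed as a convex combination of subgradients collected at null steps, and how a search direction can be obtained from just two subgradients by taking the negative of the minimum-norm point on the segment joining them.

import numpy as np

def aggregate_subgradient(subgradients, weights):
    """Convex combination of subgradients collected at null steps.

    `subgradients` is a list of k vectors of dimension n; `weights` are
    assumed non-negative and to sum to one (hypothetical weighting scheme).
    """
    weights = np.asarray(weights, dtype=float)
    return np.asarray(subgradients, dtype=float).T @ weights

def two_subgradient_direction(xi_agg, xi_new):
    """Search direction from the aggregate subgradient `xi_agg` and the
    subgradient `xi_new` computed at the current null step.

    The direction is the negative of the minimum-norm element of the segment
    [xi_agg, xi_new], found by solving the one-dimensional quadratic problem
    min_{lam in [0,1]} ||lam * xi_agg + (1 - lam) * xi_new||^2 in closed form.
    """
    diff = xi_agg - xi_new
    denom = diff @ diff
    if denom == 0.0:
        lam = 0.0                                   # both subgradients coincide
    else:
        lam = float(np.clip(-(diff @ xi_new) / denom, 0.0, 1.0))
    v = lam * xi_agg + (1.0 - lam) * xi_new         # minimum-norm convex combination
    return -v

# Example usage with made-up subgradients in R^2:
# xi_agg = aggregate_subgradient([[2.0, 0.0], [0.0, 2.0]], [0.5, 0.5])
# d = two_subgradient_direction(xi_agg, np.array([1.0, -1.0]))

A small norm of the minimum-norm combination indicates approximate criticality, while a nonzero result gives a descent-like direction; the actual method applies this idea within the DC framework with serious and null steps as described in the report.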

