A1 Peer-reviewed original article in a scientific journal

Nonsmooth Optimization-Based Hyperparameter-Free Neural Networks for Large-Scale Regression




Authors: Karmitsa Napsu, Taheri Sona, Joki Kaisa, Paasivirta Pauliina, Bagirov Adil M., Mäkelä Marko M.

Publisher: Multidisciplinary Digital Publishing Institute (MDPI)

Publication year: 2023

Journal: Algorithms

Journal name in database: Algorithms

Article number: 444

Volume: 16

Issue: 9

ISSN: 1999-4893

eISSN: 1999-4893

DOI: https://doi.org/10.3390/a16090444

Web address: https://www.mdpi.com/1999-4893/16/9/444

Self-archived copy's address: https://research.utu.fi/converis/portal/detail/Publication/181679264


Abstract

In this paper, a new nonsmooth optimization-based algorithm for solving large-scale regression problems is introduced. The regression problem is modeled as a fully connected feedforward neural network with one hidden layer, a piecewise linear activation, and the L1-loss function. A modified version of the limited memory bundle method is applied to minimize this nonsmooth objective. In addition, a novel constructive approach for automated determination of the proper number of hidden nodes is developed. Finally, large real-world data sets are used to evaluate the proposed algorithm and to compare it with some state-of-the-art neural network algorithms for regression. The results demonstrate the superiority of the proposed algorithm as a predictive tool on most data sets used in the numerical experiments.
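The nonsmooth objective described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the network shape (one hidden layer, linear output) and the L1 loss follow the abstract, while the specific piecewise linear activation (ReLU), function names, and random data below are illustrative assumptions. The paper minimizes this objective with a modified limited memory bundle method; here we only evaluate it for given weights.

```python
import numpy as np

def l1_objective(W1, b1, w2, b2, X, y):
    """Mean absolute error of a one-hidden-layer network on (X, y).

    ReLU stands in as one example of a piecewise linear activation;
    the L1 loss makes the objective nonsmooth.
    """
    hidden = np.maximum(0.0, X @ W1 + b1)  # piecewise linear activation
    pred = hidden @ w2 + b2                # linear output layer
    return np.mean(np.abs(pred - y))       # nonsmooth L1 loss

# Tiny usage example with random data and random (untrained) weights.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))   # 100 samples, 5 input features
y = rng.normal(size=100)        # regression targets
W1 = rng.normal(size=(5, 10))   # 10 hidden nodes (hypothetical count)
b1 = np.zeros(10)
w2 = rng.normal(size=10)
b2 = 0.0
loss = l1_objective(W1, b1, w2, b2, X, y)
```

In the paper, the number of hidden nodes is not fixed in advance as it is here; it is chosen automatically by the constructive approach the abstract mentions.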


Downloadable publication

This is an electronic reprint of the original article.
This reprint may differ from the original in pagination and typographic detail. Please cite the original version.





Last updated on 2025-03-27 at 21:56