A1 Refereed original research article in a scientific journal
Nonsmooth Optimization-Based Hyperparameter-Free Neural Networks for Large-Scale Regression
Authors: Karmitsa Napsu, Taheri Sona, Joki Kaisa, Paasivirta Pauliina, Bagirov Adil M., Mäkelä Marko M.
Publisher: Multidisciplinary Digital Publishing Institute (MDPI)
Publication year: 2023
Journal: Algorithms
Journal name in source: Algorithms
Article number: 444
Volume: 16
Issue: 9
ISSN: 1999-4893
eISSN: 1999-4893
DOI: https://doi.org/10.3390/a16090444
Web address: https://www.mdpi.com/1999-4893/16/9/444
Self-archived copy’s web address: https://research.utu.fi/converis/portal/detail/Publication/181679264
In this paper, a new nonsmooth optimization-based algorithm for solving large-scale regression problems is introduced. The regression problem is modeled as a fully connected feedforward neural network with one hidden layer, a piecewise linear activation function, and the L1-loss function. A modified version of the limited memory bundle method is applied to minimize this nonsmooth objective. In addition, a novel constructive approach for automatically determining a suitable number of hidden nodes is developed. Finally, large real-world data sets are used to evaluate the proposed algorithm and to compare it with some state-of-the-art neural network algorithms for regression. The results demonstrate the superiority of the proposed algorithm as a predictive tool on most of the data sets used in the numerical experiments.
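The model described in the abstract (one hidden layer, piecewise linear activation, L1 loss) can be sketched as follows. This is a minimal illustration only: the choice of NumPy, the use of ReLU as the piecewise linear activation, and the layer shapes are assumptions for the sketch, not details taken from the paper itself.

```python
import numpy as np

def relu(z):
    # ReLU, max(z, 0), is one common piecewise linear activation
    return np.maximum(z, 0.0)

def predict(X, W1, b1, w2, b2):
    # One-hidden-layer fully connected feedforward network:
    # piecewise linear hidden layer followed by a linear output.
    H = relu(X @ W1 + b1)   # hidden activations, shape (n_samples, n_hidden)
    return H @ w2 + b2      # predictions, shape (n_samples,)

def l1_loss(X, y, W1, b1, w2, b2):
    # Nonsmooth L1 loss: sum of absolute residuals.
    # Both the loss and the activation make the objective nonsmooth,
    # which motivates a bundle-type method rather than plain gradient descent.
    return np.abs(predict(X, W1, b1, w2, b2) - y).sum()

# Tiny synthetic example (hypothetical sizes: 5 samples, 3 inputs, 4 hidden nodes)
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
y = rng.normal(size=5)
W1 = rng.normal(size=(3, 4))
b1 = np.zeros(4)
w2 = rng.normal(size=4)
b2 = 0.0
loss = l1_loss(X, y, W1, b1, w2, b2)
```

In the paper this nonsmooth objective is minimized with a modified limited memory bundle method; the sketch above only defines the objective, not the optimizer.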
Downloadable publication: This is an electronic reprint of the original article.