Statistical learning for recommending (robust) nonlinear regression methods

By: J. Kalina and J. Tichavský
Open Access | Dec 2019

Abstract

We are interested in comparing the performance of various nonlinear estimators of the parameters of the standard nonlinear regression model. While the standard nonlinear least squares estimator is vulnerable to the presence of outlying measurements in the data, several robust alternatives exist. However, it is not clear which estimator should be used for a given dataset, and this question remains extremely difficult (or perhaps infeasible) to answer theoretically. Metalearning represents a computationally intensive methodology for the optimal selection of algorithms (or methods) and is used here to predict the most suitable nonlinear estimator for a particular dataset. The classification rule is learned over a training database of 24 publicly available datasets. The results of the primary learning give an interesting argument in favor of the nonlinear least weighted squares estimator, which turns out to be the most suitable one for the majority of the datasets. The subsequent metalearning reveals that tests of normality and heteroscedasticity play a crucial role in finding the most suitable nonlinear estimator.
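
The abstract describes the pipeline only at a high level, so the following Python sketch illustrates the general metalearning idea under stated assumptions: the specific meta-features (a Shapiro-Wilk normality test and a Spearman-based heteroscedasticity proxy), the decision-tree classifier, the synthetic datasets, the exponential model, and the estimator labels ("NLS", "LTS", "LWS") are all hypothetical choices for illustration, not the authors' actual feature set, learner or results.

```python
# Minimal metalearning sketch (illustrative only; not the authors' exact pipeline).
# For each dataset we fit a nonlinear model by least squares, derive meta-features
# from the residuals (a normality test and a heteroscedasticity proxy), and train
# a classifier that predicts which estimator is expected to perform best.
import numpy as np
from scipy import optimize, stats
from sklearn.tree import DecisionTreeClassifier


def meta_features(x, y, model, p0):
    """Meta-features of one regression dataset: Shapiro-Wilk p-value of the
    residuals, a Spearman-based heteroscedasticity proxy (correlation between
    |residuals| and fitted values), and the sample size."""
    popt, _ = optimize.curve_fit(model, x, y, p0=p0, maxfev=10000)
    fitted = model(x, *popt)
    resid = y - fitted
    _, p_normality = stats.shapiro(resid)
    rho_hetero, _ = stats.spearmanr(np.abs(resid), fitted)
    return [p_normality, rho_hetero, len(y)]


# Toy meta-database: one feature vector per dataset plus a label naming the
# estimator that "won" on it. The labels below are random placeholders; in the
# paper they come from the primary learning over 24 publicly available datasets.
rng = np.random.default_rng(0)
model = lambda x, a, b: a * np.exp(b * x)            # example nonlinear model
X_meta, y_meta = [], []
for _ in range(24):                                   # 24 training datasets
    x = np.linspace(0.0, 2.0, 50)
    y = model(x, 2.0, 0.8) + rng.normal(0.0, 0.3, x.size)
    X_meta.append(meta_features(x, y, model, p0=[1.0, 1.0]))
    y_meta.append(rng.choice(["NLS", "LTS", "LWS"]))  # placeholder labels

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_meta, y_meta)

# Recommendation for a new dataset: compute its meta-features and classify.
print(clf.predict([meta_features(x, y, model, p0=[1.0, 1.0])]))
```

In the paper, the labels of the meta-database are produced by the primary learning step, in which the candidate estimators (including the nonlinear least weighted squares) are compared on each of the 24 datasets; the classifier then plays the role of the metalearning rule that recommends an estimator from dataset-level characteristics.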

DOI: https://doi.org/10.2478/jamsi-2019-0008 | Journal eISSN: 1339-0015 | Journal ISSN: 1336-9180
Language: English
Page range: 47 - 59
Published on: Dec 21, 2019
Published by: University of Ss. Cyril and Methodius in Trnava
In partnership with: Paradigm Publishing Services
Publication frequency: 2 issues per year

© 2019 J. Kalina, J. Tichavský, published by University of Ss. Cyril and Methodius in Trnava
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 3.0 License.