Journal article
Adaptive Learning for Robust Radial Basis Function Networks
AK Seghouane, N Shokouhi
IEEE Transactions on Cybernetics | Published : 2021
Abstract
This article addresses the robust estimation of the output-layer linear parameters in a radial basis function network (RBFN). A prominent method for estimating the output-layer parameters in an RBFN with predetermined hidden-layer parameters is least-squares estimation, which is the maximum-likelihood (ML) solution in the specific case of Gaussian noise. We highlight the connection between ML estimation and minimizing the Kullback-Leibler (KL) divergence between the actual noise distribution and the assumed Gaussian noise. Based on this connection, a method is proposed using a variant of a generalized KL divergence, which is known to be more robust to outliers in the pattern…
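The baseline the abstract starts from, least-squares estimation of the output-layer weights given a fixed hidden layer, can be sketched as follows. This is a minimal illustration of that standard step, not the robust method the article proposes; the target function, centers, and width are illustrative assumptions.

```python
import numpy as np

# Sketch: least-squares estimation of RBFN output-layer weights with a
# predetermined hidden layer. Under i.i.d. Gaussian noise this least-squares
# solution coincides with the maximum-likelihood (ML) estimate.

rng = np.random.default_rng(0)

# Illustrative training data: noisy samples of a 1-D target function.
x = np.linspace(-3, 3, 200)[:, None]                  # inputs, shape (200, 1)
y = np.sin(x[:, 0]) + 0.05 * rng.standard_normal(200)  # noisy targets

# Predetermined hidden layer: fixed Gaussian centers and a fixed width.
centers = np.linspace(-3, 3, 10)[None, :]             # shape (1, 10)
width = 0.8
Phi = np.exp(-(x - centers) ** 2 / (2 * width ** 2))  # design matrix (200, 10)

# Output-layer weights: ordinary least squares, w = argmin_w ||Phi w - y||^2.
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

pred = Phi @ w
rmse = np.sqrt(np.mean((pred - y) ** 2))
```

A single outlier in `y` can pull `w` far from the clean solution, which is the failure mode the article's generalized-KL-divergence variant is designed to mitigate.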
Grants
Awarded by Australian Research Council
Funding Acknowledgements
This work was supported by the Australian Research Council under Grant FT 130101394. This article was recommended by Associate Editor W. X. Zheng.