An improved radial basis function network with network weight adjustment for training real-world nonlinear datasets

Lim Eng Aik, Tan Wei Hong, Ahmad Kadri Junoh

Abstract


In neural networks, accuracy depends mainly on two factors: the centers and the network weights. Gradient descent is the most widely used weight-adjustment algorithm in neural network training. However, the method is known to be easily trapped in local minima, and it suffers from the random weights generated for the input-to-hidden-layer connections during the initial stage of training. Over the years, the performance of radial basis function networks (RBFN) has been improved from several perspectives, from the centroid initialization problem to the weight-correction stage. Unfortunately, existing solutions do not provide a good trade-off between the quality and the efficiency of the weights produced by the algorithm. To solve this problem, an improved gradient descent algorithm is proposed for finding the initial weights and refining the overall network weights; this improved algorithm is incorporated into the RBFN training algorithm at the weight-update step. Hence, this paper presents an improved RBFN whose training algorithm sharpens the weight adjustment performed during training. The proposed training algorithm, which uses the improved gradient descent algorithm for weight adjustment, achieves significantly better predictions than the standard RBFN. The training algorithm was implemented in the MATLAB environment, and the resulting network, called IRBFN, was tested against the standard RBFN on four nonlinear functions from the literature and four real-world application problems: an air pollutant problem, a biochemical oxygen demand (BOD) problem, a phytoplankton problem, and the forex pair EURUSD. The root mean square error (RMSE) values of IRBFN are compared with those of the standard RBFN. IRBFN yielded promising results, with an average improvement of more than 40 percent in RMSE.
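The abstract does not reproduce the paper's improved gradient descent rule, but the baseline it improves on can be illustrated. The sketch below is a minimal standard RBFN with Gaussian basis functions, randomly initialized output weights, and plain gradient descent on those weights; the function names, hyperparameters, and the toy sine benchmark are illustrative assumptions, not the authors' MATLAB code:

```python
import numpy as np

def rbf_design_matrix(X, centers, sigma):
    """Gaussian basis: phi_j(x) = exp(-||x - c_j||^2 / (2 * sigma^2))."""
    sq_dist = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-sq_dist / (2.0 * sigma ** 2))

def rmse(y_true, y_pred):
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def train_rbfn(X, y, n_centers=10, sigma=0.2, lr=0.1, epochs=500, seed=0):
    rng = np.random.default_rng(seed)
    # Centers picked as a random subset of training points (a common heuristic;
    # the paper also discusses the centroid initialization problem).
    centers = X[rng.choice(len(X), n_centers, replace=False)]
    Phi = rbf_design_matrix(X, centers, sigma)
    # Random initial hidden-to-output weights -- the stage the paper targets.
    w = rng.normal(scale=0.1, size=n_centers)
    history = [rmse(y, Phi @ w)]
    for _ in range(epochs):
        err = Phi @ w - y
        w -= lr * (Phi.T @ err) / len(X)  # gradient of the mean squared error
        history.append(rmse(y, Phi @ w))
    return centers, w, history

# Toy nonlinear benchmark: y = sin(2*pi*x) on [0, 1].
rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(100, 1))
y = np.sin(2.0 * np.pi * X[:, 0])
centers, w, history = train_rbfn(X, y)
print(f"RMSE before: {history[0]:.3f}  after: {history[-1]:.3f}")
```

The paper's contribution replaces the plain update inside the loop (and the random initial `w`) with its improved scheme; the RMSE comparison mirrors how IRBFN is evaluated against the standard RBFN.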

Keywords


Gradient descent; Improved RBFN; Neural network; Radial basis function network; Weight adjustment



DOI: http://doi.org/10.11591/ijai.v8.i1.pp63-76




This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.