Near Optimal Convergence of Back-Propagation Method using Harmony Search Algorithm

Abdirashid Salad Nur, Nor Haizan Mohd Radzi, Siti Mariyam Shamsuddin


Training Artificial Neural Networks (ANNs) is a significant and difficult task in the field of supervised learning, as performance depends on the underlying training algorithm as well as the outcome of the training process. In this paper, three training algorithms, namely the Back-Propagation (BP) algorithm, the Harmony Search Algorithm (HSA), and a hybrid of BP and HSA called BPHSA, are employed for the supervised training of Multi-Layer Perceptron (MLP) feed-forward Neural Networks (NNs), with special attention given to the hybrid BPHSA. A suitable data-representation structure for NNs is implemented for BPHSA-MLP, HSA-MLP, and BP-MLP. The proposed method is empirically tested and verified on five benchmark classification problems, namely the Iris, Glass, Cancer, Wine, and Thyroid datasets. The MSE, training time, and classification accuracy of the hybrid BPHSA are compared with those of the standard BP and the meta-heuristic HSA. The experiments show that the proposed method achieves better results in terms of convergence error and classification accuracy than BP-MLP and HSA-MLP, making BPHSA-MLP a promising algorithm for neural network training.
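The HSA component described in the abstract can be illustrated with a minimal sketch: Harmony Search treating the flattened weight vector of a small MLP as one "harmony" and improvising new candidates via memory consideration, pitch adjustment, and random selection. This is a hypothetical toy setup (XOR data, a 2-4-1 network, and textbook parameter values for HMS, HMCR, PAR, and bandwidth); the paper's actual network sizes, datasets, and the BP hybridisation step are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy XOR classification data (assumption; stands in for the benchmark datasets).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

N_IN, N_HID = 2, 4
DIM = N_IN * N_HID + N_HID + N_HID + 1  # hidden weights + hidden biases + output weights + output bias

def mse(w):
    """Forward pass of a 2-4-1 MLP for a flat weight vector w; returns MSE."""
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID]; i += N_HID
    b2 = w[i]
    h = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output unit
    return np.mean((out - y) ** 2)

# Harmony Search parameters (typical textbook values, assumed here).
HMS, HMCR, PAR, BW, ITERS = 20, 0.9, 0.3, 0.2, 3000
LOW, HIGH = -2.0, 2.0

# Initialise the harmony memory with random weight vectors.
memory = rng.uniform(LOW, HIGH, size=(HMS, DIM))
fitness = np.array([mse(h) for h in memory])
init_best = fitness.min()

for _ in range(ITERS):
    new = np.empty(DIM)
    for d in range(DIM):
        if rng.random() < HMCR:                 # memory consideration
            new[d] = memory[rng.integers(HMS), d]
            if rng.random() < PAR:              # pitch adjustment
                new[d] += BW * rng.uniform(-1, 1)
        else:                                   # random selection
            new[d] = rng.uniform(LOW, HIGH)
    f = mse(new)
    worst = np.argmax(fitness)
    if f < fitness[worst]:                      # replace the worst harmony
        memory[worst], fitness[worst] = new, f

best = memory[np.argmin(fitness)]
print(f"best MSE after {ITERS} improvisations: {fitness.min():.4f}")
```

A hybrid along the lines the abstract describes would then refine the best harmony with gradient-based BP updates, combining HSA's global exploration with BP's fast local convergence.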




Keywords: Artificial Neural Networks, Harmony Search, Backpropagation, Classification problem





This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.