Performance Evaluation of Training Algorithms in Backpropagation Neural Network Approach to Blast-Induced Ground Vibration Prediction

Authors

  • Clement Kweku Arthur (Dr), Department of Mining Engineering, University of Mines and Technology, P.O. Box 237, Tarkwa, Ghana, https://orcid.org/0000-0002-4954-1532
  • Victor Amoako Temeng (Prof), Department of Mining Engineering, University of Mines and Technology, P.O. Box 237, Tarkwa, Ghana
  • Yao Yevenyo Ziggah (Dr), Department of Geomatic Engineering, University of Mines and Technology, P.O. Box 237, Tarkwa, Ghana, https://orcid.org/0000-0002-9940-1845

Abstract

Backpropagation Neural Network (BPNN) is an artificial intelligence technique that has seen several applications in many fields of science and engineering. It is well known that developing an effective and accurate BPNN model depends on an appropriate training algorithm, transfer function, number of hidden layers and number of hidden neurons. Although numerous factors contribute to the development of a BPNN model, the training algorithm is key to achieving optimum BPNN model performance. This study evaluates and compares the performance of 13 training algorithms in BPNN for the prediction of blast-induced ground vibration. The training algorithms considered are: Levenberg-Marquardt, Bayesian Regularisation, Broyden–Fletcher–Goldfarb–Shanno (BFGS) Quasi-Newton, Resilient Backpropagation, Scaled Conjugate Gradient, Conjugate Gradient with Powell-Beale Restarts, Fletcher-Powell Conjugate Gradient, Polak-Ribière Conjugate Gradient, One Step Secant, Gradient Descent with Adaptive Learning Rate, Gradient Descent with Momentum, Gradient Descent, and Gradient Descent with Momentum and Adaptive Learning Rate. The performance of the various training algorithms used to build the BPNN models was evaluated using ranking values for four performance indicators: mean squared error (MSE), correlation coefficient (R), number of training epochs (iterations) and duration to convergence. The overall ranking results showed that the BFGS Quasi-Newton algorithm outperformed the other training algorithms, even though the Levenberg-Marquardt algorithm had the best computational speed and required the smallest number of epochs.
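To illustrate the ranking-based comparison described in the abstract, the sketch below trains a small single-hidden-layer network with three of scikit-learn's training solvers (lbfgs, adam, sgd) as stand-ins for the 13 algorithms compared in the paper, then ranks them by MSE, R, epoch count and training time. This is a minimal sketch under assumptions, not the authors' implementation: the synthetic dataset, network size, solver set and the rank-sum aggregation are all illustrative choices.

```python
import time
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Synthetic stand-in for blast monitoring data: two inputs (e.g. charge mass,
# distance) and a peak-particle-velocity-like target; assumed for illustration.
rng = np.random.default_rng(42)
X = rng.uniform([100.0, 50.0], [1000.0, 800.0], size=(200, 2))
y = 1143.0 * (X[:, 1] / np.sqrt(X[:, 0])) ** -1.6 + rng.normal(0.0, 0.05, 200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

results = []
for solver in ("lbfgs", "adam", "sgd"):  # stand-ins for the 13 algorithms
    model = MLPRegressor(hidden_layer_sizes=(10,), solver=solver,
                         max_iter=5000, random_state=1)
    start = time.perf_counter()
    model.fit(X_train, y_train)
    duration = time.perf_counter() - start

    pred = model.predict(X_test)
    mse = mean_squared_error(y_test, pred)
    r = np.corrcoef(y_test, pred)[0, 1]          # correlation coefficient R
    results.append((solver, mse, r, model.n_iter_, duration))

# Rank each indicator (1 = best) and sum the ranks: lower is better for MSE,
# epochs and time; higher is better for R. One plausible reading of a
# ranking-value scheme, used here only for demonstration.
def ranks(values, higher_is_better=False):
    order = sorted(range(len(values)), key=lambda i: values[i],
                   reverse=higher_is_better)
    return {i: rank + 1 for rank, i in enumerate(order)}

mse_rank = ranks([row[1] for row in results])
r_rank = ranks([row[2] for row in results], higher_is_better=True)
epoch_rank = ranks([row[3] for row in results])
time_rank = ranks([row[4] for row in results])

for i, (solver, mse, r, epochs, dur) in enumerate(results):
    total = mse_rank[i] + r_rank[i] + epoch_rank[i] + time_rank[i]
    print(f"{solver:6s} MSE={mse:.4f} R={r:.3f} epochs={epochs:4d} "
          f"time={dur:.2f}s total_rank={total}")
```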

Published

2020-06-30