A Novel approach for Training Neural Network using Linear Programming

Ms. Prabavathi, Mr. Yuvraj Talukdar, Mr. Yash Kimtani, Mr. Ajit Kumar Shah

Abstract

In the field of deep learning, neural network training is an active area of research. The backpropagation algorithm [1] with gradient descent is widely used for training deep neural networks. The algorithm optimizes the parameters of the network but is computationally expensive. The network must be trained over many iterations, and training for too many iterations may lead to overfitting. The connections of a neural network are fixed before training begins; during training only the weights are modified, not the connections. The backpropagation algorithm uses gradient descent to compute the network weights: it compares the output of the network with the desired output and then adjusts the connection weights to minimize the difference between the actual and the desired output. In this paper we propose a novel optimization technique to replace the backpropagation algorithm. This new optimization algorithm makes use of the simplex algorithm [2] as one of its components, along with several other algorithmic techniques. We tested the new system on the breast cancer dataset, the space shuttle dataset, and a few others, and achieved good results.
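The following is a minimal sketch, assuming NumPy and SciPy are available, of the two building blocks the abstract refers to: a gradient-descent weight update of the kind backpropagation performs, and a small linear program solved with SciPy's linprog solver (a stand-in for the simplex method). The toy data and variable names are illustrative assumptions; this is not the authors' proposed training algorithm.

```python
# Illustrative sketch only: (1) one gradient-descent weight update for a single
# linear neuron, and (2) a small linear program solved with SciPy's LP solver.
import numpy as np
from scipy.optimize import linprog

# --- (1) One gradient-descent step, as backpropagation would apply per layer ---
X = np.array([[0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])  # toy inputs
y = np.array([1.0, 1.0, 0.0])                        # toy desired outputs
w = np.zeros(2)                                      # initial weights
lr = 0.1                                             # learning rate
pred = X @ w                                         # actual network output
grad = X.T @ (pred - y) / len(y)                     # gradient of mean squared error w.r.t. w
w -= lr * grad                                       # adjust weights to reduce the error

# --- (2) A small linear program: minimize c^T x s.t. A_ub x <= b_ub, x >= 0 ---
c = [-1.0, -2.0]                    # objective coefficients (maximize x1 + 2*x2)
A_ub = [[1.0, 1.0], [1.0, 3.0]]     # inequality constraint matrix
b_ub = [4.0, 6.0]                   # inequality constraint bounds
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])

print("Weights after one gradient-descent step:", w)
print("LP solution:", res.x)
```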

Keywords: Linear Programming, Backpropagation Algorithm, Neural Network.

Published
2020-04-09
How to Cite
Ms. Prabavathi, Mr. Yuvraj Talukdar, Mr. Yash Kimtani, Mr. Ajit Kumar Shah. (2020). A Novel approach for Training Neural Network using Linear Programming. International Journal of Advanced Science and Technology, 29(3), 8224 - 8236. Retrieved from http://sersc.org/journals/index.php/IJAST/article/view/8621