Application of Conjugate Gradient Method for Solution of Regression Models
Abstract
The conjugate gradient (CG) method has played an important role in solving large-scale unconstrained optimization problems arising in economics, engineering, the sciences, and many other fields, owing to its simplicity, low memory requirements, and global convergence properties. Recent studies of conjugate gradient methods focus on modifications of the CG parameter. However, many of the resulting algorithms are complex and difficult to implement when solving unconstrained optimization problems, and there is not much research on applications of conjugate gradient methods to real-world problems. Thus, in this paper, an efficient conjugate gradient method is applied to a real-world problem in regression analysis. A data set is taken and transformed into an objective function. The proposed CG algorithm is used to minimize the corresponding objective function, and its performance is compared with the classical least squares method. The accuracy of each method in approximating the function of best fit for the given data set is measured by computing its relative error. Based on the error values of the tested methods, the two methods show no difference in overall accuracy up to the 11th decimal place. This shows that the proposed CG method is very efficient and a good alternative to the least squares method.
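The workflow described above can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm or data: the five (x, y) points are invented for demonstration, a simple linear model is assumed, and the CG routine shown is the classical linear conjugate gradient applied to the normal equations of the least-squares objective. Its result is then compared against NumPy's least-squares solver via a relative error, mirroring the comparison the abstract describes.

```python
import numpy as np

# Hypothetical data set (illustrative values only, not from the paper).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Assume a linear model y ≈ b0 + b1*x. The objective function is the
# sum of squared residuals f(beta) = ||A @ beta - y||^2.
A = np.column_stack([np.ones_like(x), x])

def conjugate_gradient(M, b, tol=1e-12, max_iter=1000):
    """Solve M z = b for symmetric positive-definite M via linear CG."""
    z = np.zeros_like(b)
    r = b - M @ z           # residual of the linear system
    p = r.copy()            # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Mp = M @ p
        alpha = rs_old / (p @ Mp)   # exact step length along p
        z = z + alpha * p
        r = r - alpha * Mp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # next conjugate direction
        rs_old = rs_new
    return z

# Minimizing f(beta) is equivalent to the normal equations
# (A^T A) beta = A^T y, which CG solves directly.
beta_cg = conjugate_gradient(A.T @ A, A.T @ y)

# Classical least squares solution for comparison.
beta_ls, *_ = np.linalg.lstsq(A, y, rcond=None)

# Relative error between the two fitted coefficient vectors.
rel_err = np.linalg.norm(beta_cg - beta_ls) / np.linalg.norm(beta_ls)
print("CG coefficients:", beta_cg)
print("LS coefficients:", beta_ls)
print("relative error:", rel_err)
```

On well-conditioned problems of this size the two coefficient vectors typically agree to near machine precision, which is consistent with the abstract's finding that the methods differ only beyond the 11th decimal place.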