The eigenstep method: A new iterative method for unconstrained quadratic optimization.
Mathematics and Statistics
CC BY-NC-ND 4.0
This thesis presents a new iterative method for the unconstrained minimization of convex quadratic functions. The method is a modification of the classical steepest descent method: both choose the negative gradient as the search direction, but they differ in the choice of step size. Whereas steepest descent uses the optimal (exact line search) step size, the proposed method uses the reciprocals of the eigenvalues of the Hessian matrix as step sizes; it is therefore referred to as the eigenstep method. It is shown that the eigenstep method terminates finitely, with the number of iterations required equal to the dimension of the problem, that is, the number of variables. Numerical examples illustrate the algorithm, and a comparison is made to other standard optimization methods, including the steepest descent method.

Dept. of Mathematics and Statistics. Paper copy at Leddy Library: Theses & Major Papers - Basement, West Bldg. / Call Number: Thesis2004 .B38. Source: Masters Abstracts International, Volume: 44-01, page: 0364. Thesis (M.Sc.)--University of Windsor (Canada), 2005.
Battaglia, John P., "The eigenstep method: A new iterative method for unconstrained quadratic optimization." (2005). Electronic Theses and Dissertations. 3975.
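The iteration described in the abstract can be sketched numerically. The following is an illustrative reconstruction from the abstract's description only (not the thesis's own code or notation): starting from an arbitrary point, step along the negative gradient of a convex quadratic, using the reciprocal of each Hessian eigenvalue as the step size, one eigenvalue per iteration. After n steps (n = number of variables), the iterate reaches the minimizer.

```python
import numpy as np

# Sketch of the eigenstep idea: minimize f(x) = 0.5 x^T A x - b^T x
# by taking negative-gradient steps whose step sizes are the
# reciprocals 1/lambda_i of the eigenvalues of the Hessian A.
# (Illustrative example; the matrix, vector, and starting point
# here are arbitrary choices, not taken from the thesis.)

rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)      # symmetric positive definite Hessian
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, b)   # exact minimizer, for comparison

eigvals = np.linalg.eigvalsh(A)  # eigenvalues of the Hessian
x = np.zeros(n)                  # arbitrary starting point
for lam in eigvals:              # one iteration per eigenvalue: n steps
    grad = A @ x - b             # gradient of the quadratic at x
    x = x - grad / lam           # step size = reciprocal eigenvalue

# Finite termination: after n iterations, x agrees with the minimizer
print(np.allclose(x, x_star))
```

The finite-termination claim can be seen from the error recursion: each step multiplies the error x_k - x* by (I - A/lambda_k), and the product of these n factors annihilates every eigencomponent, since the factor for eigenvalue lambda_j zeroes the j-th one.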