
linesearch_hint

PURPOSE

Armijo line-search based on the line-search hint in the problem structure.

SYNOPSIS

function [stepsize, newx, newkey, lsstats] = linesearch_hint(problem, x, d, f0, df0, options, storedb, key)

DESCRIPTION

 Armijo line-search based on the line-search hint in the problem structure.

 function [stepsize, newx, newkey, lsstats] = 
            linesearch_hint(problem, x, d, f0, df0, options, storedb, key)

 Base line-search algorithm for descent methods, based on a simple
 backtracking method. The search direction d must be a descent direction,
 that is, the directional derivative df0 of f at x along d must be
 negative.

 The algorithm obtains an initial step size candidate from the problem
 structure, typically through the problem.linesearch function. If that
 step does not fulfill the Armijo sufficient decrease criterion, that step
 size is reduced geometrically until a satisfactory step size is obtained
 or until a failure criterion triggers.
 
 Below, the step is constructed as alpha*d, and the step size is the norm
 of that vector, thus: stepsize = alpha*norm_d. The step is executed by
 retracting the vector alpha*d from the current point x, giving newx.

 Inputs/Outputs: see help for linesearch

 See also: steepestdescent conjugategradients linesearch
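
 A minimal sketch of how a solver ends up calling this line-search: the
 problem structure supplies the hint through a problem.linesearch handle,
 and the solver is pointed at this routine through options.linesearch. The
 cost function and manifold below are illustrative assumptions, not part
 of this file.

```matlab
% Rayleigh-quotient minimization on the sphere (illustrative data).
problem.M = spherefactory(5);          % unit sphere in R^5
A = randn(5); A = (A + A')/2;          % a symmetric matrix
problem.cost  = @(x) x'*A*x;           % cost to minimize
problem.egrad = @(x) 2*A*x;            % Euclidean gradient
% The hint: given x and a direction d, return an initial step size guess.
problem.linesearch = @(x, d) 1.0;
% Tell the solver to use this file as its line-search routine.
options.linesearch = @linesearch_hint;
x = steepestdescent(problem, [], options);
```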

SOURCE CODE

function [stepsize, newx, newkey, lsstats] = ...
             linesearch_hint(problem, x, d, f0, df0, options, storedb, key)
% Armijo line-search based on the line-search hint in the problem structure.
%
% function [stepsize, newx, newkey, lsstats] =
%            linesearch_hint(problem, x, d, f0, df0, options, storedb, key)
%
% Base line-search algorithm for descent methods, based on a simple
% backtracking method. The search direction d must be a descent direction,
% that is, the directional derivative df0 of f at x along d must be
% negative.
%
% The algorithm obtains an initial step size candidate from the problem
% structure, typically through the problem.linesearch function. If that
% step does not fulfill the Armijo sufficient decrease criterion, that step
% size is reduced geometrically until a satisfactory step size is obtained
% or until a failure criterion triggers.
%
% Below, the step is constructed as alpha*d, and the step size is the norm
% of that vector, thus: stepsize = alpha*norm_d. The step is executed by
% retracting the vector alpha*d from the current point x, giving newx.
%
% Inputs/Outputs: see help for linesearch
%
% See also: steepestdescent conjugategradients linesearch

% This file is part of Manopt: www.manopt.org.
% Original author: Nicolas Boumal, July 17, 2014.
% Contributors:
% Change log:
%
%   April 3, 2015 (NB):
%       Works with the new StoreDB class system.
%
%   April 8, 2015 (NB):
%       Got rid of lsmem input/output.


    % Allow omission of the key, and even of storedb.
    if ~exist('key', 'var')
        if ~exist('storedb', 'var')
            storedb = StoreDB();
        end
        key = storedb.getNewKey();
    end

    % Backtracking default parameters. These can be overwritten in the
    % options structure which is passed to the solver.
    default_options.ls_contraction_factor = .5;
    default_options.ls_suff_decr = 1e-4;
    default_options.ls_max_steps = 25;
    
    if ~exist('options', 'var') || isempty(options)
        options = struct();
    end
    options = mergeOptions(default_options, options);
    
    contraction_factor = options.ls_contraction_factor;
    suff_decr = options.ls_suff_decr;
    max_ls_steps = options.ls_max_steps;
    
    % Obtain an initial guess at alpha from the problem structure. It is
    % assumed that the present line-search is only called when the problem
    % structure provides enough information for the call here to work.
    alpha = getLinesearch(problem, x, d, storedb, key);
    
    % Make the chosen step and compute the cost there.
    newx = problem.M.retr(x, d, alpha);
    newkey = storedb.getNewKey();
    newf = getCost(problem, newx, storedb, newkey);
    cost_evaluations = 1;
    
    % Backtrack while the Armijo criterion is not satisfied.
    while newf > f0 + suff_decr*alpha*df0
        
        % Reduce the step size,
        alpha = contraction_factor * alpha;
        
        % and look closer down the line.
        newx = problem.M.retr(x, d, alpha);
        newkey = storedb.getNewKey();
        newf = getCost(problem, newx, storedb, newkey);
        cost_evaluations = cost_evaluations + 1;
        
        % Make sure we don't run out of budget.
        if cost_evaluations >= max_ls_steps
            break;
        end
        
    end
    
    % If we got here without obtaining a decrease, we reject the step.
    if newf > f0
        alpha = 0;
        newx = x;
        newkey = key;
        newf = f0; %#ok<NASGU>
    end
    
    % As seen outside this function, stepsize is the size of the vector we
    % retract to make the step from x to newx. Since the step is alpha*d:
    norm_d = problem.M.norm(x, d);
    stepsize = alpha * norm_d;
    
    % Return some statistics also, for possible analysis.
    lsstats.costevals = cost_evaluations;
    lsstats.stepsize = stepsize;
    lsstats.alpha = alpha;
    
end
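
The backtracking parameters read near the top of the function can be tuned
through the options structure passed to the solver. A short sketch; the
values below are illustrative, not recommendations:

```matlab
% Override the default backtracking schedule of linesearch_hint.
options.linesearch = @linesearch_hint;
options.ls_contraction_factor = 0.1;  % shrink alpha by 10x per backtrack
options.ls_suff_decr = 1e-6;          % milder Armijo sufficient-decrease test
options.ls_max_steps = 10;            % at most 10 cost evaluations per search
```

Any field left unset keeps its default (contraction factor 0.5, sufficient
decrease 1e-4, 25 cost evaluations), as set by mergeOptions above.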

Generated on Sat 12-Nov-2016 14:11:22 by m2html © 2005