manopt/solvers/linesearch/linesearch_hint.m

linesearch_hint

PURPOSE

Armijo line-search based on the line-search hint in the problem structure.

SYNOPSIS

function [stepsize, newx, newkey, lsstats] = linesearch_hint(problem, x, d, f0, df0, options, storedb, key)

DESCRIPTION

 Armijo line-search based on the line-search hint in the problem structure.

 function [stepsize, newx, newkey, lsstats] = 
            linesearch_hint(problem, x, d, f0, df0, options, storedb, key)

 Base line-search algorithm for descent methods, based on a simple
 backtracking method. The search direction provided has to be a descent
 direction, as indicated by a negative df0 = directional derivative of f
 at x along d.

 The algorithm obtains an initial step size candidate from the problem
 structure, typically through the problem.linesearch function. If that
 step does not fulfill the Armijo sufficient decrease criterion, that step
 size is reduced geometrically until a satisfactory step size is obtained
 or until a failure criterion triggers. If the problem structure does not
 provide an initial alpha, then alpha = 1 is tried first.
 
 Below, the step is constructed as alpha*d, and the step size is the norm
 of that vector, thus: stepsize = alpha*norm_d. The step is executed by
 retracting the vector alpha*d from the current point x, giving newx.

 Inputs/Outputs: see help for linesearch

 See also: steepestdescent conjugategradients linesearch
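
 The backtracking logic described above can be sketched in a
 language-neutral form. The Python below is an illustrative
 Euclidean analogue, not part of Manopt: the step x + alpha*d stands
 in for the retraction problem.M.retr, plain keyword arguments stand
 in for the options structure, and StoreDB caching is omitted.

```python
import numpy as np

def linesearch_hint(f, x, d, f0, df0, alpha=1.0,
                    contraction_factor=0.5, suff_decr=1e-4,
                    max_ls_steps=25, backtrack=True, force_decrease=True):
    """Euclidean sketch of Manopt's hint-based Armijo backtracking.

    f: cost function; x: current point; d: descent direction;
    f0 = f(x); df0 = <grad f(x), d>, which must be negative;
    alpha: initial step-size hint (1.0 when the problem provides none).
    """
    # Make the chosen step and compute the cost there.
    newx = x + alpha * d
    newf = f(newx)
    cost_evaluations = 1

    # Backtrack while the Armijo sufficient-decrease criterion
    #   f(newx) <= f0 + suff_decr * alpha * df0
    # is not satisfied, shrinking alpha geometrically.
    while backtrack and newf > f0 + suff_decr * alpha * df0:
        alpha *= contraction_factor
        newx = x + alpha * d
        newf = f(newx)
        cost_evaluations += 1
        # Make sure we don't run out of budget.
        if cost_evaluations >= max_ls_steps:
            break

    # If no decrease was obtained, reject the step altogether.
    if force_decrease and newf > f0:
        alpha, newx = 0.0, x

    # The step is alpha*d, so its size is alpha * ||d||.
    stepsize = alpha * np.linalg.norm(d)
    return stepsize, newx, {'costevals': cost_evaluations,
                            'stepsize': stepsize, 'alpha': alpha}

# Minimize f(v) = ||v||^2 from x = [1, 0] along d = -grad f(x) = [-2, 0],
# so f0 = 1 and df0 = -4. The unit step overshoots; one halving lands
# exactly on the minimizer, so alpha = 0.5 after two cost evaluations.
stepsize, newx, stats = linesearch_hint(
    lambda v: float(v @ v), np.array([1.0, 0.0]),
    np.array([-2.0, 0.0]), f0=1.0, df0=-4.0)
```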


SOURCE CODE

function [stepsize, newx, newkey, lsstats] = ...
             linesearch_hint(problem, x, d, f0, df0, options, storedb, key)
% Armijo line-search based on the line-search hint in the problem structure.
%
% function [stepsize, newx, newkey, lsstats] =
%            linesearch_hint(problem, x, d, f0, df0, options, storedb, key)
%
% Base line-search algorithm for descent methods, based on a simple
% backtracking method. The search direction provided has to be a descent
% direction, as indicated by a negative df0 = directional derivative of f
% at x along d.
%
% The algorithm obtains an initial step size candidate from the problem
% structure, typically through the problem.linesearch function. If that
% step does not fulfill the Armijo sufficient decrease criterion, that step
% size is reduced geometrically until a satisfactory step size is obtained
% or until a failure criterion triggers. If the problem structure does not
% provide an initial alpha, then alpha = 1 is tried first.
%
% Below, the step is constructed as alpha*d, and the step size is the norm
% of that vector, thus: stepsize = alpha*norm_d. The step is executed by
% retracting the vector alpha*d from the current point x, giving newx.
%
% Inputs/Outputs: see help for linesearch
%
% See also: steepestdescent conjugategradients linesearch

% This file is part of Manopt: www.manopt.org.
% Original author: Nicolas Boumal, July 17, 2014.
% Contributors:
% Change log:
%
%   April 3, 2015 (NB):
%       Works with the new StoreDB class system.
%
%   April 8, 2015 (NB):
%       Got rid of lsmem input/output.
%
%   July 20, 2017 (NB):
%       Now using alpha = 1 by default.
%
%   Aug. 28, 2017 (NB):
%       Adding two options: ls_backtrack and ls_force_decrease, both true
%       by default. Setting them to false can disable parts of the line
%       search that, respectively, execute an Armijo backtracking and
%       reject a cost-increasing step.


    % Allow omission of the key, and even of storedb.
    if ~exist('key', 'var')
        if ~exist('storedb', 'var')
            storedb = StoreDB();
        end
        key = storedb.getNewKey();
    end

    % Backtracking default parameters. These can be overwritten in the
    % options structure which is passed to the solver.
    default_options.ls_contraction_factor = .5;
    default_options.ls_suff_decr = 1e-4;
    default_options.ls_max_steps = 25;
    default_options.ls_backtrack = true;
    default_options.ls_force_decrease = true;
    
    if ~exist('options', 'var') || isempty(options)
        options = struct();
    end
    options = mergeOptions(default_options, options);
    
    contraction_factor = options.ls_contraction_factor;
    suff_decr = options.ls_suff_decr;
    max_ls_steps = options.ls_max_steps;
    
    % Obtain an initial guess at alpha from the problem structure. It is
    % assumed that the present line-search is only called when the problem
    % structure provides enough information for the call here to work.
    if canGetLinesearch(problem)
        alpha = getLinesearch(problem, x, d, storedb, key);
    else
        alpha = 1;
    end
    
    % Make the chosen step and compute the cost there.
    newx = problem.M.retr(x, d, alpha);
    newkey = storedb.getNewKey();
    newf = getCost(problem, newx, storedb, newkey);
    cost_evaluations = 1;
    
    % Backtrack while the Armijo criterion is not satisfied
    while options.ls_backtrack && newf > f0 + suff_decr*alpha*df0
        
        % Reduce the step size,
        alpha = contraction_factor * alpha;
        
        % and look closer down the line
        newx = problem.M.retr(x, d, alpha);
        newkey = storedb.getNewKey();
        newf = getCost(problem, newx, storedb, newkey);
        cost_evaluations = cost_evaluations + 1;
        
        % Make sure we don't run out of budget
        if cost_evaluations >= max_ls_steps
            break;
        end
        
    end
    
    % If we got here without obtaining a decrease, we reject the step.
    if options.ls_force_decrease && newf > f0
        alpha = 0;
        newx = x;
        newkey = key;
        newf = f0; %#ok<NASGU>
    end
    
    % As seen outside this function, stepsize is the size of the vector we
    % retract to make the step from x to newx. Since the step is alpha*d:
    norm_d = problem.M.norm(x, d);
    stepsize = alpha * norm_d;
    
    % Return some statistics also, for possible analysis.
    lsstats.costevals = cost_evaluations;
    lsstats.stepsize = stepsize;
    lsstats.alpha = alpha;
    
end
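
To see why a good hint pays off, consider a cost that is exactly quadratic along the search line: a line-search hint can then return the exact minimizer of phi(alpha) = f(x + alpha*d), and the Armijo test accepts it after a single cost evaluation. The Euclidean sketch below is illustrative only (the cost, target, and variable names are assumptions, not part of Manopt).

```python
import numpy as np

# Hypothetical quadratic cost f(x) = ||x - target||^2 on R^3.
target = np.array([1.0, 2.0, 3.0])
f = lambda v: float(np.dot(v - target, v - target))

x = np.zeros(3)
f0 = f(x)                          # 14.0
grad = 2.0 * (x - target)
d = -grad                          # steepest-descent direction
df0 = float(np.dot(grad, d))       # negative, so d is a descent direction

# Along the line, phi(alpha) = f(x + alpha*d) is quadratic in alpha, so a
# hint can return its exact minimizer in closed form:
alpha = -df0 / (2.0 * float(np.dot(d, d)))

# The hinted step already satisfies the Armijo sufficient-decrease
# condition, so no backtracking is needed: one cost evaluation suffices.
newf = f(x + alpha * d)
assert newf <= f0 + 1e-4 * alpha * df0
```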

Generated on Fri 08-Sep-2017 12:43:19 by m2html © 2005