Backtracking line-search aiming merely for a decrease in cost value.

function [stepsize, newx, newkey, lsstats] = linesearch_decrease(problem, x, d, f0, df0, options, storedb, key)

Line-search algorithm based on a simple backtracking method. The search direction provided has to be a descent direction, but need not be a first-order descent direction; that is, this line-search can be used even if x is a critical point, as long as the cost function is strictly decreasing along the direction d.

The line-search merely guarantees a decrease in the cost (unless a stopping criterion triggers first, such as exceeding a maximal number of cost evaluations). This is typically useful to escape saddle points (critical points admitting descent directions at the second order). Escape directions can be computed using hessianextreme, for example.

The step is constructed as alpha*d, and the step size is the norm of that vector, thus: stepsize = alpha*norm_d. The step is executed by retracting the vector alpha*d from the current point x, giving newx. An initial stepsize of norm_d thus means the first candidate x is obtained by retracting d at x, as is.

Options:
- options.ls_max_steps (25): maximum number of cost evaluations.
- options.ls_initial_stepsize (norm_d): first stepsize trial.
- options.ls_contraction_factor (0.5): stepsize reduction factor per iteration.

Inputs/Outputs: see the help for linesearch. f0 is the cost at x; df0 is unused; options, storedb and key are optional. Thus, a simplified calling pattern is (with all outputs still available):

    linesearch_decrease(problem, x, d, f0)

See also: steepestdescent linesearch hessianextreme
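To make the backtracking logic concrete, here is a minimal, language-agnostic sketch in Python (not Manopt's API; the names `backtracking_decrease`, `cost` and `retract` are hypothetical, and the Euclidean retraction stands in for a manifold retraction):

```python
def backtracking_decrease(cost, retract, x, d, f0, norm_d,
                          contraction=0.5, initial_stepsize=None,
                          max_evals=25):
    """Shrink the step along d until the cost decreases, or the
    evaluation budget runs out. Equal cost is accepted in the end,
    so the search can still move away from a critical point."""
    if initial_stepsize is None:
        initial_stepsize = norm_d          # first trial: retract d as-is
    alpha = initial_stepsize / norm_d      # step multiplier on d
    evals = 0
    while True:
        newx = retract(x, d, alpha)        # candidate: move alpha*d from x
        newf = cost(newx)
        evals += 1
        if newf < f0 or evals >= max_evals:
            break
        alpha *= contraction               # reduce the step and retry
    if newf > f0:                          # strictly worse: reject the step
        return 0.0, x
    return alpha * norm_d, newx            # stepsize = alpha * norm(d)

# Example on the real line with f(x) = x^2, descending from x = 1:
step, newx = backtracking_decrease(
    cost=lambda x: x * x,
    retract=lambda x, d, a: x + a * d,     # Euclidean "retraction"
    x=1.0, d=-1.0, f0=1.0, norm_d=1.0)
# The very first trial, x - d = 0, already decreases the cost,
# so step == 1.0 and newx == 0.0.
```

Note the rejection rule: a step of equal cost is kept only after the budget is exhausted, mirroring the acceptance comment in the source below.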

This function calls:
- StoreDB
- getCost: computes the cost function at x.
- mergeOptions: merges two options structures, with one having precedence over the other.

This function is called by:
- low_rank_dist_completion: performs low-rank distance matrix completion with automatic rank detection.

function [stepsize, newx, newkey, lsstats] = ...
                  linesearch_decrease(problem, x, d, f0, ~, options, storedb, key)
% Backtracking line-search aiming merely for a decrease in cost value.
%
% function [stepsize, newx, newkey, lsstats] =
%      linesearch_decrease(problem, x, d, f0, df0, options, storedb, key)
%
% Line-search algorithm based on a simple backtracking method. The search
% direction provided has to be a descent direction, but need not be a
% first-order descent direction; that is, this line-search can be used
% even if x is a critical point, as long as the cost function is strictly
% decreasing along the direction d.
%
% The line-search merely guarantees a decrease in the cost (unless a
% stopping criterion triggers first, such as exceeding a maximal number of
% cost evaluations). This is typically useful to escape saddle points
% (critical points admitting descent directions at the second order).
% Escape directions can be computed using hessianextreme, for example.
%
% Below, the step is constructed as alpha*d, and the step size is the norm
% of that vector, thus: stepsize = alpha*norm_d. The step is executed by
% retracting the vector alpha*d from the current point x, giving newx.
% An initial stepsize of norm_d thus means the first candidate x is
% obtained by retracting d at x, as is.
%
% Options:
%   options.ls_max_steps (25): maximum number of cost evaluations.
%   options.ls_initial_stepsize (norm_d): first stepsize trial.
%   options.ls_contraction_factor (0.5): stepsize reduction per iteration.
%
% Inputs/Outputs: see the help for linesearch.
%   f0 is the cost at x.
%   df0 is unused.
%   options, storedb and key are optional.
%   Thus, a simplified calling pattern is (with all outputs still
%   available): linesearch_decrease(problem, x, d, f0)
%
% See also: steepestdescent linesearch hessianextreme

% This file is part of Manopt: www.manopt.org.
% Original author: Nicolas Boumal, April 8, 2015.
% Contributors:
% Change log:


    % Allow omission of the key, and even of storedb.
    if ~exist('key', 'var')
        if ~exist('storedb', 'var')
            storedb = StoreDB();
        end
        key = storedb.getNewKey();
    end

    norm_d = problem.M.norm(x, d);

    % Backtracking default parameters. These can be overwritten in the
    % options structure which is passed to the solver.
    default_options.ls_contraction_factor = .5;
    default_options.ls_initial_stepsize = norm_d;
    default_options.ls_max_steps = 25;

    if ~exist('options', 'var') || isempty(options)
        options = struct();
    end
    options = mergeOptions(default_options, options);

    contraction_factor = options.ls_contraction_factor;
    initial_stepsize = options.ls_initial_stepsize;
    max_ls_steps = options.ls_max_steps;

    % Initial step size as a multiplier of d.
    alpha = initial_stepsize / norm_d;

    % Make the chosen step and compute the cost there.
    newx = problem.M.retr(x, d, alpha);
    newkey = storedb.getNewKey();
    newf = getCost(problem, newx, storedb, newkey);
    cost_evaluations = 1;

    % Backtrack while no cost decrease is obtained.
    while newf >= f0

        % Reduce the step size,
        alpha = contraction_factor * alpha;

        % and look closer down the line.
        newx = problem.M.retr(x, d, alpha);
        newkey = storedb.getNewKey();
        newf = getCost(problem, newx, storedb, newkey);
        cost_evaluations = cost_evaluations + 1;

        % Make sure we don't run out of budget.
        if cost_evaluations >= max_ls_steps
            break;
        end

    end

    % If we got here without obtaining a decrease, we reject the step.
    % Equal cost is accepted, since if x is critical, it is important to
    % move away from x more than it is important to decrease the cost.
    if newf > f0
        alpha = 0;
        newx = x;
        newkey = key;
        newf = f0; %#ok<NASGU>
    end

    % As seen outside this function, stepsize is the size of the vector we
    % retract to make the step from x to newx. Since the step is alpha*d:
    stepsize = alpha * norm_d;

    % Return some statistics also, for possible analysis.
    lsstats.costevals = cost_evaluations;
    lsstats.stepsize = stepsize;
    lsstats.alpha = alpha;

end

Generated on Fri 08-Sep-2017 12:43:19