
getSubgradient

PURPOSE

Computes a subgradient of the cost function at x, up to a tolerance

SYNOPSIS

function subgrad = getSubgradient(problem, x, tol, storedb, key)

DESCRIPTION

 Computes a subgradient of the cost function at x, up to a tolerance

 function subgrad = getSubgradient(problem, x)
 function subgrad = getSubgradient(problem, x, tol)
 function subgrad = getSubgradient(problem, x, tol, storedb)
 function subgrad = getSubgradient(problem, x, tol, storedb, key)

 Returns a subgradient at x of the cost function described in the problem
 structure. A tolerance tol ( >= 0 ) can also be specified. By default,
 tol = 0.

 storedb is a StoreDB object; key is the StoreDB key corresponding to the
 point x.

 See also: getDirectionalDerivative canGetGradient
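
 For context, a minimal usage sketch (assuming Manopt is installed and on
 the MATLAB path; the cost and subgradient below are illustrative and not
 part of this file):

```matlab
% Nonsmooth cost f(x) = max_i x(i) on the unit sphere in R^n.
n = 5;
M = spherefactory(n);                 % manifold factory from Manopt
problem.M = M;
problem.cost = @(x) max(x);

% A Euclidean subgradient of f at x is the indicator vector of a maximal
% entry; projecting it to the tangent space at x gives a Riemannian
% subgradient. The tol argument is accepted but unused in this sketch.
euclsubgrad = @(x) double((1:n)' == find(x == max(x), 1));
problem.subgrad = @(x, tol) M.proj(x, euclsubgrad(x));

x = M.rand();                         % random point on the sphere
sg = getSubgradient(problem, x);      % tol defaults to 0
```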

CROSS-REFERENCE INFORMATION

This function calls:
This function is called by:

SOURCE CODE

function subgrad = getSubgradient(problem, x, tol, storedb, key)
% Computes a subgradient of the cost function at x, up to a tolerance
%
% function subgrad = getSubgradient(problem, x)
% function subgrad = getSubgradient(problem, x, tol)
% function subgrad = getSubgradient(problem, x, tol, storedb)
% function subgrad = getSubgradient(problem, x, tol, storedb, key)
%
% Returns a subgradient at x of the cost function described in the problem
% structure. A tolerance tol ( >= 0 ) can also be specified. By default,
% tol = 0.
%
% storedb is a StoreDB object; key is the StoreDB key corresponding to the
% point x.
%
% See also: getDirectionalDerivative canGetGradient

% This file is part of Manopt: www.manopt.org.
% Original author: Nicolas Boumal, July 20, 2017.
% Contributors:
% Change log:

    % Allow omission of the key, and even of storedb.
    if ~exist('key', 'var')
        if ~exist('storedb', 'var')
            storedb = StoreDB();
        end
        key = storedb.getNewKey();
    end
    
    % Default tolerance is 0.
    if ~exist('tol', 'var') || isempty(tol)
        tol = 0;
    end

    if isfield(problem, 'subgrad')
    %% Compute a subgradient using subgrad.
    
        % Check whether this function wants to deal with storedb or not.
        switch nargin(problem.subgrad)
            case 1
                warning('manopt:subgradient', ...
                       ['problem.subgrad normally admits a second\n' ...
                        'parameter, tol >= 0, as a tolerance.\n']);
                subgrad = problem.subgrad(x); % tol is not passed here
            case 2
                subgrad = problem.subgrad(x, tol);
            case 3
                % Obtain, pass along, and save the store for x.
                store = storedb.getWithShared(key);
                [subgrad, store] = problem.subgrad(x, tol, store);
                storedb.setWithShared(store, key);
            case 4
                % Pass along the whole storedb (by reference), with key.
                subgrad = problem.subgrad(x, tol, storedb, key);
            otherwise
                up = MException('manopt:getSubgradient:badsubgrad', ...
                    'subgrad should accept 1, 2, 3 or 4 inputs.');
                throw(up);
        end
    
    elseif canGetGradient(problem)
    %% The gradient is a subgradient.
        
        subgrad = getGradient(problem, x, storedb, key);
    
    else
    %% Abandon
        
        up = MException('manopt:getSubgradient:fail', ...
            ['The problem description is not explicit enough to ' ...
             'compute a subgradient.']);
        throw(up);
        
    end
    
end

Generated on Fri 08-Sep-2017 12:43:19 by m2html © 2005