getPartialGradient

PURPOSE

Computes the gradient of a subset of terms in the cost function at x.

SYNOPSIS

function grad = getPartialGradient(problem, x, I, storedb, key)

DESCRIPTION

 Computes the gradient of a subset of terms in the cost function at x.

 function grad = getPartialGradient(problem, x, I)
 function grad = getPartialGradient(problem, x, I, storedb)
 function grad = getPartialGradient(problem, x, I, storedb, key)

 Assume the cost function described in the problem structure is a sum of
 many terms, as

    f(x) = sum_i f_i(x) for i = 1:d,

 where d is specified as d = problem.ncostterms.

 For a subset I of 1:d, getPartialGradient obtains the gradient of the
 partial cost function

    f_I(x) = sum_i f_i(x) for i = I.

 storedb is a StoreDB object; key is the StoreDB key associated with x.

 See also: getGradient canGetPartialGradient getPartialEuclideanGradient

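As an illustration, here is a minimal sketch of such a problem (not taken from this page; the least-squares data A, b and the index subset are hypothetical). It supplies the Euclidean gradients of the terms through problem.partialegrad, which the fallback branch of the source below converts to a Riemannian gradient with problem.M.egrad2rgrad:

    % Sketch: f(x) = sum_i (A(i,:)*x - b(i))^2 over the unit sphere.
    n = 5; m = 100;
    A = randn(m, n);
    b = randn(m, 1);

    problem.M = spherefactory(n);
    problem.ncostterms = m;                 % d in the notation above
    problem.cost = @(x) sum((A*x - b).^2);
    % Euclidean gradient of the terms indexed by the subset I of 1:m.
    problem.partialegrad = @(x, I) 2*A(I, :).'*(A(I, :)*x - b(I));

    x = problem.M.rand();
    grad = getPartialGradient(problem, x, [1 3 7]);
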
CROSS-REFERENCE INFORMATION

This function calls: StoreDB, canGetPartialEuclideanGradient, getPartialEuclideanGradient
This function is called by:

SOURCE CODE

function grad = getPartialGradient(problem, x, I, storedb, key)
% Computes the gradient of a subset of terms in the cost function at x.
%
% function grad = getPartialGradient(problem, x, I)
% function grad = getPartialGradient(problem, x, I, storedb)
% function grad = getPartialGradient(problem, x, I, storedb, key)
%
% Assume the cost function described in the problem structure is a sum of
% many terms, as
%
%    f(x) = sum_i f_i(x) for i = 1:d,
%
% where d is specified as d = problem.ncostterms.
%
% For a subset I of 1:d, getPartialGradient obtains the gradient of the
% partial cost function
%
%    f_I(x) = sum_i f_i(x) for i = I.
%
% storedb is a StoreDB object; key is the StoreDB key associated with x.
%
% See also: getGradient canGetPartialGradient getPartialEuclideanGradient

% This file is part of Manopt: www.manopt.org.
% Original author: Nicolas Boumal, June 28, 2016
% Contributors:
% Change log:


    % Allow omission of the key, and even of storedb.
    if ~exist('key', 'var')
        if ~exist('storedb', 'var')
            storedb = StoreDB();
        end
        key = storedb.getNewKey();
    end

    % Make sure I is a row vector, so that it is natural to loop over it
    % with " for i = I ".
    I = (I(:)).';

    if isfield(problem, 'partialgrad')
    %% Compute the partial gradient using partialgrad.

        % Check whether this function wants to deal with storedb or not.
        switch nargin(problem.partialgrad)
            case 2
                grad = problem.partialgrad(x, I);
            case 3
                % Obtain, pass along, and save the store for x.
                store = storedb.getWithShared(key);
                [grad, store] = problem.partialgrad(x, I, store);
                storedb.setWithShared(store, key);
            case 4
                % Pass along the whole storedb (by reference), with key.
                grad = problem.partialgrad(x, I, storedb, key);
            otherwise
                up = MException('manopt:getPartialGradient:badpartialgrad', ...
                    'partialgrad should accept 2, 3 or 4 inputs.');
                throw(up);
        end

    elseif canGetPartialEuclideanGradient(problem)
    %% Compute the partial gradient using the Euclidean partial gradient.

        egrad = getPartialEuclideanGradient(problem, x, I, storedb, key);
        grad = problem.M.egrad2rgrad(x, egrad);

    else
    %% Abandon computing the partial gradient.

        up = MException('manopt:getPartialGradient:fail', ...
            ['The problem description is not explicit enough to ' ...
             'compute the partial gradient of the cost.']);
        throw(up);

    end

end
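
The case 3 branch above hands partialgrad a store structure fetched from storedb for the point x, so intermediate quantities can be cached and reused across calls at the same point; the case 4 branch instead receives the whole storedb together with the key. A hedged sketch of the 3-input convention, reusing the least-squares setup from the earlier example (the demo wrapper and the residual caching are illustrative, not Manopt API):

    function demo_partialgrad()
    % Sketch of the 3-input form: cache the residual A*x - b in the store
    % so repeated partial-gradient calls at the same point x reuse it.
        n = 5; m = 100;
        A = randn(m, n);
        b = randn(m, 1);
        M = spherefactory(n);

        problem.M = M;
        problem.ncostterms = m;
        problem.cost = @(x) sum((A*x - b).^2);
        problem.partialgrad = @partialgrad;   % nargin is 3: case 3 above

        storedb = StoreDB();
        x = M.rand();
        key = storedb.getNewKey();
        g1 = getPartialGradient(problem, x, 1:10,  storedb, key);
        g2 = getPartialGradient(problem, x, 11:20, storedb, key); % reuses cache

        function [g, store] = partialgrad(x, I, store)
            if ~isfield(store, 'residual')
                store.residual = A*x - b;        % computed once per point
            end
            eg = 2*A(I, :).'*store.residual(I);  % Euclidean partial gradient
            g = M.egrad2rgrad(x, eg);            % convert to a tangent vector
        end
    end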
