checkdiff

PURPOSE

Checks the consistency of the cost function and directional derivatives.

SYNOPSIS

function checkdiff(problem, x, d, force_gradient)

DESCRIPTION

 Checks the consistency of the cost function and directional derivatives.

 function checkdiff(problem)
 function checkdiff(problem, x)
 function checkdiff(problem, x, d)

 checkdiff performs a numerical test to check that the directional
 derivatives defined in the problem structure agree up to first order with
 the cost function at some point x, along some direction d. The test is
 based on a truncated Taylor series (see online Manopt documentation).
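
 Concretely, writing Exp_x for the exponential map of the manifold (or the
 retraction used to approximate it) and Df(x)[d] for the directional
 derivative, the relation being tested is (in LaTeX notation):

     f(\mathrm{Exp}_x(h\,d)) = f(x) + h\,\mathrm{D}f(x)[d] + O(h^2)

 so the gap between the cost and its first-order model should shrink like
 h^2, that is, with a slope of 2 on a log-log plot of the error against the
 stepsize h.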

 Both x and d are optional and will be sampled at random if omitted.

 See also: checkgradient checkhessian
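
 For instance, a minimal usage sketch (the sphere manifold and the quadratic
 cost below are illustrative choices, not part of checkdiff itself):

     problem.M = spherefactory(5);        % unit sphere in R^5
     A = randn(5); A = .5*(A + A');       % random symmetric matrix
     problem.cost  = @(x) x'*A*x;         % smooth cost on the sphere
     problem.egrad = @(x) 2*A*x;          % Euclidean gradient (converted by Manopt)
     checkdiff(problem);                  % x and d are sampled at random

 If the cost and the directional derivatives are consistent, the reported
 slope should be close to 2 over at least a few orders of magnitude of h.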

SOURCE CODE

function checkdiff(problem, x, d, force_gradient)
% Checks the consistency of the cost function and directional derivatives.
%
% function checkdiff(problem)
% function checkdiff(problem, x)
% function checkdiff(problem, x, d)
%
% checkdiff performs a numerical test to check that the directional
% derivatives defined in the problem structure agree up to first order with
% the cost function at some point x, along some direction d. The test is
% based on a truncated Taylor series (see online Manopt documentation).
%
% Both x and d are optional and will be sampled at random if omitted.
%
% See also: checkgradient checkhessian

% If force_gradient = true (hidden parameter), then the function will call
% getGradient and infer the directional derivative, rather than call
% getDirectionalDerivative directly. This is used by checkgradient.

% This file is part of Manopt: www.manopt.org.
% Original author: Nicolas Boumal, Dec. 30, 2012.
% Contributors:
% Change log:
%
%   April 3, 2015 (NB):
%       Works with the new StoreDB class system.

    if ~exist('force_gradient', 'var')
        force_gradient = false;
    end

    % Verify that the problem description is sufficient.
    if ~canGetCost(problem)
        error('It seems no cost was provided.');
    end
    if ~force_gradient && ~canGetDirectionalDerivative(problem)
        error('It seems no directional derivatives were provided.');
    end
    if force_gradient && ~canGetGradient(problem)
        % Would normally issue a warning, but this function should only be
        % called with force_gradient on by checkgradient, which will
        % already have issued a warning.
    end

    x_isprovided = exist('x', 'var') && ~isempty(x);
    d_isprovided = exist('d', 'var') && ~isempty(d);

    if ~x_isprovided && d_isprovided
        error('If d is provided, x must be too, since d is tangent at x.');
    end

    % If x and / or d are not specified, pick them at random.
    if ~x_isprovided
        x = problem.M.rand();
    end
    if ~d_isprovided
        d = problem.M.randvec(x);
    end

    % Compute the value f0 of f at x and the directional derivative of f
    % at x along d.
    storedb = StoreDB();
    xkey = storedb.getNewKey();
    f0 = getCost(problem, x, storedb, xkey);

    if ~force_gradient
        df0 = getDirectionalDerivative(problem, x, d, storedb, xkey);
    else
        grad = getGradient(problem, x, storedb, xkey);
        df0 = problem.M.inner(x, grad, d);
    end

    % Compute the value of f at points on the geodesic (or approximation
    % of it) originating from x, along direction d, for stepsizes in a
    % large range given by h.
    h = logspace(-8, 0, 51);
    value = zeros(size(h));
    for i = 1 : length(h)
        y = problem.M.exp(x, d, h(i));
        ykey = storedb.getNewKey();
        value(i) = getCost(problem, y, storedb, ykey);
    end

    % Compute the linear approximation of the cost function using f0 and
    % df0 at the same points.
    model = polyval([df0 f0], h);

    % Compute the approximation error.
    err = abs(model - value);

    % And plot it.
    loglog(h, err);
    title(sprintf(['Directional derivative check.\nThe slope of the '...
                   'continuous line should match that of the dashed\n'...
                   '(reference) line over at least a few orders of '...
                   'magnitude for h.']));
    xlabel('h');
    ylabel('Approximation error');

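    % Dashed reference line with slope 2 in log-log scale, for visual comparison.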
    line('xdata', [1e-8 1e0], 'ydata', [1e-8 1e8], ...
         'color', 'k', 'LineStyle', '--', ...
         'YLimInclude', 'off', 'XLimInclude', 'off');


    % In a numerically reasonable neighborhood, the error should decrease
    % as the square of the stepsize, i.e., in loglog scale, the error
    % should have a slope of 2.
    window_len = 10;
    [range, poly] = identify_linear_piece(log10(h), log10(err), window_len);
    hold all;
    loglog(h(range), 10.^polyval(poly, log10(h(range))), 'LineWidth', 3);
    hold off;

    fprintf('The slope should be 2. It appears to be: %g.\n', poly(1));
    fprintf(['If it is far from 2, then directional derivatives ' ...
             'might be erroneous.\n']);

end
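
As noted in the header comments, force_gradient is a hidden fourth argument
meant for use by checkgradient: when it is true, checkdiff calls getGradient
and infers the directional derivative as problem.M.inner(x, grad, d) instead
of calling getDirectionalDerivative. A direct call of that form (illustrative;
normally not needed by users) looks like:

    checkdiff(problem, x, d, true);   % x a point on problem.M, d tangent at x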
