Simple attempt at computing n well-distributed points on a sphere in R^d. This is an example of how Manopt can approximate the gradient and even the Hessian of a cost function by finite differences, even if only the cost function is specified, without its derivatives. This functionality is provided only as an aid for prototyping and should not be used to compare algorithms in terms of computation time or accuracy, because the underlying gradient approximation scheme is slow. See also the derivative-free solvers for an alternative: pso and neldermead.
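To make the cost of the finite-difference approach concrete, here is a rough sketch of the underlying idea, written against Manopt's public tools (tangentorthobasis, and the manifold struct's retr, zerovec and lincomb fields). This is NOT Manopt's actual implementation, and the step size h is an arbitrary choice for illustration: each gradient estimate needs one cost call at the current point plus one per tangent basis vector, i.e., problem.M.dim()+1 calls in total.

    % Sketch only: forward finite-difference estimate of the Riemannian
    % gradient of problem.cost at the point X (not Manopt's actual code).
    f0 = problem.cost(X);                     % 1 cost call
    B  = tangentorthobasis(problem.M, X);     % orthonormal tangent basis:
                                              % problem.M.dim() vectors
    g  = problem.M.zerovec(X);
    h  = 1e-7;                                % step size (arbitrary here)
    for i = 1 : numel(B)
        Xi = problem.M.retr(X, B{i}, h);      % move slightly along B{i}
        df = (problem.cost(Xi) - f0) / h;     % directional derivative
        g  = problem.M.lincomb(X, 1, g, df, B{i});
    end
    % g now approximates the Riemannian gradient of the cost at X.

This is why the scheme is slow: the number of cost calls per iteration grows with the manifold dimension, here d*n minus the n norm constraints.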

- obliquefactory Returns a manifold struct to optimize over matrices with unit-norm columns.
- rlbfgs Riemannian limited-memory BFGS solver for smooth objective functions.
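The identity used in the code's comments, namely that each pairwise potential equals .5/(1 - x_i'*x_j), follows from expanding the squared distance between unit-norm columns:

\[
\|x_i - x_j\|^2 = \|x_i\|^2 - 2\, x_i^\top x_j + \|x_j\|^2 = 2 - 2\, x_i^\top x_j,
\]

so that

\[
\frac{1}{\|x_i - x_j\|^2} = \frac{1}{2\,(1 - x_i^\top x_j)} = \frac{0.5}{1 - x_i^\top x_j}.
\]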

function X = thomson_problem(n, d)
% Simple attempt at computing n well-distributed points on a sphere in R^d.
%
% This is an example of how Manopt can approximate the gradient and even
% the Hessian of a cost function by finite differences, even if only
% the cost function is specified, without its derivatives.
%
% This functionality is provided only as an aid for prototyping, and should
% not be used to compare algorithms in terms of computation time or
% accuracy, because the underlying gradient approximation scheme is slow.
%
% See also the derivative-free solvers for an alternative:
% pso and neldermead.

% This file is part of Manopt: www.manopt.org.
% Original author: Nicolas Boumal, Nov. 1, 2016.
% Contributors:
% Change log:

    if ~exist('n', 'var') || isempty(n)
        n = 50;
    end
    if ~exist('d', 'var') || isempty(d)
        d = 3;
    end

    % Define the Thomson problem with 1/r^2 potential. That is: find n
    % points x_i on a sphere in R^d such that the sum over all pairs (i, j)
    % of the potentials 1/||x_i - x_j||^2 is minimized. Since the points
    % are on a sphere, each potential is .5/(1 - x_i'*x_j).
    problem.M = obliquefactory(d, n);
    problem.cost = @(X) sum(sum(triu(1./(1 - X'*X), 1))) / n^2;

    % Attempt to minimize the cost. Since the gradient is not provided,
    % Manopt approximates it with finite differences. This is /slow/, since
    % each gradient approximation requires problem.M.dim()+1 calls to the
    % cost function, on top of generating an orthonormal basis of the
    % tangent space at each iterate.
    %
    % Note that it is difficult to reach high-accuracy critical points with
    % an approximate gradient, hence it may be necessary to set a less
    % ambitious value for the gradient norm tolerance.
    opts.tolgradnorm = 1e-4;

    % Pick a solver. Both work fairly well on this problem.
    % X = conjugategradient(problem, [], opts);
    X = rlbfgs(problem, [], opts);

    % Plot the points on a translucent sphere.
    if nargout == 0 && d == 3
        [x, y, z] = sphere(50);
        surf(x, y, z, 'FaceAlpha', .5);
        hold all;
        plot3(X(1, :), X(2, :), X(3, :), '.', 'MarkerSize', 20);
        axis equal;
        box off;
        axis off;
    end

    % For much better performance, after an early prototyping phase, the
    % gradient of the cost function should be specified, typically in
    % problem.grad or problem.egrad. See the online documentation at
    %
    %     http://www.manopt.org
    %
    % for more information.

end
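As the closing comment suggests, performance improves greatly once the gradient is supplied. A hedged sketch of what that might look like for this cost (this helper is not part of the original file): differentiating sum over pairs i < j of 1/(1 - x_i'*x_j) / n^2 with respect to x_k gives sum over j ~= k of x_j / (1 - x_k'*x_j)^2 / n^2, which in matrix form is X*W/n^2 with W the matrix of squared inverse potentials and a zeroed diagonal.

    % Hypothetical addition, to be placed before the call to the solver:
    % the Euclidean gradient of the cost, which Manopt converts to the
    % Riemannian gradient automatically when given as problem.egrad.
    problem.egrad = @(X) thomson_egrad(X, n);

    function G = thomson_egrad(X, n)
        W = 1 ./ (1 - X'*X).^2;   % squared inverse potentials; Inf on diag
        W(1:n+1:end) = 0;         % discard self-interaction terms
        G = X * W / n^2;          % col k: sum_j x_j/(1 - x_k'*x_j)^2 / n^2
    end

With an exact gradient in place, a tighter opts.tolgradnorm becomes realistic, and each iteration costs a single gradient evaluation instead of problem.M.dim()+1 cost calls.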

Generated on Mon 10-Sep-2018 11:48:06 by