
nonlinear_eigenspace

PURPOSE

Example of nonlinear eigenvalue problem: total energy minimization.

SYNOPSIS

function Xsol = nonlinear_eigenspace(L, k, alpha)

DESCRIPTION

 Example of nonlinear eigenvalue problem: total energy minimization.

 function Xsol = nonlinear_eigenspace(L, k, alpha)

 L is a discrete Laplacian operator,
 alpha is a given constant, and
 k is the dimension of the sought eigenspace (the one associated with the smallest eigenvalues).

 This example demonstrates how to use the Grassmann geometry factory 
 to solve the nonlinear eigenvalue problem as the optimization problem:

 minimize 0.5*trace(X'*L*X) + (alpha/4)*(rho(X)'*(L\rho(X)))
 over X such that X'*X = Identity,

 where L is of size n-by-n,
 X is an n-by-k matrix, and
 rho(X) is the diagonal part of X*X'.
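
 Note that rho(X) need not be formed via the n-by-n product X*X': its i-th
 entry is the squared norm of the i-th row of X, which is how the source
 below computes it:

 rhoX = sum(X.^2, 2);   % equals diag(X*X') without forming X*X'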

 This example is motivated by the paper
 "A Riemannian Newton Algorithm for Nonlinear Eigenvalue Problems",
 Zhi Zhao, Zheng-Jian Bai, and Xiao-Qing Jin,
 SIAM Journal on Matrix Analysis and Applications, 36(2), 752-774, 2015.
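
 For instance, assuming Manopt is installed and on the MATLAB path, the
 function can be called as follows. This is a usage sketch: the Laplacian,
 k and alpha below are illustrative choices, not values prescribed by the
 paper.

 n = 500;
 L = gallery('tridiag', n, -1, 2, -1);    % 1-D discrete Laplacian
 k = 5;                                   % dimension of the sought eigenspace
 alpha = 1;                               % weight of the nonlinear term
 Xsol = nonlinear_eigenspace(L, k, alpha);
 % Xsol is an n-by-k matrix with orthonormal columns, representing a point
 % on the Grassmann manifold that approximately minimizes the total energy.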

CROSS-REFERENCE INFORMATION

This function calls: grassmannfactory, manoptsolve (from the Manopt toolbox; see the source below)
This function is called by: (none listed)

SUBFUNCTIONS

cost, egrad, ehess (nested functions defined in the source below)

SOURCE CODE

function Xsol = nonlinear_eigenspace(L, k, alpha)
% Example of nonlinear eigenvalue problem: total energy minimization.
%
% function Xsol = nonlinear_eigenspace(L, k, alpha)
%
% L is a discrete Laplacian operator,
% alpha is a given constant, and
% k is the dimension of the sought eigenspace (the one associated with the
% smallest eigenvalues).
%
% This example demonstrates how to use the Grassmann geometry factory
% to solve the nonlinear eigenvalue problem as the optimization problem:
%
% minimize 0.5*trace(X'*L*X) + (alpha/4)*(rho(X)'*(L\rho(X)))
% over X such that X'*X = Identity,
%
% where L is of size n-by-n,
% X is an n-by-k matrix, and
% rho(X) is the diagonal part of X*X'.
%
% This example is motivated by the paper
% "A Riemannian Newton Algorithm for Nonlinear Eigenvalue Problems",
% Zhi Zhao, Zheng-Jian Bai, and Xiao-Qing Jin,
% SIAM Journal on Matrix Analysis and Applications, 36(2), 752-774, 2015.
%


% This file is part of Manopt and is copyrighted. See the license file.
%
% Main author: Bamdev Mishra, June 19, 2015.
% Contributors:
%
% Change log:


    % If no inputs are provided, generate a discrete Laplacian operator.
    % This is for illustration purposes only.
    % The default example corresponds to Case (c) of Example 6.2 of the
    % above referenced paper.

    if ~exist('L', 'var') || isempty(L)
        n = 100;
        L = gallery('tridiag', n, -1, 2, -1);
    end

    n = size(L, 1);
    assert(size(L, 2) == n, 'L must be square.');

    if ~exist('k', 'var') || isempty(k) || k > n
        k = 10;
    end

    if ~exist('alpha', 'var') || isempty(alpha)
        alpha = 1;
    end


    % Grassmann manifold description
    Gr = grassmannfactory(n, k);
    problem.M = Gr;

    % Cost function evaluation
    problem.cost = @cost;
    function val = cost(X)
        rhoX = sum(X.^2, 2); % diag(X*X');
        val = 0.5*trace(X'*(L*X)) + (alpha/4)*(rhoX'*(L\rhoX));
    end

    % Euclidean gradient evaluation
    % Note: Manopt automatically converts it to the Riemannian counterpart.
    problem.egrad = @egrad;
    function g = egrad(X)
        rhoX = sum(X.^2, 2); % diag(X*X');
        g = L*X + alpha*diag(L\rhoX)*X;
    end

    % Euclidean Hessian evaluation
    % Note: Manopt automatically converts it to the Riemannian counterpart.
    problem.ehess = @ehess;
    function h = ehess(X, U)
        rhoX = sum(X.^2, 2); % diag(X*X');
        rhoXdot = 2*sum(X.*U, 2);
        h = L*U + alpha*diag(L\rhoXdot)*X + alpha*diag(L\rhoX)*U;
    end


    % Check whether gradient and Hessian computations are correct.
    % checkgradient(problem);
    % pause;
    % checkhessian(problem);
    % pause;


    % Initialization as suggested in the above referenced paper.
    X = randn(n, k);
    [U, S, V] = svd(X, 0); %#ok<ASGLU>
    X = U*V';
    [U0, S0] = eigs(L + alpha*diag(L\(sum(X.^2, 2))), k, 'sm'); %#ok<ASGLU>
    X0 = U0;

    % Call manoptsolve to automatically call an appropriate solver.
    % Note: it picks the trust-regions solver since all the required
    % ingredients, namely gradient and Hessian information, are available.
    Xsol = manoptsolve(problem, X0);

end
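
A quick way to sanity-check a run (a sketch, not part of the original file; the
tolerance 1e-10 is an arbitrary choice):

    Xsol = nonlinear_eigenspace();                      % default Laplacian, k = 10, alpha = 1
    k = size(Xsol, 2);
    assert(norm(Xsol'*Xsol - eye(k), 'fro') < 1e-10);   % orthonormal columns: X'*X = I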
