Newton algorithm with backtracking is failing
47 views (last 30 days)
Konstantinos
on 4 Dec 2024 at 10:03
Answered: Konstantinos
on 6 Dec 2024 at 14:12
Hello everyone,
I am implementing the Newton method with backtracking line search, but I am encountering a problem with the dimensions of a matrix after a few iterations, and I can’t figure out why.
The algorithm I am asked to implement is the damped Newton method: at each iterate compute the Newton step dx = -H\grad, the Newton decrement lambda^2 = grad' * (H \ grad), stop when lambda^2 / 2 <= epsilon, and otherwise take a step chosen by backtracking line search with parameters alpha and beta.
The function I am working with is
f(x) = c'*x - sum(log(b - A*x)),
where the log is applied elementwise to the residual b - A*x, which must stay positive.
Below is the code I am using:
n = 2; m = 20; % Dimensions
c = randn(n, 1); % Random vector c
A = randn(m, n); % Random matrix A
b = abs(randn(m, 1))+1; % Ensure b > 0
% Define the function
f = @(x) (c' * x - sum(log(b - A * x)));
alpha = 0.3;
beta = 0.7;
iter_bt = 0;
%Newton method
x_init = zeros(n,1);% Create a n by 1 dimensional vector
x_n = x_init;
thresh = 10^(-6);
f_gradient = @(x) c'- (A' * (1 ./ (b - A * x)));
F_newton = f(x_n); % Save function values for plotting
X_newton = x_n; % Save solutions for plotting
solved_newton = false;
while ~solved_newton
    vec = 1 ./ ((b - A * x_n).^2);
    diagonal = diag(vec);
    f_hessian = @(x) A' * diagonal * A;
    % Compute gradient and Hessian
    grad = f_gradient(x_n);
    hess = f_hessian(x_n);
    % Compute Newton direction
    dxn_t = -hess \ grad;
    % Compute Newton decrement
    lambda_square = grad' * (hess \ grad); % Cancel the - sign in dxn_t
    % Check termination condition
    if (lambda_square / 2) <= thresh
        solved_newton = true;
        continue;
    end
    % Backtracking line search
    tau = backtracking_line_search(f, grad, x_n, dxn_t, alpha, beta, A, b);
    % Update iterate
    x_n = x_n + tau * dxn_t;
end
fprintf('Newton method converged to solution.\n');
function [tau] = backtracking_line_search(f, grad_f, x, direction, alpha, beta, A, b)
    % Backtracking line search
    % Inputs:
    %   f: function handle for objective function
    %   grad_f: gradient of f at current x
    %   x: current point
    %   direction: search direction (e.g., -grad for gradient descent)
    %   alpha: condition parameter (e.g., 0.3)
    %   beta: condition parameter (e.g., 0.7)
    %   A, b: constraints for feasibility check (b - A*x > 0)
    % Output:
    %   tau: step size satisfying the conditions
    %
    % Note:
    %   This implementation of the backtracking line search differs
    %   from the original due to the necessity of the feasibility check.
    %   Additionally, the order of the algorithm is slightly altered,
    %   though this should not present an issue.
    tau = 1; % Initialize step size
    while true
        candidate_x = x + tau * direction; % Candidate point
        if all(b - A * candidate_x > 0) % Feasibility check
            if f(candidate_x) <= f(x) + alpha * tau * grad_f' * direction
                break; % Feasible and satisfies all the conditions
            end
        end
        tau = beta * tau; % Reduce step size
    end
end
After a few iterations of the while loop, MATLAB throws a dimension-mismatch error during the matrix operations. The issue seems to arise when constructing the Hessian with f_hessian; I suspect something is wrong with how diag(vec) or A' * diag(vec) * A is computed.
Any insights, suggestions, or debugging tips would be greatly appreciated! Thank you in advance for your help.
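While debugging, I also wondered whether MATLAB's implicit expansion could be involved somewhere. As a minimal sketch (toy vectors, unrelated to my actual data), subtracting a column vector from a row vector silently produces a matrix instead of an error:

```matlab
% Implicit expansion: a 1-by-n row minus an n-by-1 column gives an
% n-by-n matrix rather than a size error, so a quantity that should
% be an n-by-1 vector can silently become a matrix.
r = [1 2];        % 1-by-2 row vector
v = [10; 20];     % 2-by-1 column vector
g = r - v;        % implicit expansion: g is 2-by-2
disp(size(g))     % prints: 2 2
```

If a "vector" like the gradient is built this way, every later operation that uses it (backslash, updates of x_n) inherits the wrong shape.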
2 Comments
Torsten
on 4 Dec 2024 at 10:50
You forgot to define f (see above).
Please test your code for completeness before posting.
Accepted Answer
Torsten
on 4 Dec 2024 at 13:37
Edited: Torsten
on 4 Dec 2024 at 19:24
rng('default')
n = 2; m = 20;
c = sym('c',[n 1]);
A = sym('A',[m n]);
b = sym('b',[m 1]);
x = sym('x',[n 1]);
f = c.'*x - sum(log(b-A*x));
grad = gradient(f,x)
hess = hessian(f,x);
f = matlabFunction(f,"Vars",{x,A,b,c});
f_gradient = matlabFunction(grad,"Vars",{x,A,b,c});
f_hessian = matlabFunction(hess,"Vars",{x,A,b,c});
A = randn(m, n); % Random matrix A
b = abs(randn(m, 1))+1; % Ensure b > 0
c = randn(n, 1); % Random vector c
alpha = 0.3;
beta = 0.7;
iter_bt = 0;
%Newton method
x_init = zeros(n,1); % Create a n by 1 dimensional vector
x_n = x_init
thresh = 10^(-6);
F_newton = f(x_n,A,b,c) % Save function values for plotting
X_newton = x_n; % Save solutions for plotting
solved_newton = false;
while ~solved_newton
    % Compute gradient and Hessian
    grad = f_gradient(x_n,A,b,c);
    hess = f_hessian(x_n,A,b,c);
    % Compute Newton direction
    dxn_t = -hess \ grad;
    % Compute Newton decrement
    lambda_square = -grad' * dxn_t; % Cancel the - sign in dxn_t
    % Check termination condition
    if (lambda_square / 2) <= thresh
        solved_newton = true;
        continue;
    end
    % Backtracking line search
    tau = backtracking_line_search(f, grad, x_n, dxn_t, alpha, beta, A, b, c);
    % Update iterate
    x_n = x_n + tau * dxn_t
    f(x_n,A,b,c)
end
fprintf('Newton method converged to solution.\n');
function [tau] = backtracking_line_search(f, grad_f, x, direction, alpha, beta, A, b, c)
    % Backtracking line search
    % Inputs:
    %   f: function handle for objective function
    %   grad_f: gradient of f at current x
    %   x: current point
    %   direction: search direction (e.g., -grad for gradient descent)
    %   alpha: condition parameter (e.g., 0.3)
    %   beta: condition parameter (e.g., 0.7)
    %   A, b: constraints for feasibility check (b - A*x > 0)
    % Output:
    %   tau: step size satisfying the conditions
    %
    % Note:
    %   This implementation of the backtracking line search differs
    %   from the original due to the necessity of the feasibility check.
    %   Additionally, the order of the algorithm is slightly altered,
    %   though this should not present an issue.
    tau = 1; % Initialize step size
    while true
        candidate_x = x + tau * direction; % Candidate point
        if all(b - A * candidate_x > 0) % Feasibility check
            if f(candidate_x,A,b,c) <= f(x,A,b,c) + alpha * tau * grad_f' * direction
                break; % Feasible and satisfies all the conditions
            end
        end
        tau = beta * tau; % Reduce step size
    end
end
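To gain confidence in derivatives generated this way, one can compare the gradient against central finite differences at the current iterate (a quick sketch; the step size h is illustrative, and f, f_gradient, x_n, A, b, c are assumed to come from the script above):

```matlab
% Central finite-difference check of the gradient (illustrative sketch).
% Requires x_n to be strictly feasible, i.e. all(b - A*x_n > 0).
h = 1e-6;
g_fd = zeros(n,1);
for k = 1:n
    e = zeros(n,1); e(k) = h;
    g_fd(k) = (f(x_n + e, A, b, c) - f(x_n - e, A, b, c)) / (2*h);
end
norm(g_fd - f_gradient(x_n, A, b, c)) % should be small, of order h^2
```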
2 Comments
Torsten
on 4 Dec 2024 at 18:55
"Is it necessary to use symbolic matrices as part of the solution?"
No. But it's the simplest way to get the gradient and Hessian without errors.
If this was just a test problem and the dimensions (m,n) of your "real" problem are much larger, you should figure out where you made a mistake in computing the gradient and/or Hessian, in order to avoid the symbolic preprocessing.
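For reference, the derivatives of f(x) = c'*x - sum(log(b - A*x)) can also be written analytically, without the Symbolic Math Toolbox (a sketch; the residual b - A*x must stay strictly positive for these to be defined):

```matlab
% Analytic gradient and Hessian of f(x) = c'*x - sum(log(b - A*x)):
%   grad f(x) = c + A' * (1 ./ (b - A*x))           (n-by-1 column)
%   hess f(x) = A' * diag(1 ./ (b - A*x).^2) * A    (n-by-n, positive definite
%                                                    on the feasible set)
f_gradient = @(x) c + A' * (1 ./ (b - A*x));
f_hessian  = @(x) A' * diag(1 ./ (b - A*x).^2) * A;
```

Note that the gradient is c (a column vector) plus a column vector; writing c' instead would trigger implicit expansion and turn the result into an n-by-n matrix.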