Luke
03-31-2008, 05:56 AM
Hi!
I have a problem that is described in the NR textbook as a characteristic of Levenberg-Marquardt methods, and I have been looking into ways around it. My problem is this:
When I use Levenberg-Marquardt to do a non-linear least-squares fit, the method converges nicely until it comes quite close to the minimum. There it just seems to hop from point to point around the minimum, without ever converging further. It doesn't diverge either. For my purposes, however, it hasn't converged far enough.
The textbook explains that this is due to the near-degeneracy of the minimum, since the Levenberg-Marquardt method with (near) zero lambda is a generalisation of the method of normal equations.
Has someone else encountered and overcome this problem? Or does someone know of approaches that DON'T work, to save me some work?
Now, I do have an idea for a possible solution, and I'd greatly appreciate a mathematician's or experienced NR programmer's opinion on it. I'm sorry for not strictly adhering to the notation in the textbook, but it would otherwise be impossible to post this on this forum.
What we are essentially doing for lambda = 0 is minimising |A(x) - b|^2, or |∆b|^2, where ∆b = b - A(x) is the current residual. Defining the Jacobian of A(x) as:
J=∂A(x)/∂x
we solve for an increment to the solution ∆x, according to equations (15.5.9) and (15.5.11), by solving
J^T * J * ∆x = J^T * ∆b.
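For concreteness, here is a minimal sketch of this step in Python/numpy (not the NR code; the toy exponential model, the function names, and the fixed lambda are all my own illustrative choices). With lambda = 0 the solve below reduces to exactly the normal equations above:

```python
import numpy as np

def lm_step(J, db, lam):
    """One Levenberg-Marquardt step:
    solve (J^T J + lam * diag(J^T J)) dx = J^T db.
    lam = 0 gives the plain Gauss-Newton normal equations."""
    JTJ = J.T @ J
    A = JTJ + lam * np.diag(np.diag(JTJ))  # Marquardt's diagonal damping
    return np.linalg.solve(A, J.T @ db)

# Toy problem: fit y = a * exp(b * t); true parameters a = 2, b = -1.
t = np.linspace(0.0, 2.0, 20)
y = 2.0 * np.exp(-1.0 * t)

def residual(p):
    # db = b - A(x): data minus model
    return y - p[0] * np.exp(p[1] * t)

def jacobian(p):
    # Columns are d(model)/da and d(model)/db, so that J * dx ≈ db
    return np.column_stack([np.exp(p[1] * t),
                            p[0] * t * np.exp(p[1] * t)])

p = np.array([1.5, -0.8])      # starting guess
for _ in range(10):
    p = p + lm_step(jacobian(p), residual(p), lam=1e-3)
```

(A real implementation would of course adapt lambda per iteration rather than keep it fixed.)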
This makes it clear that, for lambda = 0, we are really solving the normal equations. Since the degeneracy of the minimum in linear least squares is circumvented by using the Singular Value Decomposition, would that not be possible here as well? That is, apply SVD directly to the equation:
J * ∆x = ∆b.
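To sketch what I have in mind (my own illustrative code, not NR's): solve J * ∆x = ∆b by SVD, zeroing singular values below a relative cutoff, exactly as in the linear least-squares chapter. On a near-degenerate Jacobian, where forming J^T J would be numerically singular, this still returns a well-behaved, minimum-norm step:

```python
import numpy as np

def svd_step(J, db, rcond=1e-8):
    """Least-squares solution of J * dx = db via SVD.
    Singular values below rcond * max(s) are treated as zero,
    editing out the near-degenerate directions."""
    U, s, Vt = np.linalg.svd(J, full_matrices=False)
    s_inv = np.where(s > rcond * s.max(), 1.0 / s, 0.0)
    return Vt.T @ (s_inv * (U.T @ db))

# Near-degenerate Jacobian: the two columns are almost identical,
# so J^T J is singular to machine precision.
J = np.array([[1.0, 1.0],
              [1.0, 1.0 + 1e-13],
              [1.0, 1.0 - 1e-13]])
db = np.array([1.0, 1.0, 1.0])

dx = svd_step(J, db)   # minimum-norm step along the non-degenerate direction
```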
Does this change the "flat valley of complicated topography" (p. 690, NR 2nd Ed.) to simply a "flat valley"? Will this remove the hopping around the minimum?
I thought I'd post my question before trying it myself; I'll keep the thread updated on any successes or failures. Any other suggestions that might let the method converge further are very welcome!
Kind regards,
Luuk