Optimization algorithms require a gradient function
Morten Niels
09-01-2008, 05:59 PM
Hi, I want to optimize a function,
but I don't have a gradient function.
Do I need to use the simple Nelder-Mead method,
or is there some way of calculating the gradient?
thanks in advance
Bill Press
09-02-2008, 08:56 AM
Quasi-Newton (variable metric) methods can work well using derivatives calculated by finite differences, where you only need to be able to calculate the function, not its gradient. This is discussed in detail, and implemented, in NR Third Edition, section 10.9.1. In our experience, this method often involves less total computation than one of the other methods that avoids analytic derivatives, such as Powell's method.
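Just to illustrate the finite-difference idea, here is a rough sketch of my own (not the routine from the book; the names numGrad and rosenbrock are made up for the example). Each gradient component costs two extra function evaluations, and the resulting gradient can be handed to whatever minimizer you like. I drive it with a crude steepest-descent loop only because that fits in a few lines; the quasi-Newton routine in section 10.9.1 would use the same gradient but converge much faster.

// Sketch only: central-difference gradient for a function whose analytic
// gradient is unavailable.  Not the NR code; names are illustrative.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

// Example objective: the Rosenbrock function, minimum at (1,1).
double rosenbrock(const std::vector<double> &x) {
    double a = 1.0 - x[0];
    double b = x[1] - x[0] * x[0];
    return a * a + 100.0 * b * b;
}

// Central-difference gradient.  The step h ~ cbrt(eps)*max(|x_i|,1)
// balances truncation error against roundoff for a two-sided difference.
template <class F>
std::vector<double> numGrad(F f, std::vector<double> x) {
    const double eps = std::cbrt(2.2e-16);
    std::vector<double> g(x.size());
    for (std::size_t i = 0; i < x.size(); ++i) {
        double h = eps * std::max(std::fabs(x[i]), 1.0);
        double xi = x[i];
        x[i] = xi + h; double fp = f(x);
        x[i] = xi - h; double fm = f(x);
        x[i] = xi;                      // restore the coordinate
        g[i] = (fp - fm) / (2.0 * h);
    }
    return g;
}

int main() {
    // Crude steepest descent with a grow/shrink step, purely to show the
    // finite-difference gradient in use; a variable-metric (quasi-Newton)
    // method would reach the minimum in far fewer iterations.
    std::vector<double> x = {-1.2, 1.0};
    double step = 1e-3;
    for (int it = 0; it < 20000; ++it) {
        std::vector<double> g = numGrad(rosenbrock, x);
        std::vector<double> xNew = x;
        for (std::size_t i = 0; i < x.size(); ++i) xNew[i] -= step * g[i];
        if (rosenbrock(xNew) < rosenbrock(x)) { x = xNew; step *= 1.1; }
        else                                  { step *= 0.5; }
    }
    std::printf("x = (%g, %g), f = %g\n", x[0], x[1], rosenbrock(x));
    return 0;
}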
HTH!
kutta
10-06-2008, 08:18 AM
I am having a very basic problem clearing the snags in my newly written simulated annealing code in C++, which is closely based on its origin in NR. Surprisingly, the same code could even be adapted toward the remedy the author (Bill Press) suggested above.
Before it is usable, though, I would ask whether one of your specialists could help make it error-free, tutoring me at the same time so I can get a concrete output.
Once again, my thanks. I had already posted this problem in the Java section; I am bringing it here so the simple mistakes can be removed first before attempting it in any other language. I hope the author can provide the needed help.
Thanks
As
:)