[Insight-users] Re: itkGradientDescentOptimizer

Luis Ibanez luis.ibanez@kitware.com
Mon, 28 Oct 2002 09:59:23 -0500


Hi Digvijay,

Almost any optimization method requires
you to start close to the final solution.
Of course, some are more robust than others
when you start far away, and some are faster
than others... that's where you have to choose
and tune the method to the characteristics
of your particular problem.

You will find the same problem in image
registration. It is hopeless to start an
optimization method very far from the expected
solution, unless you use some sort of
evolutionary algorithm and are in no hurry
to get the answer back.
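
For instance, in the registration framework
the starting point is passed in before running
the method. A minimal sketch, assuming
itk::ImageRegistrationMethod with a 2D
TranslationTransform; the (5,5) offset is
made up, and the metric, optimizer,
interpolator and images are omitted:

#include "itkImage.h"
#include "itkImageRegistrationMethod.h"
#include "itkTranslationTransform.h"

int main()
{
  typedef itk::Image< float, 2 >                    ImageType;
  typedef itk::ImageRegistrationMethod< ImageType,
                                        ImageType > RegistrationType;
  typedef itk::TranslationTransform< double, 2 >    TransformType;

  RegistrationType::Pointer registration = RegistrationType::New();
  TransformType::Pointer    transform    = TransformType::New();

  registration->SetTransform( transform );
  // ... metric, optimizer, interpolator and the
  // fixed/moving images would be set up here ...

  // Seed the search near the expected solution,
  // e.g. from a center-of-mass computation or a
  // user-supplied guess. The (5,5) offset is
  // purely illustrative.
  RegistrationType::ParametersType initial(
                     transform->GetNumberOfParameters() );
  initial[0] = 5.0;   // initial offset in x
  initial[1] = 5.0;   // initial offset in y
  registration->SetInitialTransformParameters( initial );

  return 0;
}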

The point here is "how close" you have
found that the "InitialParameters" must be
to the final parameters...

In a gradient descent approach, you can
start quite far from the final solution as
long as the cost function to be optimized is
monotonic between the initial point and the
final point. Otherwise you will be trapped
in secondary extrema. If it happens that
the cost function is actually populated
with secondary extrema, then you will have
to smooth the function or use an evolutionary
algorithm (for which you have many choices).
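
Just to make the role of the initial position
concrete, here is a minimal sketch (not from
your problem) that drives
itk::GradientDescentOptimizer on a toy
paraboloid; the cost function class, learning
rate and iteration count are illustrative
assumptions:

#include "itkGradientDescentOptimizer.h"
#include "itkSingleValuedCostFunction.h"
#include <iostream>

// Cost = (p0-3)^2 + (p1+4)^2 : a single minimum
// at (3,-4) and no secondary extrema, so the
// function is monotonic along the descent path
// from any starting point.
class ParaboloidCostFunction : public itk::SingleValuedCostFunction
{
public:
  typedef ParaboloidCostFunction        Self;
  typedef itk::SingleValuedCostFunction Superclass;
  typedef itk::SmartPointer<Self>       Pointer;
  itkNewMacro( Self );
  itkTypeMacro( ParaboloidCostFunction, SingleValuedCostFunction );

  MeasureType GetValue( const ParametersType & p ) const
  {
    return ( p[0] - 3.0 ) * ( p[0] - 3.0 )
         + ( p[1] + 4.0 ) * ( p[1] + 4.0 );
  }

  void GetDerivative( const ParametersType & p,
                      DerivativeType & d ) const
  {
    d = DerivativeType( 2 );
    d[0] = 2.0 * ( p[0] - 3.0 );
    d[1] = 2.0 * ( p[1] + 4.0 );
  }

  unsigned int GetNumberOfParameters() const { return 2; }
};

int main()
{
  typedef itk::GradientDescentOptimizer OptimizerType;
  OptimizerType::Pointer optimizer = OptimizerType::New();

  ParaboloidCostFunction::Pointer costFunction =
                                  ParaboloidCostFunction::New();

  optimizer->SetCostFunction( costFunction.GetPointer() );
  optimizer->MaximizeOff();                 // minimize (the default)
  optimizer->SetLearningRate( 0.2 );        // tuning values are
  optimizer->SetNumberOfIterations( 100 );  // problem-dependent

  OptimizerType::ScalesType scales( 2 );
  scales.Fill( 1.0 );                       // no parameter scaling
  optimizer->SetScales( scales );

  // Any initial position works for this convex
  // toy function; with secondary extrema it would
  // have to lie in the basin of the true optimum.
  OptimizerType::ParametersType initial( 2 );
  initial[0] = 10.0;
  initial[1] = 10.0;
  optimizer->SetInitialPosition( initial );

  optimizer->StartOptimization();

  std::cout << "Final parameters: "
            << optimizer->GetCurrentPosition() << std::endl;
  return 0;
}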


Could you please give us an idea of the
numerical values you are dealing with, as
well as the general setup of the problem?


Thanks


    Luis



================================================

digvijay singh wrote:

 > hi luis!!
 > I have a doubt regarding the setting up of
 > "trueparameters". It requires that the final
 > values of the parameters be known within a
 > certain accuracy range... isn't that a fallacy
 > in itself, that the results should already be
 > known when they are what the optimization is
 > supposed to obtain? let me know
 > thanks
 > digvijay