[Insight-users] Re: About optimizers

Luis Ibanez luis.ibanez at kitware.com
Wed Dec 28 09:16:36 EST 2005


Hi Ionut,

Welcome to ITK!

Please note that ITK offers a good set
of optimizers for you to choose from.


You will find the full list in the Doxygen documentation:

http://www.itk.org/Insight/Doxygen/html/de/de3/group__Optimizers.html

    - AmoebaOptimizer
    - ConjugateGradientOptimizer
    - FRPROptimizer
    - CumulativeGaussianOptimizer
    - GradientDescentOptimizer
    - LBFGSOptimizer
    - LBFGSBOptimizer
    - LevenbergMarquardtOptimizer
    - OnePlusOneEvolutionaryOptimizer
    - PowellOptimizer
    - RegularStepGradientDescentOptimizer
    - SPSAOptimizer
    - VersorTransformOptimizer
    - VersorRigid3DTransformOptimizer
    - QuaternionRigidTransformGradientDescentOptimizer



Optimizers are also listed in the ITK Software Guide:

       http://www.itk.org/ItkSoftwareGuide.pdf

In Section 8.11, PDF pages 458-468.
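
For concreteness, here is a minimal sketch of how an optimizer is
selected and connected to a registration method. It assumes the
ITK 2.x API of the time; the step lengths and iteration count are
illustrative values only, not recommendations:

    #include "itkImage.h"
    #include "itkImageRegistrationMethod.h"
    #include "itkRegularStepGradientDescentOptimizer.h"

    int main()
    {
      typedef itk::Image< float, 2 >                   ImageType;
      typedef itk::ImageRegistrationMethod<
                          ImageType, ImageType >       RegistrationType;
      typedef itk::RegularStepGradientDescentOptimizer OptimizerType;

      RegistrationType::Pointer registration = RegistrationType::New();
      OptimizerType::Pointer    optimizer    = OptimizerType::New();

      // The step length starts large and is halved every time the
      // metric derivative changes direction; iteration stops once
      // it drops below the minimum step length.
      optimizer->SetMaximumStepLength( 4.00 );  // illustrative only
      optimizer->SetMinimumStepLength( 0.01 );
      optimizer->SetNumberOfIterations( 200 );

      // Trying another optimizer from the list above is just a
      // matter of connecting a different object here.
      registration->SetOptimizer( optimizer );

      // The metric, transform, interpolator and images must still
      // be connected before calling StartRegistration().

      return 0;
    }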


--


Your observation is correct:



          Conjugate Gradient Descent is

                  "in theory"

           better than Gradient Descent.




This is a widely known "theoretical fact".



However,

Conjugate Gradient is based on the assumption that you have
well-behaved second derivatives, which is only true in artificial
textbook examples, such as the idealized ellipses on the web page
that you cite.
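
(For reference, the textbook result behind that claim: Conjugate
Gradient is derived for a quadratic objective

      f(x) = (1/2) x^T A x - b^T x,   A symmetric positive definite,

for which it converges in at most n steps, n being the number of
parameters, at a rate governed by the condition number of the
Hessian A. Real image metrics provide no such well-behaved Hessian.)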


In practice,
we don't even have nice first derivatives. Most image metrics are
extremely noisy, as you can verify yourself by doing the exercise
recommended in the ITK Software Guide, Section 8.10.1, PDF pages
448-451. Once you perform this exercise with your real images and
your Metric of choice, it will become clear to you why Conjugate
Gradient descent is rarely used in image registration.
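
A minimal sketch of that exercise, for the common case of sweeping
a translation along X and printing the Metric value at each offset.
It assumes the ITK 2.x API; the MeanSquares metric, the file names
and the translation range are illustrative choices only:

    #include "itkImage.h"
    #include "itkImageFileReader.h"
    #include "itkMeanSquaresImageToImageMetric.h"
    #include "itkTranslationTransform.h"
    #include "itkLinearInterpolateImageFunction.h"
    #include <iostream>

    int main( int argc, char * argv[] )
    {
      typedef itk::Image< float, 2 >                  ImageType;
      typedef itk::ImageFileReader< ImageType >       ReaderType;
      typedef itk::MeanSquaresImageToImageMetric<
                          ImageType, ImageType >      MetricType;
      typedef itk::TranslationTransform< double, 2 >  TransformType;
      typedef itk::LinearInterpolateImageFunction<
                          ImageType, double >         InterpolatorType;

      ReaderType::Pointer fixedReader  = ReaderType::New();
      ReaderType::Pointer movingReader = ReaderType::New();
      fixedReader->SetFileName(  argv[1] );
      movingReader->SetFileName( argv[2] );
      fixedReader->Update();
      movingReader->Update();

      MetricType::Pointer       metric       = MetricType::New();
      TransformType::Pointer    transform    = TransformType::New();
      InterpolatorType::Pointer interpolator = InterpolatorType::New();

      metric->SetFixedImage(  fixedReader->GetOutput() );
      metric->SetMovingImage( movingReader->GetOutput() );
      metric->SetTransform( transform );
      metric->SetInterpolator( interpolator );
      metric->SetFixedImageRegion(
                 fixedReader->GetOutput()->GetBufferedRegion() );
      metric->Initialize();

      // Sweep the X translation and print one (offset, value) pair
      // per line. Plotting this output shows the noise and local
      // minima that derivative-based optimizers must cope with.
      MetricType::TransformParametersType
                 parameters( transform->GetNumberOfParameters() );
      parameters[1] = 0.0;  // keep the Y translation fixed
      for( double tx = -10.0; tx <= 10.0; tx += 0.25 )
        {
        parameters[0] = tx;
        std::cout << tx << "   "
                  << metric->GetValue( parameters ) << std::endl;
        }

      return 0;
    }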



You should also take a look at all the Metric plots that are presented
in the ITK Software Guide in the "Image Registration" chapter.


See for example:

    Figure 8.12
    Figure 8.14
    Figure 8.19
    Figure 8.26
    Figure 8.29
    Figure 8.32
    Figure 8.35
    Figure 8.39
    Figure 8.41
    Figure 8.46


You will find that they are far from what we could call "Smooth 
Monotonic Functions".




This, again, confirms how important it is to perform practical
experiments instead of resting on theoretical assumptions. It is the
very reason why Open Access Journals are the only "Scientific
Journals": only they make it possible to verify practical results.
Any other journal that does not provide source code, data, and
parameters along with its papers is simply a "Vanity Journal", where
authors can make empty claims, such as "Conjugate Gradient is better
than Gradient Descent", and then engage in pointless discussions with
reviewers who make similar claims.



Please make sure that when you say that one thing is better than
another, you support that claim with an experiment that can be
repeated by others.


You will find that such claims only hold for a very specific set of
conditions, which in the case of Image Registration will include:


   - The Image Metric being used as cost function
   - The Images: Modality, Resolution, Intensity distribution...
   - The Transform
   - The Interpolator of choice


as well as the specific parameters that you set in every one
of these components.
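
In code, those conditions are exactly the components that get
connected to the registration method. A sketch, again assuming the
ITK 2.x API, with one illustrative choice for each component:

    #include "itkImage.h"
    #include "itkImageRegistrationMethod.h"
    #include "itkMattesMutualInformationImageToImageMetric.h"
    #include "itkTranslationTransform.h"
    #include "itkLinearInterpolateImageFunction.h"
    #include "itkRegularStepGradientDescentOptimizer.h"

    typedef itk::Image< float, 2 >            ImageType;
    typedef itk::ImageRegistrationMethod<
                        ImageType, ImageType > RegistrationType;

    void ConfigureExperiment( RegistrationType * registration )
    {
      // Every one of these four choices, plus every parameter set
      // on it, is part of the experimental conditions and must be
      // reported for the experiment to be repeatable.
      registration->SetMetric(
        itk::MattesMutualInformationImageToImageMetric<
                        ImageType, ImageType >::New() );
      registration->SetTransform(
        itk::TranslationTransform< double, 2 >::New() );
      registration->SetInterpolator(
        itk::LinearInterpolateImageFunction<
                        ImageType, double >::New() );
      registration->SetOptimizer(
        itk::RegularStepGradientDescentOptimizer::New() );

      // The images themselves (modality, resolution, intensity
      // distribution) complete the conditions:
      //   registration->SetFixedImage(  ... );
      //   registration->SetMovingImage( ... );
    }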


It is sad to see how much the corrupt mandate of "Publish or Perish"
has destroyed our critical thinking.



   Regards,



      Luis




---------------------
ionut iorgovan wrote:
> 
> I'm new to the optimizer world.
> It seems that your steepest descent method is much weaker than
> conjugate gradient methods (the LBFGS optimizer is part of this
> category) for BSpline transforms.
>  
> http://www.tcm.phy.cam.ac.uk/~pdh1001/thesis/node57.html
> 


