[Insight-users] Regular Step Gradient Descent optimizer

Luis Ibanez luis.ibanez at kitware.com
Wed May 3 12:21:38 EDT 2006


Hi Ming,

The sad reality of image registration is that the
cost functions (Image Metrics) that we optimize
are most of the time far from being smooth
(i.e. nicely differentiable).

As a consequence, it is not realistic to expect that
different optimizers will lead to similar results.
It is not even the case that the same optimizer will
arrive at the same results if you tweak its parameters.

That's actually one of the reasons why the registration
framework was implemented modularly. This approach makes it
possible to select different implementations of each component,
for example optimizers, metrics, and interpolators.

What is very dangerous in image registration is to fall into the
temptation of drawing conclusions such as "Optimizer X is better
than Optimizer Y", or "Metric A is better than Metric B". Such
generalizations are usually unfounded because optimizers and
metrics have many parameters that need to be fine-tuned for
the specific conditions of the image registration problem at
hand.

It is rarely the case that we will have the time to fully
explore all possible combinations of such parameters.
In practice we follow heuristics and fine-tune the parameters
by trial and error, with some guidance derived from the trace
output of the optimizer iterations.
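
A convenient way to obtain such a trace is to attach an Observer to
the optimizer that prints the iteration number, the metric value and
the current parameters at every iteration. The following is only a
minimal sketch (the class name and the print format are illustrative,
following the Command/Observer pattern used in the Software Guide
registration examples):

    #include <iostream>
    #include "itkCommand.h"
    #include "itkRegularStepGradientDescentOptimizer.h"

    // Prints iteration number, metric value and current parameters,
    // so the trace can be redirected to a file and plotted later.
    class CommandIterationUpdate : public itk::Command
    {
    public:
      typedef CommandIterationUpdate   Self;
      typedef itk::Command             Superclass;
      typedef itk::SmartPointer<Self>  Pointer;
      itkNewMacro( Self );

      typedef itk::RegularStepGradientDescentOptimizer  OptimizerType;
      typedef const OptimizerType *                     OptimizerPointer;

      void Execute( itk::Object * caller, const itk::EventObject & event )
        {
        Execute( (const itk::Object *) caller, event );
        }

      void Execute( const itk::Object * object, const itk::EventObject & event )
        {
        OptimizerPointer optimizer = dynamic_cast< OptimizerPointer >( object );
        if( !itk::IterationEvent().CheckEvent( &event ) )
          {
          return;
          }
        std::cout << optimizer->GetCurrentIteration() << "   "
                  << optimizer->GetValue() << "   "
                  << optimizer->GetCurrentPosition() << std::endl;
        }

    protected:
      CommandIterationUpdate() {}
    };

    // Attach it to the optimizer before starting the registration:
    //   CommandIterationUpdate::Pointer observer = CommandIterationUpdate::New();
    //   optimizer->AddObserver( itk::IterationEvent(), observer );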

The fact that the two registration methods are not arriving
at the same result in your case should not lead to the conclusion
that Method A is better than Method B. Instead, it should lead
you to conclude that the parameters of Method A have not yet been
set to the values that are most appropriate for the image
registration problem that you are trying to solve.

In the effort to find such parameters, one fundamental
exercise is the characterization of the noise in the image metric.
This is illustrated in several places in the ITK Software Guide.

You may want to perform such an analysis on the pair of images
that you are registering and for the specific image metric
that you are using.
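
One way to do this, without running any optimizer at all, is to
evaluate the metric directly over a range of transform parameters and
plot the result. Below is a minimal sketch in the spirit of the metric
exploration examples of the Software Guide, using the same classes you
already have in your program (the file names taken from argv, the float
pixel type and the probed translation range are placeholders to be
adapted to your data):

    #include <iostream>
    #include "itkImage.h"
    #include "itkImageFileReader.h"
    #include "itkTranslationTransform.h"
    #include "itkBSplineInterpolateImageFunction.h"
    #include "itkNormalizedCorrelationImageToImageMetric.h"

    int main( int argc, char * argv[] )
    {
      const unsigned int Dimension = 3;
      typedef itk::Image< float, Dimension >                  ImageType;
      typedef itk::ImageFileReader< ImageType >               ReaderType;
      typedef itk::TranslationTransform< double, Dimension >  TransformType;
      typedef itk::BSplineInterpolateImageFunction<
                                 ImageType, double >          InterpolatorType;
      typedef itk::NormalizedCorrelationImageToImageMetric<
                                 ImageType, ImageType >       MetricType;

      ReaderType::Pointer fixedReader  = ReaderType::New();
      ReaderType::Pointer movingReader = ReaderType::New();
      fixedReader->SetFileName(  argv[1] );   // fixed image file
      movingReader->SetFileName( argv[2] );   // moving image file
      fixedReader->Update();
      movingReader->Update();

      TransformType::Pointer    transform    = TransformType::New();
      InterpolatorType::Pointer interpolator = InterpolatorType::New();
      MetricType::Pointer       metric       = MetricType::New();

      metric->SetFixedImage(  fixedReader->GetOutput() );
      metric->SetMovingImage( movingReader->GetOutput() );
      metric->SetTransform(    transform );
      metric->SetInterpolator( interpolator );
      metric->SetFixedImageRegion(
        fixedReader->GetOutput()->GetBufferedRegion() );
      metric->Initialize();

      // Evaluate the metric along the Z translation, X and Y held at zero,
      // and print "translation   metric value" pairs for plotting.
      MetricType::TransformParametersType parameters(
        transform->GetNumberOfParameters() );
      parameters.Fill( 0.0 );
      for( double tz = -10.0; tz <= 10.0; tz += 0.25 )
        {
        parameters[2] = tz;
        std::cout << tz << "   " << metric->GetValue( parameters ) << std::endl;
        }
      return 0;
    }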

Once you are familiar with the characteristics of the Metric
landscape, you will be in a better position to make educated
decisions about the parameters to be used in the optimizer,
or even about which optimizer to apply to the problem.


    Regards,



        Luis



================
Ming Chao wrote:
> Hi Luis,
> Thanks for your prompt reply. Yes, I did set the optimizer (Regular Step 
> Gradient Descent optimizer) in the following way, which I forgot to 
> include in my last posting.
>  
>  optimizer->SetMaximumStepLength( 0.50000 );
>  optimizer->SetMinimumStepLength( 0.01 );
>  optimizer->SetNumberOfIterations( 200 );
>  optimizer->SetGradientMagnitudeTolerance( 
> 0.01*optimizer->GetGradientMagnitudeTolerance() );
>  
> Later on I tried to increase the step lengths as:
>  
>  optimizer->SetMaximumStepLength( 1.000 );
>  optimizer->SetMinimumStepLength( 0.05 );
>  
> And I had the following metric output:
>  
> 0   -0.835581   [-0.0321978, -0.466639, 0.883861]
> 1   -0.840282   [0.399061, -0.885096, 1.68318]
> 2   -0.837371   [0.259052, -0.883767, 2.67333]
> 3   -0.824043   [0.75875, -0.889684, 2.68965]
> 4   -0.829629    [0.55809, -0.881664, 2.83855]
> 5   -0.82694   [0.679164, -0.884414, 2.80759]
> 6   -0.828347   [0.623941, -0.882636, 2.8368]
> 7   -0.827662   [0.566916, -0.881064, 2.86234]
>  
> The only change is the step length, but the results are similar to the 
> previous ones. By the way, for the LBFGSB optimizer I used the following 
> conditions:
>  
>       // (1) LBFGSB optimizer
>    OptimizerType::BoundSelectionType boundSelect( 
> transform->GetNumberOfParameters() );
>    OptimizerType::BoundValueType upperBound( 
> transform->GetNumberOfParameters() );
>    OptimizerType::BoundValueType lowerBound( 
> transform->GetNumberOfParameters() );
>    boundSelect.Fill(  0 );
>    upperBound.Fill(  10.0 );
>    lowerBound.Fill( -10.0 );
>    optimizer->SetBoundSelection( boundSelect );
>    optimizer->SetUpperBound( upperBound );
>    optimizer->SetLowerBound( lowerBound );
>    optimizer->SetMaximumNumberOfEvaluations( 200 );
>    optimizer->SetMaximumNumberOfCorrections( 200 );
>  
> Here I am not clear what you meant by best settings. I thought the 
> conditions I provided were reasonable. Do you see anything obviously 
> different which leads to different results?
>  
> Cheers,
> Ming
>  
> On 5/2/06, *Luis Ibanez* <luis.ibanez at kitware.com 
> <mailto:luis.ibanez at kitware.com>> wrote:
> 
> 
>     Hi Ming,
> 
> 
>     Did you set the optimizer to do Minimization or Maximization ?
>     That is, did you use any of the following:
> 
> 
>           optimizer->MaximizeOn()  ?
>           optimizer->MaximizeOff()  ?
>           optimizer->MinimizeOn()  ?
>           optimizer->MinimizeOff()  ?
> 
>     How much overlap did the images have at the end of the run
>     with the RegularStepNormalized Correlation ?
> 
>     Note that the last iterations of the run with
>     RegularStepGradientDescent are advancing at very
>     small steps.
> 
>     You may want to start the optimizer with a larger initial
>     value of the StepLength, and to change the relaxation factor
>     to 0.7 or 0.9 instead of the default value of 0.5.
> 
>     In this way, the step length will change to 0.7 of the previous
>     value every time that the gradient changes direction.
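> 
>     As a minimal sketch, assuming the usual setters of the
>     RegularStepGradientDescentOptimizer (the values here are
>     only examples to be tuned for your problem):
> 
>           optimizer->SetMaximumStepLength( 4.00 );  // larger initial step
>           optimizer->SetMinimumStepLength( 0.01 );
>           optimizer->SetRelaxationFactor( 0.70 );   // default is 0.5
>           optimizer->SetNumberOfIterations( 200 );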
> 
>     You will get a lot of insight about the registration process
>     by plotting the trace of the translation in 3D. This will
>     show you how much the Transform is changing at every iteration.
>     This is not clearly conveyed just by looking at the numbers.
> 
>     You will find the 2D version of many of these types of plots
>     in the ITK Software Guide,
> 
>         http://www.itk.org/ItkSoftwareGuide.pdf
> 
> 
>     Note that it is not surprising that you get different
>     results from different optimizers, but before you attempt
>     to compare the results you should make sure that you are
>     actually using both optimizers in their best settings and
>     conditions. Otherwise it is just a biased and unfair
>     comparison.
> 
> 
> 
>       Regards,
> 
> 
>          Luis
> 
> 
> 
>     =================
>     Ming Chao wrote:
>      > Hi,
>      >
>      > When I used the Regular Step Gradient Descent optimizer to register
>      > two images I saw an abnormal behavior. The metric value first
>      > decreased but after some iterations it became larger. See the
>      > following output:
>      >
>      > 0   -0.835581   [-0.00321978, -0.0466639, 0.0883861]
>      > 1   -0.836504   [-0.00422488, -0.0933436, 0.176817]
>      > 2   -0.83737    [-0.00293453, -0.139983, 0.265265]
>      > 3   -0.838174   [0.000731481, -0.18652, 0.353701]
>      > 4   -0.838908   [0.00685175, -0.232884, 0.442092]
>      > 5   -0.839563   [0.015502, -0.278993, 0.530404]
>      > 6   -0.840129   [0.0267537, -0.32476, 0.618602]
>      > 7   -0.840598   [0.0406717, -0.370081, 0.706649]
>      > 8   -0.84096   [0.0573129, -0.414844, 0.794509]
>      > 9   -0.84121   [0.0767229, -0.458919, 0.882148]
>      > 10   -0.841343   [0.0989338, -0.50216, 0.969537]
>      > 11   -0.841358   [0.123961, -0.544401, 1.05665]
>      > 12   -0.841256   [0.151799, -0.58545, 1.14349]
>      > 13   -0.841041   [0.182419, -0.625084, 1.23004]
>      > 14   -0.840719   [0.215761, -0.663034, 1.31634]
>      > 15   -0.840298   [0.214669, -0.680096, 1.41487]
>      > 16   -0.839375   [0.216594, -0.696938, 1.51342]
>      > 17   -0.838339   [0.221686, -0.713544, 1.6119]
>      > 18   -0.837215   [0.230089, -0.729893, 1.7102]
>      > 19   -0.836032   [0.241937, -0.74596, 1.80819]
>      > 20   -0.83482   [0.257347, -0.76171, 1.90573]
>      > 21   -0.833613   [0.276409, -0.7771, 2.00268]
>      > 22   -0.832445   [0.299184, -0.792076, 2.09889]
>      > 23   -0.831349   [0.325688, -0.806571, 2.19422]
>      > 24   -0.830357   [0.35589, -0.820502, 2.28853]
>      > 25   -0.829494   [0.389705, -0.833768, 2.3817]
>      > 26   -0.828783    [0.426981, -0.84624, 2.47365]
>      > 27   -0.828236   [0.467495, -0.857749, 2.56435]
>      > 28   -0.827859   [0.510938, -0.868055, 2.65383]
>      > 29   -0.827633   [0.556896, -0.876774, 2.74221]
>      > 30   -0.827525   [0.60475, -0.883034, 2.8298]
>      > 31   -0.827502   [0.650812, -0.875321, 2.91822]
>      > 32   -0.827512   [0.629235, -0.881114, 2.87349]
>      > 33   -0.827519   [0.603546, -0.883645, 2.83067]
>      > 34   -0.827483   [0.620407, -0.880267, 2.84881]
>      > 35   -0.827564   [0.608674, -0.882388, 2.84506]
>      > 36   -0.827459   [0.613038, -0.88246, 2.84059]
>      > 37   -0.827529   [0.610107, -0.882339, 2.84166]
>      > 38   -0.827493    [0.611332, -0.882386, 2.8407]
>      >
>      > However, if I change the optimizer to the LBFGSB optimizer, I got the
>      > following output:
>      >
>      > 0   -0.840282   [-0.0321978, -0.466639, 0.883861]
>      > 1   -0.840732   [0.0493198, -0.53362, 1.01199]
>      > 2   -0.840735   [0.0592661, -0.540774, 1.03432]
>      > 3   -0.840751   [0.100359, -0.570829, 1.11035]
>      >
>      > This looks reasonable. The setup for the registration is the
>      > following:
>      >
>      >
>      >    typedef itk::BSplineInterpolateImageFunction<ImageType, double >
>      > InterpolatorType;
>      >   typedef itk::ImageRegistrationMethod<ImageType, ImageType >
>      > RegistrationType;
>      >
>      >   typedef
>      > itk::NormalizedCorrelationImageToImageMetric<ImageType,ImageType >
>      > MetricType;
>      >
>      >     typedef itk::TranslationTransform< double, Dimension >
>      > TransformType;
>      >
>      > Can anybody tell me why I have so different results with different
>      > optimizers?
>      >
>      > Thanks,
>      >
>      > Ming
>      >
>      >
>      >
>      >
>      >
>     ------------------------------------------------------------------------
> 
>      >
>      > _______________________________________________
>      > Insight-users mailing list
>      > Insight-users at itk.org <mailto:Insight-users at itk.org>
>      > http://www.itk.org/mailman/listinfo/insight-users
> 
> 
> 



