[Insight-users] Doubt about scales parameters in registration

Luis Ibanez luis.ibanez at kitware.com
Tue May 2 10:41:18 EDT 2006


Hi Jerome,


The way the parameter scaling is used depends on the strategy of
each specific optimizer.


In all cases, however, the goal is to make the dynamic ranges of
the different dimensions of the parametric space uniform.



You will see that in the case of the GradientDescent optimizers,
the scale parameters are used for "dividing" the gradient values,


   e.g.: itkRegularStepGradientDescentBaseOptimizer.cxx: line 205


which results in shorter steps being taken along the parameters
that have high scaling values, and longer steps along the parameters
that have low scaling values. In the typical case of a rigid
transform, this means that you want to put high scaling values on
the rotation parameters and low values on the translation parameters.
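
To illustrate the idea, here is a rough sketch of what the scaling
does to the gradient (this is not the actual ITK source, and the
helper function name is just made up for illustration):

   #include "itkArray.h"

   void ScaleGradient( const itk::Array<double> & gradient,
                       const itk::Array<double> & scales,
                       itk::Array<double> & transformedGradient )
   {
     // Dividing by the scale means that a parameter with a high scale
     // value receives a smaller update, and a parameter with a low
     // scale value receives a larger update.
     for( unsigned int i = 0; i < gradient.Size(); i++ )
       {
       transformedGradient[i] = gradient[i] / scales[i];
       }
   }

With the scales from your example below (1.0 for the rotations and
1/1000 for the translations), the translation components of the
gradient therefore end up producing relatively larger steps than the
rotations.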



In the case of the OnePlusOne optimizer, the scale parameters are used
to divide the radius of the region over which random samples will be
drawn for the next generation of the population.


In this case, small scaling values will result in a large radius,
which gives the samples the opportunity to "walk far" from the
current position along that particular direction.


   e.g.: itkOnePlusOneEvolutionaryOptimizer.cxx: line 123
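
A rough sketch of the resulting behavior (again, not the actual ITK
source; the function name is only illustrative):

   #include "itkArray.h"

   void ComputeEffectiveRadii( double searchRadius,
                               const itk::Array<double> & scales,
                               itk::Array<double> & effectiveRadius )
   {
     for( unsigned int i = 0; i < scales.Size(); i++ )
       {
       // A small scale produces a large effective radius, so samples
       // can move farther from the current position along that
       // particular parameter.
       effectiveRadius[i] = searchRadius / scales[i];
       }
   }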


Note that the scaling is used to regulate the radius, so you will
get similar results if you use the following pairs of parameters:


             Radius 1000.0  with Scaling  1000.0
             Radius    1.0  with Scaling     1.0
             Radius 0.0001  with Scaling  0.0001
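
For example, assuming the usual setup of the OnePlusOne optimizer with
a normal variate generator, the configuration sketched below should
behave similarly for any of the pairs above (the seed and the number
of parameters are arbitrary illustrative values):

   #include "itkOnePlusOneEvolutionaryOptimizer.h"
   #include "itkNormalVariateGenerator.h"

   void ConfigureOptimizer( double initialRadius, double scaleValue,
                            unsigned int numberOfParameters )
   {
     typedef itk::OnePlusOneEvolutionaryOptimizer     OptimizerType;
     typedef itk::Statistics::NormalVariateGenerator  GeneratorType;

     OptimizerType::Pointer optimizer = OptimizerType::New();
     GeneratorType::Pointer generator = GeneratorType::New();

     generator->Initialize( 12345 );   // seed for the random sampling
     optimizer->SetNormalVariateGenerator( generator );

     // The scale divides the radius, so e.g.
     // ConfigureOptimizer( 1000.0, 1000.0, 6 ) and
     // ConfigureOptimizer( 1.0, 1.0, 6 ) give similar behavior.
     OptimizerType::ScalesType scales( numberOfParameters );
     scales.Fill( scaleValue );
     optimizer->SetScales( scales );

     optimizer->Initialize( initialRadius );   // initial search radius
   }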


A similar situation happens with the GradientDescent optimizers:
you could compensate for the scaling parameters with changes in the
StepLength (in the regular-step variant) or with changes in the
learning rate (in the standard gradient descent).
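
In other words, these are the knobs involved (a minimal sketch with
illustrative values only, not a recommendation for any particular
dataset):

   #include "itkRegularStepGradientDescentOptimizer.h"
   #include "itkGradientDescentOptimizer.h"

   void ConfigureGradientDescentOptimizers()
   {
     // Regular-step variant: the step length is the knob that can
     // offset a global change in the scales.
     itk::RegularStepGradientDescentOptimizer::Pointer regularStep =
       itk::RegularStepGradientDescentOptimizer::New();
     regularStep->SetMaximumStepLength( 4.00 );
     regularStep->SetMinimumStepLength( 0.01 );

     // Standard variant: the learning rate plays the equivalent role.
     itk::GradientDescentOptimizer::Pointer standardGD =
       itk::GradientDescentOptimizer::New();
     standardGD->SetLearningRate( 0.5 );
   }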


In any of these conditions, it is important, as a sanity check,
to add Command/Observers to the optimizers and to monitor how they
evolve at every iteration.
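
A minimal observer along the lines of the ones used in the Software
Guide examples (a sketch; adapt the optimizer typedef to whichever
optimizer you are actually using):

   #include "itkCommand.h"
   #include "itkRegularStepGradientDescentOptimizer.h"
   #include <iostream>

   class CommandIterationUpdate : public itk::Command
   {
   public:
     typedef CommandIterationUpdate   Self;
     typedef itk::Command             Superclass;
     typedef itk::SmartPointer<Self>  Pointer;
     itkNewMacro( Self );

     typedef itk::RegularStepGradientDescentOptimizer OptimizerType;
     typedef const OptimizerType *                    OptimizerPointer;

     void Execute( itk::Object * caller, const itk::EventObject & event )
       {
       Execute( (const itk::Object *) caller, event );
       }

     void Execute( const itk::Object * object, const itk::EventObject & event )
       {
       OptimizerPointer optimizer = dynamic_cast< OptimizerPointer >( object );
       if( !itk::IterationEvent().CheckEvent( &event ) )
         {
         return;
         }
       // Print the iteration number, metric value and current parameters.
       std::cout << optimizer->GetCurrentIteration() << " = "
                 << optimizer->GetValue() << " : "
                 << optimizer->GetCurrentPosition() << std::endl;
       }

   protected:
     CommandIterationUpdate() {};
   };

You then connect it to the optimizer with something like:

   CommandIterationUpdate::Pointer observer = CommandIterationUpdate::New();
   optimizer->AddObserver( itk::IterationEvent(), observer );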


Please let us know if you find any suspicious behavior in the
optimizers.


    Thanks




      Luis



=======================
SCHMID, Jerome wrote:
> Hi,
> 
> I understand the need for scaling the registration parameters of the
> transformation, but I have a doubt concerning how the scale arguments
> should be passed.
> 
> As suggested by many examples, and by the wiki, one has to pass scale
> parameters that will be *multiplied* with the internal data in order to
> put all the parameters into the same dynamic range. E.g.:
> 
> // Scale the translation components of the Transform in the Optimizer
> OptimizerType::ScalesType scales( transform->GetNumberOfParameters() );
> const double translationScale = 1000.0; // dynamic range of translations
> const double rotationScale = 1.0; // dynamic range of rotations
> scales[0] = 1.0 / rotationScale;
> scales[1] = 1.0 / rotationScale;
> scales[2] = 1.0 / rotationScale;
> scales[3] = 1.0 / translationScale;
> scales[4] = 1.0 / translationScale;
> scales[5] = 1.0 / translationScale;
> 
> This is a typical example for 3D rigid registration.
> 
> But if I have a look at, for instance, the OnePlusOne optimizer or the
> Powell one, the scales are used to *divide*, e.g. from the Powell code:
> 
> for(unsigned int i=0; i<m_SpaceDimension; i++)
>     {
>     m_LineDirection[i] = m_LineDirection[i] / this->GetScales()[i];
>     }
> 
> A great Insight Journal paper on shape-to-image registration
> ("Model-Image Registration of Parametric Shape Models: Fitting a Shell
> to the Cochlea"), which is based on the OnePlusOne optimizer, sets the
> scales to 1000.0 instead of 1/1000.0...
> 
> Is this done on purpose, i.e. do these optimizers require such a
> choice, or is it simply a misunderstanding of how the scales must be
> used, i.e. divided or multiplied?
> 
> Thanks.
> 
> Best Regards,
> 
> Jerome Schmid
> 
> 
> _______________________________________________
> Insight-users mailing list
> Insight-users at itk.org
> http://www.itk.org/mailman/listinfo/insight-users
> 
> 



