[Insight-users] Doubt about scales parameters in registration
Balaji Gandhi
bgandhi at mail.ucf.edu
Wed May 3 10:23:11 EDT 2006
Hi Jerome/Luis,
Can you tell me about the algorithms used for ITK Image Registration?
I guess it is Multi-Modality Image Registration.
Thanks,
Balaji
On 5/2/06, Jerome SCHMID <jeromeschmid at surgery.cuhk.edu.hk> wrote:
>
> Dear Luis,
>
> Thanks for your detailed answers as usual!
>
> Well, as far as I understand, I really have to tweak these scaling
> values for each optimizer... My current code is a test code where, by
> using switches, I try different optimizers to see which one looks most
> adequate for my problem. For this I was hoping that using the same
> scaling values for the rigid parameters would be okay... :-)
>
> Thanks for the advice!
>
>
> Best Regards,
>
> Jerome Schmid
>
> -----Original Message-----
> From: Luis Ibanez [mailto:luis.ibanez at kitware.com]
> Sent: Tuesday, May 02, 2006 10:41 PM
> To: SCHMID, Jerome
> Cc: insight-users
> Subject: Re: [Insight-users] Doubt about scales parameters in
> registration
>
>
> Hi Jerome,
>
>
> The way the parameter scaling is used depends on the strategy of
> each specific optimizer.
>
>
> In all cases, however, the goal is to make the dynamic range of the
> dimensions of the parametric space uniform.
>
>
>
> You will see that in the case of the GradientDescent optimizers,
> the scale parameters are used for "dividing" the Gradient values,
>
>
> e.g.: itkRegularStepGradientDescentBaseOptimizer.cxx, line 205
>
>
> which results in shorter steps being taken along the directions of
> parameters that have high scaling values, and longer steps along the
> parameters that have low scaling values. In the typical case of a
> rigid transform, this means that you want to put high scaling values
> on the rotation parameters and low values on the translation
> parameters.
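>
> For example, here is a minimal sketch of how those values are wired
> into a RegularStepGradientDescentOptimizer for a 3D rigid transform
> (the numbers are only placeholders, and "transform" is assumed to be
> your 6-parameter rigid transform from the registration setup):
>
>    #include "itkRegularStepGradientDescentOptimizer.h"
>
>    typedef itk::RegularStepGradientDescentOptimizer OptimizerType;
>    OptimizerType::Pointer optimizer = OptimizerType::New();
>
>    OptimizerType::ScalesType scales( transform->GetNumberOfParameters() );
>    scales[0] = 1.0;    // rotation parameters: high scale -> shorter steps
>    scales[1] = 1.0;
>    scales[2] = 1.0;
>    scales[3] = 0.001;  // translation parameters: low scale -> longer steps
>    scales[4] = 0.001;
>    scales[5] = 0.001;
>    optimizer->SetScales( scales );
>
>    optimizer->SetMaximumStepLength( 1.0 );
>    optimizer->SetMinimumStepLength( 0.01 );
>    optimizer->SetNumberOfIterations( 200 );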
>
>
>
> In the case of the OnePlusOne optimizer, the scale parameters are used
> for dividing the Radius of the area over which random samples will be
> thrown for the next generation of the population.
>
>
> In this case, small scaling-parameters will result in a large radius,
> which gives the samples the opportunity to "walk far" from the
> current position along that particular direction.
>
>
> e.g.: itkOnePlusOneEvolutionaryOptimizer.cxx, line 123
>
>
> Note that the scaling is used to regulate the radius, so you will
> get similar results if you use the following pairs of parameters
> (a small setup sketch follows the list):
>
>
> Radius 1000.0 with Scaling 1000.0
> Radius 1.0 with Scaling 1.0
> Radius 0.0001 with Scaling 0.0001
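>
> As a rough sketch of that setup, using the OnePlusOneEvolutionary
> optimizer together with the NormalVariateGenerator (the seed, radius,
> epsilon and iteration numbers are only placeholders, and "transform"
> is assumed to be your transform object):
>
>    #include "itkOnePlusOneEvolutionaryOptimizer.h"
>    #include "itkNormalVariateGenerator.h"
>
>    typedef itk::OnePlusOneEvolutionaryOptimizer    OptimizerType;
>    typedef itk::Statistics::NormalVariateGenerator GeneratorType;
>
>    OptimizerType::Pointer optimizer = OptimizerType::New();
>    GeneratorType::Pointer generator = GeneratorType::New();
>    generator->Initialize( 12345 );          // seed of the random sampler
>
>    optimizer->SetNormalVariateGenerator( generator );
>    optimizer->Initialize( 1.0 );            // initial search Radius
>    optimizer->SetEpsilon( 1e-4 );
>    optimizer->SetMaximumIteration( 4000 );
>
>    OptimizerType::ScalesType scales( transform->GetNumberOfParameters() );
>    scales.Fill( 1.0 );                      // Radius 1.0 with Scaling 1.0
>    optimizer->SetScales( scales );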
>
>
> A similar situation happens with the GradientDescent optimizers:
> you could compensate for the scaling-parameters with changes in the
> StepLength (in the regular step variant) or with changes in the
> learning rate (in the standard gradient descent).
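>
> For the standard gradient descent, for instance, the update of each
> parameter is essentially learningRate * gradient[i] / scales[i], so
> the two configurations below (illustrative numbers only; "scales1000"
> stands for all entries of "scales" multiplied by 1000) end up taking
> steps of the same size:
>
>    // scales around 1.0 with a small learning rate
>    optimizer->SetScales( scales );
>    optimizer->SetLearningRate( 0.01 );
>
>    // scales 1000 times larger, compensated by the learning rate
>    optimizer->SetScales( scales1000 );
>    optimizer->SetLearningRate( 10.0 );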
>
>
> In any of these conditions, it is important, as a sanity check,
> to add Command/Observers to the optimizers and to monitor how they
> evolve at every iteration.
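>
> A minimal observer sketch (CommandIterationUpdate is just a name used
> for this illustration, and the RegularStepGradientDescent optimizer is
> taken as an example; any optimizer that invokes IterationEvent can be
> monitored the same way):
>
>    #include "itkCommand.h"
>    #include "itkRegularStepGradientDescentOptimizer.h"
>
>    class CommandIterationUpdate : public itk::Command
>    {
>    public:
>      typedef CommandIterationUpdate  Self;
>      typedef itk::Command            Superclass;
>      typedef itk::SmartPointer<Self> Pointer;
>      itkNewMacro( Self );
>
>      typedef itk::RegularStepGradientDescentOptimizer OptimizerType;
>
>      void Execute( itk::Object *caller, const itk::EventObject & event )
>        {
>        Execute( (const itk::Object *) caller, event );
>        }
>
>      void Execute( const itk::Object *object, const itk::EventObject & event )
>        {
>        const OptimizerType * optimizer =
>                        dynamic_cast< const OptimizerType * >( object );
>        if( !itk::IterationEvent().CheckEvent( &event ) || !optimizer )
>          {
>          return;
>          }
>        // print iteration number, metric value and current parameters
>        std::cout << optimizer->GetCurrentIteration() << " = "
>                  << optimizer->GetValue() << " : "
>                  << optimizer->GetCurrentPosition() << std::endl;
>        }
>
>    protected:
>      CommandIterationUpdate() {};
>    };
>
>    // attach it to the optimizer:
>    CommandIterationUpdate::Pointer observer = CommandIterationUpdate::New();
>    optimizer->AddObserver( itk::IterationEvent(), observer );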
>
>
> Please let us know if you find any suspicious behavior in the
> optimizers.
>
>
> Thanks
>
>
>
>
> Luis
>
>
>
> =======================
> SCHMID, Jerome wrote:
> > Hi,
> >
> > I correctly understand the need for scaling the registration
> > parameters of the transformation, but I have a doubt concerning how
> > the arguments should be passed.
> >
> > As suggested by many examples and the wiki, one has to pass a scale
> > parameter that will be *multiplied* with the internal data in order
> > to put all the parameters into the same dynamic range. E.g.:
> >
> > // Scale the translation components of the Transform in the Optimizer
> > OptimizerType::ScalesType scales( transform->GetNumberOfParameters() );
> > const double translationScale = 1000.0; // dynamic range of translations
> > const double rotationScale = 1.0; // dynamic range of rotations
> > scales[0] = 1.0 / rotationScale;
> > scales[1] = 1.0 / rotationScale;
> > scales[2] = 1.0 / rotationScale;
> > scales[3] = 1.0 / translationScale;
> > scales[4] = 1.0 / translationScale;
> > scales[5] = 1.0 / translationScale;
> >
> > This is a typical example for a 3D rigid registration.
> >
> > But if I look at, for instance, the OnePlusOne optimizer or the
> > Powell one, the scales are *divided*; e.g. from the Powell code:
> >
> > for(unsigned int i=0; i<m_SpaceDimension; i++)
> > {
> > m_LineDirection[i] = m_LineDirection[i] / this->GetScales()[i];
> > }
> >
> > A great Insight Journal paper on shape-to-image registration
> > ( "Model-Image Registration of Parametric Shape Models: Fitting a
> > Shell to the Cochlea" ), based on the OnePlusOne optimizer, sets the
> > scales to 1000.0 instead of 1/1000.0...
> >
> > Is this done on purpose, i.e. do these optimizers require such a
> > choice, or is it simply a misunderstanding of how the scales must be
> > used, i.e. divided versus multiplied?
> >
> > Thanks.
> >
> > Best Regards,
> >
> > Jerome Schmid
> >
> >
>
>
>
>
> _______________________________________________
> Insight-users mailing list
> Insight-users at itk.org
> http://www.itk.org/mailman/listinfo/insight-users
>
--
===============================
Balaji Gandhi
Research Associate
Biomolecular Science Center
University of Central Florida
Phone: 407-823-3387
Fax: 407-823-0956
Project: imgem.ucf.edu
===============================