[Insight-users] Parameter scales for registration (second try)
Nick Tustison
ntustison at gmail.com
Tue May 7 09:27:02 EDT 2013
Hi Brad,
I certainly don't disagree with Joel's findings. It seems like a
good fix which should be put up on Gerrit. Several components
were carried over when we upgraded the registration framework;
the optimizers weren't among them.

Also, could you elaborate a bit more on the "convoluted" aspects
of parameter advancement? There's probably a reason for it, and
we could explain why.
Nick
On May 7, 2013, at 8:58 AM, Bradley Lowekamp <blowekamp at mail.nih.gov> wrote:
> Nick,
>
> What we are observing is an algorithmic bug in the RegularStepGradientDescentOptimizer. The ITKv4 optimizers have quite a convoluted way of advancing the parameters, and likely don't contain this bug.
>
>
> I think the figure Joel put together does a good job of illustrating the issue:
>
> http://i.imgur.com/DE6xqQ5.png
>
>
> I think the math here:
> https://github.com/Kitware/ITK/blob/master/Modules/Numerics/Optimizers/src/itkRegularStepGradientDescentOptimizer.cxx#L44
>
> newPosition[j] = currentPosition[j] + transformedGradient[j] * factor;
>
> should be:
>
> newPosition[j] = currentPosition[j] + transformedGradient[j] * factor / scales[j];
>
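> For what it's worth, my reading is that transformedGradient has already been divided by the scales once when AdvanceOneStep() builds it, which gives the step direction in the scaled space; mapping that step back to raw parameter units requires dividing by the scales a second time. A stand-alone sketch of the corrected update (hypothetical names, not the actual ITK source):
>
> #include <cmath>
> #include <cstddef>
> #include <vector>
>
> // Hypothetical free-function version of the fixed advance step.
> void AdvanceOneStepScaled( const std::vector<double> & currentPosition,
>                            const std::vector<double> & transformedGradient,
>                            const std::vector<double> & scales,
>                            double stepLength,
>                            std::vector<double> & newPosition )
> {
>   // factor = step length / gradient magnitude, as in AdvanceOneStep()
>   double magnitude = 0.0;
>   for ( std::size_t j = 0; j < transformedGradient.size(); ++j )
>     {
>     magnitude += transformedGradient[j] * transformedGradient[j];
>     }
>   const double factor = stepLength / std::sqrt( magnitude );
>
>   newPosition.resize( currentPosition.size() );
>   for ( std::size_t j = 0; j < currentPosition.size(); ++j )
>     {
>     // The proposed fix: divide by scales[j] so a large scale damps the
>     // parameter's motion, not just its contribution to the direction.
>     newPosition[j] = currentPosition[j]
>                      + transformedGradient[j] * factor / scales[j];
>     }
> }
>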
> Brad
>
>
> On May 7, 2013, at 8:07 AM, Nick Tustison <ntustison at gmail.com> wrote:
>
>> Not quite. See below for a relevant block of code.
>> The optimizer can take an optional scales estimator.
>>
>>
>> // Estimate per-parameter scales from the physical-space shift that a
>> // perturbation of each transform parameter produces.
>> typedef itk::RegistrationParameterScalesFromPhysicalShift<MetricType> ScalesEstimatorType;
>> typename ScalesEstimatorType::Pointer scalesEstimator = ScalesEstimatorType::New();
>> scalesEstimator->SetMetric( singleMetric );
>> scalesEstimator->SetTransformForward( true );  // scales for the forward (moving) transform
>>
>> // Conjugate gradient optimizer whose line search brackets the step
>> // multiplier in [0, 2] and stops within a tolerance of 0.2.
>> typedef itk::ConjugateGradientLineSearchOptimizerv4 ConjugateGradientDescentOptimizerType;
>> typename ConjugateGradientDescentOptimizerType::Pointer optimizer = ConjugateGradientDescentOptimizerType::New();
>> optimizer->SetLowerLimit( 0 );
>> optimizer->SetUpperLimit( 2 );
>> optimizer->SetEpsilon( 0.2 );
>> // optimizer->SetMaximumLineSearchIterations( 20 );
>> optimizer->SetLearningRate( learningRate );
>> optimizer->SetMaximumStepSizeInPhysicalUnits( learningRate );  // cap step size in physical space
>> optimizer->SetNumberOfIterations( currentStageIterations[0] );
>> // The estimator above supplies the per-parameter scales to the optimizer.
>> optimizer->SetScalesEstimator( scalesEstimator );
>> optimizer->SetMinimumConvergenceValue( convergenceThreshold );
>> optimizer->SetConvergenceWindowSize( convergenceWindowSize );
>> // Either re-estimate the learning rate every iteration, or once up front.
>> optimizer->SetDoEstimateLearningRateAtEachIteration( this->m_DoEstimateLearningRateAtEachIteration );
>> optimizer->SetDoEstimateLearningRateOnce( !this->m_DoEstimateLearningRateAtEachIteration );
>>
>> On May 7, 2013, at 8:01 AM, Joël Schaerer <joel.schaerer at gmail.com> wrote:
>>
>>> Hi Nick,
>>>
>>> I did indeed have a look at these new classes (not a very thorough one, I must confess). However if I understand correctly they allow estimating the parameter scales, but don't change the way the scales are used by the optimizer?
>>>
>>> joel
>>>
>>> On 07/05/2013 13:52, Nick Tustison wrote:
>>>> Hi Brad,
>>>>
>>>> Have you seen the work we did with the class
>>>>
>>>> http://www.itk.org/Doxygen/html/classitk_1_1RegistrationParameterScalesEstimator.html
>>>>
>>>> and its derived classes for the v4 framework? They describe
>>>> a couple different approaches to scaling the gradient for use
>>>> with the v4 optimizers.
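>>>>
>>>> For instance, RegistrationParameterScalesFromPhysicalShift,
>>>> RegistrationParameterScalesFromIndexShift, and
>>>> RegistrationParameterScalesFromJacobian each estimate the scales
>>>> from a different criterion. A minimal sketch of plugging one into a
>>>> v4 optimizer (here "metric" and MetricType stand for any configured
>>>> ImageToImageMetricv4 instance; they aren't from the code above):
>>>>
>>>> typedef itk::RegistrationParameterScalesFromJacobian<MetricType> EstimatorType;
>>>> EstimatorType::Pointer estimator = EstimatorType::New();
>>>> estimator->SetMetric( metric );
>>>>
>>>> itk::GradientDescentOptimizerv4::Pointer optimizer = itk::GradientDescentOptimizerv4::New();
>>>> optimizer->SetMetric( metric );
>>>> optimizer->SetScalesEstimator( estimator );  // scales come from the estimator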
>>>>
>>>> Nick
>>>>
>>>>
>>>> On May 7, 2013, at 6:59 AM, Bradley Lowekamp <blowekamp at mail.nih.gov> wrote:
>>>>
>>>>> Hello Joel,
>>>>>
>>>>> I have encountered the same issue. I ended up creating my own "ScaledRegularStepGradientDescentOptimizer" derived from the ITK one. Please find it attached. Please note, I don't think I have migrated this code to ITKv4... but I'm not certain.
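>>>>>
>>>>> The gist is roughly the following (a hypothetical sketch from memory, not the attached header itself). RegularStepGradientDescentOptimizer exposes a virtual StepAlongGradient() that a subclass can override to re-apply the scales:
>>>>>
>>>>> #include "itkRegularStepGradientDescentOptimizer.h"
>>>>>
>>>>> class ScaledRegularStepGradientDescentOptimizer
>>>>>   : public itk::RegularStepGradientDescentOptimizer
>>>>> {
>>>>> public:
>>>>>   typedef ScaledRegularStepGradientDescentOptimizer Self;
>>>>>   typedef itk::RegularStepGradientDescentOptimizer  Superclass;
>>>>>   typedef itk::SmartPointer<Self>                   Pointer;
>>>>>   itkNewMacro( Self );
>>>>>
>>>>> protected:
>>>>>   // Re-apply the scales when stepping, so they damp the parameter
>>>>>   // step itself rather than only the step direction.
>>>>>   virtual void StepAlongGradient( double factor,
>>>>>                                   const DerivativeType & transformedGradient )
>>>>>   {
>>>>>     const ScalesType &     scales = this->GetScales();
>>>>>     const ParametersType & currentPosition = this->GetCurrentPosition();
>>>>>     ParametersType         newPosition( transformedGradient.GetSize() );
>>>>>     for ( unsigned int j = 0; j < newPosition.GetSize(); ++j )
>>>>>       {
>>>>>       newPosition[j] = currentPosition[j]
>>>>>                        + transformedGradient[j] * factor / scales[j];
>>>>>       }
>>>>>     this->SetCurrentPosition( newPosition );
>>>>>   }
>>>>> };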
>>>>>
>>>>> I reported this issue to the ITKv4 registration team, but I am not sure what happened to it.
>>>>>
>>>>> I also tried making the change in ITK a while ago, and a large number of the registration tests failed... I'm not sure whether the results were better or worse; they were just different.
>>>>>
>>>>> Brad
>>>>>
>>>>> <itkScaledRegularStepGradientDescentOptimizer.h>
>>>>>
>>>>> On Apr 25, 2013, at 11:10 AM, Joël Schaerer <joel.schaerer at gmail.com> wrote:
>>>>>
>>>>>
>>>
>>
>