[Insight-users] LevenbergMarquardt Optimizer & Derivatives

Luis Ibanez luis.ibanez@kitware.com
Sun, 22 Sep 2002 15:56:46 -0400


Hi Mark,


The modifications to the LevenbergMarquardt
optimizer that you suggested have been made.

Here is a summary of the changes:


1) In  itkMultipleValuedVnlCostFunctionAdaptor

   the following methods were added:

   void SetUseGradient(bool);
   void UseGradientOn();
   void UseGradientOff();
   bool GetUseGradient();

   They map to the use_gradient member variable
   of the vnl_least_squares_function class.



2) In the itkLevenbergMarquardtOptimizer

    the following methods were added:

   void SetUseCostFunctionGradient(bool);
   void UseCostFunctionGradientOn();
   void UseCostFunctionGradientOff();
   bool GetUseCostFunctionGradient();


  These methods provide access, through the optimizer, to the
  adaptor methods listed in (1). A short sketch of this forwarding
  appears below.
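
This is only an illustration of the intent, not the actual ITK
source: the stand-in base class below plays the role of
vnl_least_squares_function (which keeps the flag in a protected
member named use_gradient_), and the class and member names are
invented for the example.

   // Stand-in for vnl_least_squares_function; in VXL the flag
   // lives in a protected member called use_gradient_.
   class VnlLeastSquaresFunctionStandIn
   {
   public:
     VnlLeastSquaresFunctionStandIn() : use_gradient_( true ) {}
   protected:
     bool use_gradient_;
   };

   // Plays the role of itkMultipleValuedVnlCostFunctionAdaptor (1):
   // the new accessors simply read/write the vnl flag.
   class CostFunctionAdaptorSketch : public VnlLeastSquaresFunctionStandIn
   {
   public:
     void SetUseGradient( bool flag ) { use_gradient_ = flag; }
     void UseGradientOn()             { this->SetUseGradient( true ); }
     void UseGradientOff()            { this->SetUseGradient( false ); }
     bool GetUseGradient() const      { return use_gradient_; }
   };

   // Plays the role of itkLevenbergMarquardtOptimizer (2):
   // it just delegates to the adaptor it holds.
   class OptimizerSketch
   {
   public:
     void SetUseCostFunctionGradient( bool flag )
       { m_Adaptor.SetUseGradient( flag ); }
     void UseCostFunctionGradientOn()
       { this->SetUseCostFunctionGradient( true ); }
     void UseCostFunctionGradientOff()
       { this->SetUseCostFunctionGradient( false ); }
     bool GetUseCostFunctionGradient() const
       { return m_Adaptor.GetUseGradient(); }

   private:
     CostFunctionAdaptorSketch m_Adaptor;
   };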


The usage is illustrated in the file:

Insight/Testing/Code/Numerics/itkLevenbergMarquardtOptimizerTest.cxx

Basically:

   Optimizer->SetCostFunction( costFunction.GetPointer() );

   and the use of the gradient can then be enabled by either of
   the following calls:

   costFunction->SetUseGradient( true );

   Optimizer->SetUseCostFunctionGradient( true );

The second call requires the CostFunction to have already been
plugged into the optimizer via SetCostFunction().
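
For completeness, a condensed sketch of that calling sequence is
shown below. It follows the pattern of the test, but MyCostFunction
is a placeholder for a user-defined itk::MultipleValuedCostFunction
(one that also exposes the SetUseGradient() call shown above), and
the initialization values are illustrative, not the exact code of
the test file:

   #include "itkLevenbergMarquardtOptimizer.h"

   typedef itk::LevenbergMarquardtOptimizer  OptimizerType;

   MyCostFunction::Pointer costFunction = MyCostFunction::New();
   OptimizerType::Pointer  optimizer    = OptimizerType::New();

   // The cost function must be plugged in before the gradient
   // flag can be set on the optimizer.
   optimizer->SetCostFunction( costFunction.GetPointer() );

   // Either of these enables the analytical gradient:
   costFunction->SetUseGradient( true );
   optimizer->SetUseCostFunctionGradient( true );

   // Illustrative starting position, then run the optimization.
   OptimizerType::ParametersType initialPosition(
     costFunction->GetNumberOfParameters() );
   initialPosition.Fill( 0.0 );
   optimizer->SetInitialPosition( initialPosition );

   optimizer->StartOptimization();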



Please let us know if you find any problem
with the current implementation.


   Thanks


    Luis