ITK  4.1.0
Insight Segmentation and Registration Toolkit
itk::QuasiNewtonOptimizerv4 Class Reference

#include <itkQuasiNewtonOptimizerv4.h>



Public Types

typedef SmartPointer< const Self > ConstPointer
typedef std::vector< HessianType > HessianArrayType
typedef itk::Array2D< InternalComputationValueType > HessianType
typedef Superclass::InternalComputationValueType InternalComputationValueType
typedef SmartPointer< Self > Pointer
typedef QuasiNewtonOptimizerv4 Self
typedef GradientDescentOptimizerv4 Superclass

Public Member Functions

virtual ::itk::LightObject::Pointer CreateAnother (void) const
virtual const char * GetNameOfClass () const
virtual const DerivativeType & GetNewtonStep ()
virtual void SetMaximumIterationsWithoutProgress (SizeValueType _arg)
virtual void SetMaximumNewtonStepSizeInPhysicalUnits (InternalComputationValueType _arg)
virtual void StartOptimization ()

Static Public Member Functions

static Pointer New ()

Protected Member Functions

virtual void AdvanceOneStep (void)
void CombineGradientNewtonStep (void)
virtual bool ComputeHessianAndStepWithBFGS (IndexValueType location)
virtual void EstimateNewtonStep ()
virtual void EstimateNewtonStepOverSubRange (const IndexRangeType &subrange)
void ModifyCombinedNewtonStep ()
virtual void PrintSelf (std::ostream &os, Indent indent) const
 QuasiNewtonOptimizerv4 ()
virtual void ResetNewtonStep (IndexValueType location)
virtual ~QuasiNewtonOptimizerv4 ()

Protected Attributes

SizeValueType m_BestIteration
ParametersType m_BestPosition
MeasureType m_BestValue
ParametersType m_CurrentPosition
HessianArrayType m_HessianArray
SizeValueType m_MaximumIterationsWithoutProgress
InternalComputationValueType m_MaximumNewtonStepSizeInPhysicalUnits
DerivativeType m_NewtonStep
std::vector< bool > m_NewtonStepValidFlags
std::string m_NewtonStepWarning
ParametersType m_OptimalStep
DerivativeType m_PreviousGradient
ParametersType m_PreviousPosition
MeasureType m_PreviousValue

Private Member Functions

void operator= (const Self &)
 QuasiNewtonOptimizerv4 (const Self &)

Private Attributes

QuasiNewtonOptimizerv4EstimateNewtonStepThreader::Pointer m_EstimateNewtonStepThreader

Friends

class QuasiNewtonOptimizerv4EstimateNewtonStepThreader

Detailed Description

Implement a Quasi-Newton optimizer with BFGS Hessian estimation.

A second-order approximation of the cost function is usually more efficient, since it estimates the descent or ascent direction more precisely. However, computing the Hessian is usually expensive or unavailable. Alternatively, Quasi-Newton methods can estimate the Hessian from the gradients of previous steps. Here a specific Quasi-Newton method, BFGS, is used to compute the Quasi-Newton steps.

The Quasi-Newton method does not always produce a valid step, e.g., when the metric function is not locally convex. In that scenario, a properly scaled gradient step is used instead.
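The validity check and gradient fallback can be sketched as follows. This is an illustrative, self-contained sketch under the assumption that a Newton step is rejected when it is non-finite or not a descent direction; the function and parameter names are hypothetical and do not match ITK's internals:

```cpp
#include <cmath>
#include <vector>

// Hypothetical sketch: return the Newton step when it is a finite descent
// direction (negative inner product with the gradient, for minimization);
// otherwise fall back to a scaled gradient-descent step.
std::vector<double> ChooseStep(const std::vector<double> &newtonStep,
                               const std::vector<double> &gradient,
                               double gradientScale)
{
  double dot = 0.0;
  bool finite = true;
  for (std::size_t i = 0; i < newtonStep.size(); ++i)
  {
    if (!std::isfinite(newtonStep[i])) { finite = false; }
    dot += newtonStep[i] * gradient[i];
  }
  if (finite && dot < 0.0)  // valid descent step: use it as-is
  {
    return newtonStep;
  }
  // Invalid Newton step: use the negated, scaled gradient instead.
  std::vector<double> step(gradient.size());
  for (std::size_t i = 0; i < gradient.size(); ++i)
  {
    step[i] = -gradientScale * gradient[i];
  }
  return step;
}
```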

A helper member object, m_ScalesEstimator, may be set to estimate parameter scales and step scales. A step scale measures the magnitude of a step and is used for learning rate computation.

When m_ScalesEstimator is set, SetMaximumNewtonStepSizeInPhysicalUnits() may be called to set the maximum step size. If it is not called, m_MaximumNewtonStepSizeInPhysicalUnits defaults to lambda * OptimizerParameterScalesEstimator::EstimateMaximumStepSize(), where lambda is in [1,5].

When m_ScalesEstimator is not set, the parameter scales and learning rates default to one, or can be set manually by the user.

Definition at line 60 of file itkQuasiNewtonOptimizerv4.h.


Member Typedef Documentation

Reimplemented from itk::GradientDescentOptimizerv4.

Definition at line 68 of file itkQuasiNewtonOptimizerv4.h.

Type for an array of Hessian matrices, for local support.

Definition at line 82 of file itkQuasiNewtonOptimizerv4.h.

Type for the Hessian matrix in the Quasi-Newton method.

Definition at line 79 of file itkQuasiNewtonOptimizerv4.h.

Internal computation type, for maintaining a desired precision

Reimplemented from itk::GradientDescentOptimizerv4.

Definition at line 74 of file itkQuasiNewtonOptimizerv4.h.

Reimplemented from itk::GradientDescentOptimizerv4.

Definition at line 67 of file itkQuasiNewtonOptimizerv4.h.

Standard class typedefs.

Reimplemented from itk::GradientDescentOptimizerv4.

Definition at line 65 of file itkQuasiNewtonOptimizerv4.h.

Reimplemented from itk::GradientDescentOptimizerv4.

Definition at line 66 of file itkQuasiNewtonOptimizerv4.h.


Constructor & Destructor Documentation

virtual itk::QuasiNewtonOptimizerv4::~QuasiNewtonOptimizerv4 ( ) [protected, virtual]

Member Function Documentation

virtual void itk::QuasiNewtonOptimizerv4::AdvanceOneStep ( void  ) [protected, virtual]

Advance one step using the Quasi-Newton step. When the Newton step is invalid, the gradient step will be used.

Reimplemented from itk::GradientDescentOptimizerv4.

void itk::QuasiNewtonOptimizerv4::CombineGradientNewtonStep ( void ) [protected]

Combine a gradient step with a Newton step. The Newton step will be used when it is valid. Otherwise the gradient step will be used.

virtual bool itk::QuasiNewtonOptimizerv4::ComputeHessianAndStepWithBFGS ( IndexValueType  location) [protected, virtual]

Estimate the next Hessian and step with BFGS method. The details of the method are described at http://en.wikipedia.org/wiki/BFGS_method .
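The BFGS update of the direct Hessian estimate described at that link can be sketched as follows. This is a self-contained illustration, not ITK's implementation; the names `BfgsUpdate`, `B`, `s`, and `y` are hypothetical, with s the parameter change and y the gradient change between steps:

```cpp
#include <cmath>
#include <vector>

using Matrix = std::vector<std::vector<double>>;
using Vector = std::vector<double>;

// Hypothetical sketch of the BFGS update of the (direct) Hessian estimate:
//   B_{k+1} = B_k - (B_k s s^T B_k) / (s^T B_k s) + (y y^T) / (y^T s)
// where s = x_{k+1} - x_k and y = grad_{k+1} - grad_k. The updated matrix
// satisfies the secant condition B_{k+1} s = y.
Matrix BfgsUpdate(const Matrix &B, const Vector &s, const Vector &y)
{
  const std::size_t n = s.size();

  // Bs = B * s
  Vector Bs(n, 0.0);
  for (std::size_t i = 0; i < n; ++i)
    for (std::size_t j = 0; j < n; ++j)
      Bs[i] += B[i][j] * s[j];

  // sBs = s^T B s, ys = y^T s (curvature condition requires ys > 0)
  double sBs = 0.0, ys = 0.0;
  for (std::size_t i = 0; i < n; ++i)
  {
    sBs += s[i] * Bs[i];
    ys  += y[i] * s[i];
  }

  Matrix Bnext = B;
  for (std::size_t i = 0; i < n; ++i)
    for (std::size_t j = 0; j < n; ++j)
      Bnext[i][j] += y[i] * y[j] / ys - Bs[i] * Bs[j] / sBs;
  return Bnext;
}
```

A practical implementation would first check the curvature condition y^T s > 0 (and reset the Hessian otherwise), which is one reason the class keeps validity flags and a ResetNewtonStep() method.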

virtual ::itk::LightObject::Pointer itk::QuasiNewtonOptimizerv4::CreateAnother ( void  ) const [virtual]

Create an object from an instance, potentially deferring to a factory. This method allows you to create an instance of an object that is exactly the same type as the referring object. This is useful in cases where an object has been cast back to a base class.

Reimplemented from itk::GradientDescentOptimizerv4.

virtual void itk::QuasiNewtonOptimizerv4::EstimateNewtonStep ( ) [protected, virtual]

Estimate a Newton step

virtual void itk::QuasiNewtonOptimizerv4::EstimateNewtonStepOverSubRange ( const IndexRangeType &  subrange) [protected, virtual]

Estimate the Quasi-Newton step over a given index range.

virtual const char* itk::QuasiNewtonOptimizerv4::GetNameOfClass ( ) const [virtual]

Run-time type information (and related methods).

Reimplemented from itk::GradientDescentOptimizerv4.

virtual const DerivativeType & itk::QuasiNewtonOptimizerv4::GetNewtonStep ( ) [virtual]

Get the most recent Newton step.

void itk::QuasiNewtonOptimizerv4::ModifyCombinedNewtonStep ( ) [protected]

Estimate and apply the learning rate(s) for a combined Newton step. A combined Newton step uses the Newton step by default and the gradient step when the Newton step is not valid.

The learning rate is less than 1.0 and is restricted by m_MaximumNewtonStepSizeInPhysicalUnits.
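This restriction can be sketched as follows. The sketch is illustrative and self-contained; the function name and the degenerate-scale handling are assumptions, not ITK's actual code, which obtains the step scale from m_ScalesEstimator->EstimateStepScale():

```cpp
#include <algorithm>

// Hypothetical sketch: cap the learning rate at 1.0 (a full Newton step) and
// restrict it so the physical step size never exceeds the configured maximum.
double ComputeLearningRate(double maxStepSizeInPhysicalUnits, double stepScale)
{
  if (stepScale <= 0.0) { return 1.0; }  // degenerate scale: take a full step
  return std::min(1.0, maxStepSizeInPhysicalUnits / stepScale);
}
```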

static Pointer itk::QuasiNewtonOptimizerv4::New ( ) [static]

Method for creation through the object factory.

Reimplemented from itk::GradientDescentOptimizerv4.

void itk::QuasiNewtonOptimizerv4::operator= ( const Self &  ) [private]


Reimplemented from itk::GradientDescentOptimizerv4.

virtual void itk::QuasiNewtonOptimizerv4::PrintSelf ( std::ostream &  os,
Indent  indent 
) const [protected, virtual]

Methods invoked by Print() to print information about the object including superclasses. Typically not called by the user (use Print() instead) but used in the hierarchical print process to combine the output of several classes.

Reimplemented from itk::GradientDescentOptimizerv4.

virtual void itk::QuasiNewtonOptimizerv4::ResetNewtonStep ( IndexValueType  location) [protected, virtual]

Reset the Hessian to identity matrix and the Newton step to zeros.

virtual void itk::QuasiNewtonOptimizerv4::SetMaximumIterationsWithoutProgress ( SizeValueType  _arg) [virtual]

Set the maximum tolerable number of iterations without any progress.

virtual void itk::QuasiNewtonOptimizerv4::SetMaximumNewtonStepSizeInPhysicalUnits ( InternalComputationValueType  _arg) [virtual]

Set the maximum step size.

When SetScalesEstimator is called by the user, the optimizer will compute learning rates as m_MaximumNewtonStepSizeInPhysicalUnits / m_ScalesEstimator->EstimateStepScale(newtonStep).

If SetMaximumNewtonStepSizeInPhysicalUnits is not called by the user, m_MaximumNewtonStepSizeInPhysicalUnits defaults to lambda * m_ScalesEstimator->EstimateMaximumStepSize(),

where EstimateMaximumStepSize returns one voxel spacing and lambda may be in [1,5] according to our experience.

virtual void itk::QuasiNewtonOptimizerv4::StartOptimization ( ) [virtual]

Start and run the optimization.

Reimplemented from itk::GradientDescentOptimizerv4.


Friends And Related Function Documentation

Definition at line 187 of file itkQuasiNewtonOptimizerv4.h.


Member Data Documentation

Definition at line 126 of file itkQuasiNewtonOptimizerv4.h.

Definition at line 125 of file itkQuasiNewtonOptimizerv4.h.

The best value so far and relevant information

Definition at line 124 of file itkQuasiNewtonOptimizerv4.h.

The information about the current step

Definition at line 115 of file itkQuasiNewtonOptimizerv4.h.

Threader for Newton step estimation.

Definition at line 194 of file itkQuasiNewtonOptimizerv4.h.

The Hessian with local support

Definition at line 138 of file itkQuasiNewtonOptimizerv4.h.

The maximum tolerable number of iterations without any progress.

Definition at line 107 of file itkQuasiNewtonOptimizerv4.h.

The maximum Quasi-Newton step size to restrict learning rates.

Definition at line 135 of file itkQuasiNewtonOptimizerv4.h.

The Quasi-Newton step

Definition at line 129 of file itkQuasiNewtonOptimizerv4.h.

Valid flag for the Quasi-Newton steps

Definition at line 141 of file itkQuasiNewtonOptimizerv4.h.

Warning message during Quasi-Newton step estimation

Definition at line 132 of file itkQuasiNewtonOptimizerv4.h.

Definition at line 116 of file itkQuasiNewtonOptimizerv4.h.

Definition at line 121 of file itkQuasiNewtonOptimizerv4.h.

Definition at line 120 of file itkQuasiNewtonOptimizerv4.h.

The information about the previous step

Definition at line 119 of file itkQuasiNewtonOptimizerv4.h.


The documentation for this class was generated from the following file:

itkQuasiNewtonOptimizerv4.h