ITK 4.2.0
Insight Segmentation and Registration Toolkit
itk::QuasiNewtonOptimizerv4 Class Reference
#include <itkQuasiNewtonOptimizerv4.h>
Static Public Member Functions
  static Pointer New ()

Private Member Functions
  void operator= (const Self &)
  QuasiNewtonOptimizerv4 (const Self &)

Private Attributes
  QuasiNewtonOptimizerv4EstimateNewtonStepThreader::Pointer m_EstimateNewtonStepThreader

Friends
  class QuasiNewtonOptimizerv4EstimateNewtonStepThreader
Implement a Quasi-Newton optimizer with BFGS Hessian estimation.
A second-order approximation of the cost function is usually more efficient, since it estimates the descent or ascent direction more precisely. However, computing the Hessian is usually expensive, or the Hessian may not be available at all. Quasi-Newton methods can instead estimate a Hessian from the gradients of previous steps. Here a specific Quasi-Newton method, BFGS, is used to compute the Quasi-Newton steps.
The Quasi-Newton method does not always produce a valid step, for example when the metric function is not locally convex. In that case, the gradient step is used instead, after being scaled appropriately.
A helper member object, m_ScalesEstimator, may be set to estimate parameter scales and step scales. A step scale measures the magnitude of a step and is used for learning rate computation.
When m_ScalesEstimator is set, SetMaximumNewtonStepSizeInPhysicalUnits() may be called to set the maximum step size. If it is not called, m_MaximumNewtonStepSizeInPhysicalUnits defaults to lambda * OptimizerParameterScalesEstimator::EstimateMaximumStepSize(), where lambda is in [1,5].
When m_ScalesEstimator is not set, the parameter scales and learning rates default to one, or they can be set manually by the user.
Definition at line 60 of file itkQuasiNewtonOptimizerv4.h.
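A minimal usage sketch, under the assumption that a fully configured v4 metric is available, is shown below. The metric and scales-estimator class names are illustrative and may differ between ITK versions; only New(), SetMetric(), SetScalesEstimator(), SetNumberOfIterations(), SetMaximumNewtonStepSizeInPhysicalUnits() and StartOptimization() come from this class and its superclasses.

#include "itkImage.h"
#include "itkQuasiNewtonOptimizerv4.h"
#include "itkMeanSquaresImageToImageMetricv4.h"       // illustrative metric choice
#include "itkRegistrationParameterScalesFromShift.h"  // illustrative scales estimator

void RunQuasiNewtonExample()  // hypothetical helper, for illustration only
{
  typedef itk::Image< float, 3 >                                        ImageType;
  typedef itk::MeanSquaresImageToImageMetricv4< ImageType, ImageType >  MetricType;
  typedef itk::RegistrationParameterScalesFromShift< MetricType >       ScalesEstimatorType;

  MetricType::Pointer metric = MetricType::New();
  // ... set the fixed image, moving image and transform on 'metric', then call
  //     metric->Initialize() before running the optimizer ...

  ScalesEstimatorType::Pointer scalesEstimator = ScalesEstimatorType::New();
  scalesEstimator->SetMetric( metric );

  itk::QuasiNewtonOptimizerv4::Pointer optimizer = itk::QuasiNewtonOptimizerv4::New();
  optimizer->SetMetric( metric );
  optimizer->SetScalesEstimator( scalesEstimator );           // enables automatic scales and learning rates
  optimizer->SetNumberOfIterations( 50 );
  optimizer->SetMaximumNewtonStepSizeInPhysicalUnits( 3.0 );  // optional cap on the Newton step length
  optimizer->StartOptimization();
}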
typedef SmartPointer< const Self > itk::QuasiNewtonOptimizerv4::ConstPointer
Reimplemented from itk::GradientDescentOptimizerv4.
Definition at line 68 of file itkQuasiNewtonOptimizerv4.h.
typedef std::vector< HessianType > itk::QuasiNewtonOptimizerv4::HessianArrayType
Type for an array of Hessian matrices for local support
Definition at line 82 of file itkQuasiNewtonOptimizerv4.h.
Type for the Hessian matrix in the Quasi-Newton method
Definition at line 79 of file itkQuasiNewtonOptimizerv4.h.
typedef Superclass::InternalComputationValueType itk::QuasiNewtonOptimizerv4::InternalComputationValueType
Internal computation type, for maintaining a desired precision
Reimplemented from itk::GradientDescentOptimizerv4.
Definition at line 74 of file itkQuasiNewtonOptimizerv4.h.
typedef SmartPointer< Self > itk::QuasiNewtonOptimizerv4::Pointer
Reimplemented from itk::GradientDescentOptimizerv4.
Definition at line 67 of file itkQuasiNewtonOptimizerv4.h.
Standard class typedefs.
Reimplemented from itk::GradientDescentOptimizerv4.
Definition at line 65 of file itkQuasiNewtonOptimizerv4.h.
Reimplemented from itk::GradientDescentOptimizerv4.
Definition at line 66 of file itkQuasiNewtonOptimizerv4.h.
protected virtual
Advance one step using the Quasi-Newton step. When the Newton step is invalid, the gradient step will be used.
Reimplemented from itk::GradientDescentOptimizerv4.
protected |
Combine a gradient step with a Newton step. The Newton step will be used when it is valid. Otherwise the gradient step will be used.
protected virtual
Estimate the next Hessian and step with the BFGS method. The details of the method are described at http://en.wikipedia.org/wiki/BFGS_method.
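For reference, the standard BFGS update referenced above maintains a Hessian approximation B_k and revises it from the most recent parameter and gradient changes; the notation below is generic and does not refer to the member variables of this class:

\[ s_k = x_{k+1} - x_k, \qquad y_k = \nabla f(x_{k+1}) - \nabla f(x_k), \]
\[ B_{k+1} = B_k + \frac{y_k y_k^{T}}{y_k^{T} s_k} - \frac{B_k s_k s_k^{T} B_k}{s_k^{T} B_k s_k}, \]

and the next Quasi-Newton step is obtained by solving \( B_{k+1} d_{k+1} = -\nabla f(x_{k+1}) \).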
virtual |
Create an object from an instance, potentially deferring to a factory. This method allows you to create an instance of an object that is exactly the same type as the referring object. This is useful in cases where an object has been cast back to a base class.
Reimplemented from itk::GradientDescentOptimizerv4.
protected virtual
Estimate a Newton step
protected virtual
Estimate the Quasi-Newton step over a given index range.
virtual |
Run-time type information (and related methods).
Reimplemented from itk::GradientDescentOptimizerv4.
virtual |
Get the most recent Newton step.
protected |
Estimate and apply the learning rate(s) for a combined Newton step. A combined Newton step uses the Newton step by default and the gradient step when the Newton step is not valid.
The learning rate is less than 1.0 and is restricted by m_MaximumNewtonStepSizeInPhysicalUnits.
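A simplified sketch of this rule, assuming m_ScalesEstimator has been set; the variable names and the exact form below are illustrative, not the actual member implementation:

// Bound the step: the learning rate stays at 1.0 unless the Newton step,
// measured in physical units, would exceed m_MaximumNewtonStepSizeInPhysicalUnits.
double stepScale    = m_ScalesEstimator->EstimateStepScale( newtonStep );
double learningRate = 1.0;
if ( stepScale > m_MaximumNewtonStepSizeInPhysicalUnits )
  {
  learningRate = m_MaximumNewtonStepSizeInPhysicalUnits / stepScale;
  }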
static |
Method for creation through the object factory.
Reimplemented from itk::GradientDescentOptimizerv4.
private |
Purposely not implemented.
Reimplemented from itk::GradientDescentOptimizerv4.
protected virtual
Methods invoked by Print() to print information about the object including superclasses. Typically not called by the user (use Print() instead) but used in the hierarchical print process to combine the output of several classes.
Reimplemented from itk::GradientDescentOptimizerv4.
protected virtual
Reset the Hessian to the identity matrix and the Newton step to zeros.
virtual |
Set the maximum tolerable number of iterations without any progress
virtual |
Set the maximum step size.
When SetScalesEstimator is called by the user, the optimizer will compute learning rates as m_MaximumNewtonStepSizeInPhysicalUnits / m_ScalesEstimator->EstimateStepScale(newtonStep).
If SetMaximumNewtonStepSizeInPhysicalUnits is not called by the user, m_MaximumNewtonStepSizeInPhysicalUnits defaults to lambda * m_ScalesEstimator->EstimateMaximumStepSize(), where EstimateMaximumStepSize returns one voxel spacing and lambda may be in [1,5] according to our experience.
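For example, if EstimateMaximumStepSize() corresponds to a 1.0 mm voxel spacing and lambda is 2, the default maximum Newton step size would be 2.0 mm; calling SetMaximumNewtonStepSizeInPhysicalUnits(1.0) would instead cap each Quasi-Newton step at 1.0 mm. (The numbers here are illustrative only.)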
virtual |
Start and run the optimization
Reimplemented from itk::GradientDescentOptimizerv4.
friend |
Definition at line 187 of file itkQuasiNewtonOptimizerv4.h.
protected |
Definition at line 126 of file itkQuasiNewtonOptimizerv4.h.
protected |
Definition at line 125 of file itkQuasiNewtonOptimizerv4.h.
protected |
The best value so far and relevant information
Definition at line 124 of file itkQuasiNewtonOptimizerv4.h.
protected |
The information about the current step
Definition at line 115 of file itkQuasiNewtonOptimizerv4.h.
private |
Threader for Newton step estimation.
Definition at line 194 of file itkQuasiNewtonOptimizerv4.h.
protected |
The Hessian with local support
Definition at line 138 of file itkQuasiNewtonOptimizerv4.h.
protected |
The maximum tolerable number of iterations without any progress
Definition at line 107 of file itkQuasiNewtonOptimizerv4.h.
protected |
The maximum Quasi-Newton step size to restrict learning rates.
Definition at line 135 of file itkQuasiNewtonOptimizerv4.h.
protected |
The Quasi-Newton step
Definition at line 129 of file itkQuasiNewtonOptimizerv4.h.
protected |
Valid flag for the Quasi-Newton steps
Definition at line 141 of file itkQuasiNewtonOptimizerv4.h.
protected |
Warning message during Quasi-Newton step estimation
Definition at line 132 of file itkQuasiNewtonOptimizerv4.h.
protected |
Definition at line 116 of file itkQuasiNewtonOptimizerv4.h.
protected |
Definition at line 121 of file itkQuasiNewtonOptimizerv4.h.
protected |
Definition at line 120 of file itkQuasiNewtonOptimizerv4.h.
protected |
The information about the previous step
Definition at line 119 of file itkQuasiNewtonOptimizerv4.h.