ITK  4.4.0
Insight Segmentation and Registration Toolkit
itk::GradientDescentOptimizerv4 Class Reference

#include <itkGradientDescentOptimizerv4.h>

Inheritance diagram for itk::GradientDescentOptimizerv4 (figure)
Collaboration diagram for itk::GradientDescentOptimizerv4 (figure)

Detailed Description

Gradient descent optimizer.

GradientDescentOptimizerv4 implements a simple gradient descent optimizer. At each iteration the current position is updated according to

\[ p_{n+1} = p_n + \mbox{learningRate} \, \frac{\partial f(p_n) }{\partial p_n} \]
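The update rule above can be sketched numerically. The following is an illustrative Python sketch, not ITK code: the metric is a hypothetical f(p) = (p - 3)^2 whose derivative is returned in the "improving" (negated) direction, matching the convention (see the Note below) that the derivative is added to the parameters.

```python
# Illustrative sketch of the v4 update rule (not ITK code).
def metric_derivative(p):
    # Hypothetical metric f(p) = (p - 3)^2; it returns -df/dp so that
    # adding the derivative improves (decreases) the metric value.
    return -2.0 * (p - 3.0)

p = 0.0              # initial parameter
learning_rate = 0.1
for _ in range(100):
    # p_{n+1} = p_n + learningRate * derivative(p_n)
    p = p + learning_rate * metric_derivative(p)

# p has converged close to the minimizer at 3.0
```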

Optionally, the best metric value and matching parameters can be stored and retrieved via GetValue() and GetCurrentPosition(). See SetReturnBestParametersAndValue().
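The best-parameters behavior can be sketched as follows (plain Python, not the ITK API; the iteration history is made up, and "lower is better" is assumed for illustration):

```python
# Track the best metric value and parameters seen over all iterations, so
# an overstep or oscillation in the final iterations does not lose them.
history = [([0.0], 5.0), ([1.0], 1.2), ([1.2], 0.9), ([2.0], 3.0)]

best_value = float("inf")
best_params = None
for params, value in history:
    if value < best_value:          # assumes lower metric value is better
        best_value, best_params = value, params

# best_params == [1.2] and best_value == 0.9, despite the worse final step
```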

The user can scale each component of df / dp in two ways: 1) manually, by setting a scaling vector via SetScales(); or 2) automatically, by assigning a ScalesEstimator via SetScalesEstimator(). When a ScalesEstimator is assigned, scale estimation is enabled by default; this can be changed via SetDoEstimateScales(). The scales are estimated and assigned once, during the call to StartOptimization(), and override any manually assigned scales.
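The idea behind per-parameter scaling can be sketched as follows (a minimal illustration with made-up numbers, not the exact convention implemented in ModifyGradientByScalesOverSubRange):

```python
# Dividing each gradient component by its scale damps parameters whose
# units make their raw derivatives disproportionately large (e.g. a
# rotation in radians vs. a translation in millimeters).
gradient = [4.0, 0.5]
scales = [100.0, 1.0]            # hypothetical per-parameter scales
scaled = [g / s for g, s in zip(gradient, scales)]
# scaled == [0.04, 0.5]: the first component no longer dominates the step
```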

The learning rate defaults to 1.0 and can be set in two ways: 1) manually, via SetLearningRate(); or 2) automatically, either at each iteration or only at the first iteration, by assigning a ScalesEstimator via SetScalesEstimator(). When a ScalesEstimator is assigned, the optimizer by default estimates the learning rate only once, during the first iteration. This behavior can be changed via SetDoEstimateLearningRateAtEachIteration() and SetDoEstimateLearningRateOnce(). For the learning rate to be estimated at each iteration, the user must call SetDoEstimateLearningRateAtEachIteration(true) and SetDoEstimateLearningRateOnce(false). When enabled, the optimizer computes the learning rate such that at each step, each voxel's change in physical space will be less than m_MaximumStepSizeInPhysicalUnits:

m_LearningRate = m_MaximumStepSizeInPhysicalUnits / m_ScalesEstimator->EstimateStepScale(scaledGradient)

where m_MaximumStepSizeInPhysicalUnits defaults to the voxel spacing returned by m_ScalesEstimator->EstimateMaximumStepSize() (typically 1 voxel) and can be set by the user via SetMaximumStepSizeInPhysicalUnits(). When only SetDoEstimateLearningRateOnce is enabled, the voxel change may become greater than m_MaximumStepSizeInPhysicalUnits in later iterations.
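The learning-rate estimation rule above can be sketched numerically. In this illustrative Python sketch, `estimate_step_scale` is a hypothetical stand-in for m_ScalesEstimator->EstimateStepScale(), which in ITK depends on the transform and metric:

```python
def estimate_step_scale(scaled_gradient):
    # Hypothetical stand-in: the largest parameter change that a unit
    # learning rate would produce for this gradient.
    return max(abs(g) for g in scaled_gradient)

def estimate_learning_rate(scaled_gradient, max_step_physical):
    # learningRate = maximum step size in physical units / step scale
    return max_step_physical / estimate_step_scale(scaled_gradient)

grad = [0.5, -2.0, 1.0]
lr = estimate_learning_rate(grad, max_step_physical=1.0)
# lr == 0.5, so the largest per-parameter step is lr * 2.0 == 1.0,
# i.e. capped at one unit of physical step size
```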

Note
Unlike the previous version of GradientDescentOptimizer, this version does not have a "maximize/minimize" option to modify the effect of the metric derivative. The assigned metric is assumed to return a parameter derivative result that "improves" the optimization when added to the current parameters via the metric::UpdateTransformParameters method, after the optimizer applies scales and a learning rate.

Definition at line 83 of file itkGradientDescentOptimizerv4.h.

Public Types

typedef SmartPointer< const Self > ConstPointer
 
typedef itk::Function::WindowConvergenceMonitoringFunction< double > ConvergenceMonitoringType
 
typedef Superclass::DerivativeType DerivativeType
 
typedef Superclass::InternalComputationValueType InternalComputationValueType
 
typedef Superclass::MeasureType MeasureType
 
typedef SmartPointer< Self > Pointer
 
typedef GradientDescentOptimizerv4 Self
 
typedef GradientDescentOptimizerBasev4 Superclass
 
- Public Types inherited from itk::GradientDescentOptimizerBasev4
typedef SmartPointer< const Self > ConstPointer
 
typedef MetricType::DerivativeType DerivativeType
 
typedef Superclass::InternalComputationValueType InternalComputationValueType
 
typedef Superclass::MeasureType MeasureType
 
typedef Superclass::MetricType MetricType
 
typedef MetricType::Pointer MetricTypePointer
 
typedef SmartPointer< Self > Pointer
 
typedef GradientDescentOptimizerBasev4 Self
 
typedef std::ostringstream StopConditionDescriptionType
 
typedef std::string StopConditionReturnStringType
 
enum  StopConditionType {
  MAXIMUM_NUMBER_OF_ITERATIONS,
  COSTFUNCTION_ERROR,
  UPDATE_PARAMETERS_ERROR,
  STEP_TOO_SMALL,
  QUASI_NEWTON_STEP_ERROR,
  CONVERGENCE_CHECKER_PASSED,
  OTHER_ERROR
}
 
typedef ObjectToObjectOptimizerBase Superclass
 
- Public Types inherited from itk::ObjectToObjectOptimizerBase
typedef SmartPointer< const Self > ConstPointer
 
typedef MetricType::InternalComputationValueType InternalComputationValueType
 
typedef MetricType::MeasureType MeasureType
 
typedef ObjectToObjectMetricBase MetricType
 
typedef MetricType::Pointer MetricTypePointer
 
typedef MetricType::NumberOfParametersType NumberOfParametersType
 
typedef OptimizerParameters< double > ParametersType
 
typedef SmartPointer< Self > Pointer
 
typedef OptimizerParameters< double > ScalesType
 
typedef ObjectToObjectOptimizerBase Self
 
typedef Object Superclass
 
- Public Types inherited from itk::Object
typedef SmartPointer< const Self > ConstPointer
 
typedef SmartPointer< Self > Pointer
 
typedef Object Self
 
typedef LightObject Superclass
 
- Public Types inherited from itk::LightObject
typedef SmartPointer< const Self > ConstPointer
 
typedef SmartPointer< Self > Pointer
 
typedef LightObject Self
 

Public Member Functions

virtual ::itk::LightObject::Pointer CreateAnother (void) const
 
virtual void EstimateLearningRate ()
 
virtual const InternalComputationValueType & GetConvergenceValue ()
 
virtual const InternalComputationValueType & GetLearningRate ()
 
virtual const InternalComputationValueType & GetMaximumStepSizeInPhysicalUnits ()
 
virtual const char * GetNameOfClass () const
 
virtual void ResumeOptimization ()
 
virtual void SetConvergenceWindowSize (SizeValueType _arg)
 
virtual void SetLearningRate (InternalComputationValueType _arg)
 
virtual void SetMaximumStepSizeInPhysicalUnits (InternalComputationValueType _arg)
 
virtual void SetMinimumConvergenceValue (InternalComputationValueType _arg)
 
virtual void SetScalesEstimator (OptimizerParameterScalesEstimator *_arg)
 
virtual void StartOptimization (bool doOnlyInitialization=false)
 
virtual void StopOptimization (void)
 
virtual void SetDoEstimateScales (bool _arg)
 
virtual const bool & GetDoEstimateScales ()
 
virtual void DoEstimateScalesOn ()
 
virtual void DoEstimateScalesOff ()
 
virtual void SetDoEstimateLearningRateAtEachIteration (bool _arg)
 
virtual const bool & GetDoEstimateLearningRateAtEachIteration ()
 
virtual void DoEstimateLearningRateAtEachIterationOn ()
 
virtual void DoEstimateLearningRateAtEachIterationOff ()
 
virtual void SetDoEstimateLearningRateOnce (bool _arg)
 
virtual const bool & GetDoEstimateLearningRateOnce ()
 
virtual void DoEstimateLearningRateOnceOn ()
 
virtual void DoEstimateLearningRateOnceOff ()
 
virtual void SetReturnBestParametersAndValue (bool _arg)
 
virtual const bool & GetReturnBestParametersAndValue ()
 
virtual void ReturnBestParametersAndValueOn ()
 
virtual void ReturnBestParametersAndValueOff ()
 
- Public Member Functions inherited from itk::GradientDescentOptimizerBasev4
virtual SizeValueType GetCurrentIteration () const
 
virtual const DerivativeType & GetGradient ()
 
virtual const SizeValueType & GetNumberOfIterations ()
 
virtual const StopConditionType & GetStopCondition ()
 
virtual const StopConditionReturnStringType GetStopConditionDescription () const
 
virtual void SetNumberOfIterations (SizeValueType _arg)
 
virtual void ModifyGradientByScales ()
 
virtual void ModifyGradientByLearningRate ()
 
- Public Member Functions inherited from itk::ObjectToObjectOptimizerBase
virtual const MeasureType & GetCurrentMetricValue ()
 
const ParametersType & GetCurrentPosition ()
 
virtual const ThreadIdType & GetNumberOfThreads ()
 
virtual const ScalesType & GetScales ()
 
virtual const bool & GetScalesAreIdentity ()
 
const MeasureType & GetValue ()
 
virtual const ScalesType & GetWeights ()
 
virtual const bool & GetWeightsAreIdentity ()
 
virtual void SetNumberOfThreads (ThreadIdType number)
 
virtual void SetScales (ScalesType _arg)
 
virtual void SetWeights (ScalesType _arg)
 
virtual void SetMetric (MetricType *_arg)
 
virtual MetricType * GetModifiableMetric ()
 
virtual const MetricType * GetMetric () const
 
- Public Member Functions inherited from itk::Object
unsigned long AddObserver (const EventObject &event, Command *)
 
unsigned long AddObserver (const EventObject &event, Command *) const
 
virtual void DebugOff () const
 
virtual void DebugOn () const
 
Command * GetCommand (unsigned long tag)
 
bool GetDebug () const
 
MetaDataDictionary & GetMetaDataDictionary (void)
 
const MetaDataDictionary & GetMetaDataDictionary (void) const
 
virtual ModifiedTimeType GetMTime () const
 
virtual const TimeStamp & GetTimeStamp () const
 
bool HasObserver (const EventObject &event) const
 
void InvokeEvent (const EventObject &)
 
void InvokeEvent (const EventObject &) const
 
virtual void Modified () const
 
virtual void Register () const
 
void RemoveAllObservers ()
 
void RemoveObserver (unsigned long tag)
 
void SetDebug (bool debugFlag) const
 
void SetMetaDataDictionary (const MetaDataDictionary &rhs)
 
virtual void SetReferenceCount (int)
 
virtual void UnRegister () const
 
- Public Member Functions inherited from itk::LightObject
virtual void Delete ()
 
virtual int GetReferenceCount () const
 
 itkCloneMacro (Self)
 
void Print (std::ostream &os, Indent indent=0) const
 

Static Public Member Functions

static Pointer New ()
 

Protected Member Functions

virtual void AdvanceOneStep (void)
 
 GradientDescentOptimizerv4 ()
 
virtual void PrintSelf (std::ostream &os, Indent indent) const
 
virtual ~GradientDescentOptimizerv4 ()
 
virtual void ModifyGradientByScalesOverSubRange (const IndexRangeType &subrange)
 
virtual void ModifyGradientByLearningRateOverSubRange (const IndexRangeType &subrange)
 
- Protected Member Functions inherited from itk::GradientDescentOptimizerBasev4
 GradientDescentOptimizerBasev4 ()
 
virtual ~GradientDescentOptimizerBasev4 ()
 
- Protected Member Functions inherited from itk::ObjectToObjectOptimizerBase
 ObjectToObjectOptimizerBase ()
 
virtual ~ObjectToObjectOptimizerBase ()
 
- Protected Member Functions inherited from itk::Object
 Object ()
 
bool PrintObservers (std::ostream &os, Indent indent) const
 
virtual void SetTimeStamp (const TimeStamp &time)
 
virtual ~Object ()
 
- Protected Member Functions inherited from itk::LightObject
virtual LightObject::Pointer InternalClone () const
 
 LightObject ()
 
virtual void PrintHeader (std::ostream &os, Indent indent) const
 
virtual void PrintTrailer (std::ostream &os, Indent indent) const
 
virtual ~LightObject ()
 

Protected Attributes

ParametersType m_BestParameters
 
ConvergenceMonitoringType::Pointer m_ConvergenceMonitoring
 
InternalComputationValueType m_ConvergenceValue
 
SizeValueType m_ConvergenceWindowSize
 
MeasureType m_CurrentBestValue
 
InternalComputationValueType m_LearningRate
 
InternalComputationValueType m_MaximumStepSizeInPhysicalUnits
 
InternalComputationValueType m_MinimumConvergenceValue
 
bool m_ReturnBestParametersAndValue
 
OptimizerParameterScalesEstimator::Pointer m_ScalesEstimator
 
- Protected Attributes inherited from itk::GradientDescentOptimizerBasev4
SizeValueType m_CurrentIteration
 
DerivativeType m_Gradient
 
GradientDescentOptimizerBasev4ModifyGradientByLearningRateThreader::Pointer m_ModifyGradientByLearningRateThreader
 
GradientDescentOptimizerBasev4ModifyGradientByScalesThreader::Pointer m_ModifyGradientByScalesThreader
 
SizeValueType m_NumberOfIterations
 
bool m_Stop
 
StopConditionType m_StopCondition
 
StopConditionDescriptionType m_StopConditionDescription
 
- Protected Attributes inherited from itk::ObjectToObjectOptimizerBase
MeasureType m_CurrentMetricValue
 
MetricTypePointer m_Metric
 
ThreadIdType m_NumberOfThreads
 
ScalesType m_Scales
 
bool m_ScalesAreIdentity
 
ScalesType m_Weights
 
bool m_WeightsAreIdentity
 

Private Member Functions

 GradientDescentOptimizerv4 (const Self &)
 
void operator= (const Self &)
 

Private Attributes

bool m_DoEstimateLearningRateAtEachIteration
 
bool m_DoEstimateLearningRateOnce
 
bool m_DoEstimateScales
 

Additional Inherited Members

- Protected Types inherited from itk::GradientDescentOptimizerBasev4
typedef GradientDescentOptimizerBasev4ModifyGradientByScalesThreader::IndexRangeType IndexRangeType
 

Member Typedef Documentation

typedef SmartPointer< const Self > itk::GradientDescentOptimizerv4::ConstPointer

Definition at line 91 of file itkGradientDescentOptimizerv4.h.

typedef itk::Function::WindowConvergenceMonitoringFunction< double > itk::GradientDescentOptimizerv4::ConvergenceMonitoringType

Type for the convergence checker

Definition at line 108 of file itkGradientDescentOptimizerv4.h.

typedef Superclass::DerivativeType itk::GradientDescentOptimizerv4::DerivativeType

Derivative type

Definition at line 97 of file itkGradientDescentOptimizerv4.h.

typedef Superclass::InternalComputationValueType itk::GradientDescentOptimizerv4::InternalComputationValueType

Definition at line 104 of file itkGradientDescentOptimizerv4.h.

typedef Superclass::MeasureType itk::GradientDescentOptimizerv4::MeasureType

Metric type over which this class is templated

Definition at line 103 of file itkGradientDescentOptimizerv4.h.

typedef SmartPointer< Self > itk::GradientDescentOptimizerv4::Pointer

Definition at line 90 of file itkGradientDescentOptimizerv4.h.

typedef GradientDescentOptimizerv4 itk::GradientDescentOptimizerv4::Self

Standard class typedefs.

Definition at line 88 of file itkGradientDescentOptimizerv4.h.

typedef GradientDescentOptimizerBasev4 itk::GradientDescentOptimizerv4::Superclass

Definition at line 89 of file itkGradientDescentOptimizerv4.h.

Constructor & Destructor Documentation

itk::GradientDescentOptimizerv4::GradientDescentOptimizerv4 ( )
protected

Default constructor

virtual itk::GradientDescentOptimizerv4::~GradientDescentOptimizerv4 ( )
protectedvirtual

Destructor

itk::GradientDescentOptimizerv4::GradientDescentOptimizerv4 ( const Self & )
private

Member Function Documentation

virtual void itk::GradientDescentOptimizerv4::AdvanceOneStep ( void  )
protectedvirtual

Advance one step following the gradient direction. Includes the transform update.

Reimplemented in itk::QuasiNewtonOptimizerv4, itk::GradientDescentLineSearchOptimizerv4, and itk::ConjugateGradientLineSearchOptimizerv4.

virtual ::itk::LightObject::Pointer itk::GradientDescentOptimizerv4::CreateAnother ( void  ) const
virtual

Create an object from an instance, potentially deferring to a factory. This method allows you to create an instance of an object that is exactly the same type as the referring object. This is useful in cases where an object has been cast back to a base class.

Reimplemented from itk::Object.

Reimplemented in itk::QuasiNewtonOptimizerv4, and itk::MultiGradientOptimizerv4.

virtual void itk::GradientDescentOptimizerv4::DoEstimateLearningRateAtEachIterationOff ( )
virtual

Option to use ScalesEstimator for learning rate estimation at each iteration. The estimation overrides the learning rate set by SetLearningRate(). Default is false.

See Also
SetDoEstimateLearningRateOnce()
SetScalesEstimator()
virtual void itk::GradientDescentOptimizerv4::DoEstimateLearningRateAtEachIterationOn ( )
virtual

Option to use ScalesEstimator for learning rate estimation at each iteration. The estimation overrides the learning rate set by SetLearningRate(). Default is false.

See Also
SetDoEstimateLearningRateOnce()
SetScalesEstimator()
virtual void itk::GradientDescentOptimizerv4::DoEstimateLearningRateOnceOff ( )
virtual

Option to use ScalesEstimator for learning rate estimation only once, during first iteration. The estimation overrides the learning rate set by SetLearningRate(). Default is true.

See Also
SetDoEstimateLearningRateAtEachIteration()
SetScalesEstimator()
virtual void itk::GradientDescentOptimizerv4::DoEstimateLearningRateOnceOn ( )
virtual

Option to use ScalesEstimator for learning rate estimation only once, during first iteration. The estimation overrides the learning rate set by SetLearningRate(). Default is true.

See Also
SetDoEstimateLearningRateAtEachIteration()
SetScalesEstimator()
virtual void itk::GradientDescentOptimizerv4::DoEstimateScalesOff ( )
virtual

Option to use ScalesEstimator for scales estimation. The estimation is performed once, at the beginning of optimization, and overrides any scales set using SetScales(). Default is true.

virtual void itk::GradientDescentOptimizerv4::DoEstimateScalesOn ( )
virtual

Option to use ScalesEstimator for scales estimation. The estimation is performed once, at the beginning of optimization, and overrides any scales set using SetScales(). Default is true.

virtual void itk::GradientDescentOptimizerv4::EstimateLearningRate ( )
virtual

Estimate the learning rate based on the current gradient.

virtual const InternalComputationValueType& itk::GradientDescentOptimizerv4::GetConvergenceValue ( )
virtual

Get current convergence value

virtual const bool& itk::GradientDescentOptimizerv4::GetDoEstimateLearningRateAtEachIteration ( )
virtual

Option to use ScalesEstimator for learning rate estimation at each iteration. The estimation overrides the learning rate set by SetLearningRate(). Default is false.

See Also
SetDoEstimateLearningRateOnce()
SetScalesEstimator()
virtual const bool& itk::GradientDescentOptimizerv4::GetDoEstimateLearningRateOnce ( )
virtual

Option to use ScalesEstimator for learning rate estimation only once, during first iteration. The estimation overrides the learning rate set by SetLearningRate(). Default is true.

See Also
SetDoEstimateLearningRateAtEachIteration()
SetScalesEstimator()
virtual const bool& itk::GradientDescentOptimizerv4::GetDoEstimateScales ( )
virtual

Option to use ScalesEstimator for scales estimation. The estimation is performed once, at the beginning of optimization, and overrides any scales set using SetScales(). Default is true.

virtual const InternalComputationValueType& itk::GradientDescentOptimizerv4::GetLearningRate ( )
virtual

Get the learning rate.

virtual const InternalComputationValueType& itk::GradientDescentOptimizerv4::GetMaximumStepSizeInPhysicalUnits ( )
virtual

Get the maximum step size, in physical space units.

virtual const char* itk::GradientDescentOptimizerv4::GetNameOfClass ( ) const
virtual
virtual const bool& itk::GradientDescentOptimizerv4::GetReturnBestParametersAndValue ( )
virtual

Flag. Set to have the optimizer track and return the best metric value and the corresponding best parameters that were calculated during the optimization. This captures the best solution when the optimizer oversteps or oscillates near the end of an optimization. Results are stored in m_CurrentMetricValue and in the assigned metric's parameters, retrievable via optimizer->GetCurrentPosition(). This option requires additional memory to store the best parameters, which can be large when working with high-dimensional transforms such as DisplacementFieldTransform.

virtual void itk::GradientDescentOptimizerv4::ModifyGradientByLearningRateOverSubRange ( const IndexRangeType &  subrange)
protectedvirtual

Modify the gradient over a given index range.

Implements itk::GradientDescentOptimizerBasev4.

virtual void itk::GradientDescentOptimizerv4::ModifyGradientByScalesOverSubRange ( const IndexRangeType &  subrange)
protectedvirtual

Modify the gradient over a given index range.

Implements itk::GradientDescentOptimizerBasev4.

static Pointer itk::GradientDescentOptimizerv4::New ( )
static

New macro for creation of an instance through a SmartPointer

void itk::GradientDescentOptimizerv4::operator= ( const Self & )
private
virtual void itk::GradientDescentOptimizerv4::PrintSelf ( std::ostream &  os,
Indent  indent 
) const
protectedvirtual

Methods invoked by Print() to print information about the object including superclasses. Typically not called by the user (use Print() instead) but used in the hierarchical print process to combine the output of several classes.

Reimplemented from itk::GradientDescentOptimizerBasev4.

Reimplemented in itk::QuasiNewtonOptimizerv4, itk::MultiGradientOptimizerv4, itk::GradientDescentLineSearchOptimizerv4, and itk::ConjugateGradientLineSearchOptimizerv4.

virtual void itk::GradientDescentOptimizerv4::ResumeOptimization ( )
virtual

Resume optimization. This runs the optimization loop and allows continuation of a stopped optimization.

Implements itk::GradientDescentOptimizerBasev4.

Reimplemented in itk::MultiGradientOptimizerv4.

virtual void itk::GradientDescentOptimizerv4::ReturnBestParametersAndValueOff ( )
virtual

Flag. Set to have the optimizer track and return the best metric value and the corresponding best parameters that were calculated during the optimization. This captures the best solution when the optimizer oversteps or oscillates near the end of an optimization. Results are stored in m_CurrentMetricValue and in the assigned metric's parameters, retrievable via optimizer->GetCurrentPosition(). This option requires additional memory to store the best parameters, which can be large when working with high-dimensional transforms such as DisplacementFieldTransform.

virtual void itk::GradientDescentOptimizerv4::ReturnBestParametersAndValueOn ( )
virtual

Flag. Set to have the optimizer track and return the best metric value and the corresponding best parameters that were calculated during the optimization. This captures the best solution when the optimizer oversteps or oscillates near the end of an optimization. Results are stored in m_CurrentMetricValue and in the assigned metric's parameters, retrievable via optimizer->GetCurrentPosition(). This option requires additional memory to store the best parameters, which can be large when working with high-dimensional transforms such as DisplacementFieldTransform.

virtual void itk::GradientDescentOptimizerv4::SetConvergenceWindowSize ( SizeValueType  _arg)
virtual

Window size for the convergence checker. The convergence checker computes the convergence value by fitting to a window of the energy (metric value) profile.

The default m_ConvergenceWindowSize is set to 50 to pass all tests. A value of 10 is suggested for less stringent convergence checking.
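A hedged sketch of window-based convergence checking (the actual itk::Function::WindowConvergenceMonitoringFunction differs in detail; here the convergence value is simply the slope magnitude of a least-squares line fit over the window):

```python
def convergence_value(energies, window):
    # Fit a straight line to the last `window` metric values and use the
    # slope magnitude as the convergence value; a flat profile gives 0.
    recent = energies[-window:]
    n = len(recent)
    mean_x = (n - 1) / 2.0
    mean_y = sum(recent) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(recent))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return abs(num / den)

flat = [1.0] * 10                           # converged: value 0.0
descending = [10.0 - i for i in range(10)]  # still improving: value 1.0
```

In this sketch, optimization would be considered converged once the returned value falls below the configured minimum convergence value.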

virtual void itk::GradientDescentOptimizerv4::SetDoEstimateLearningRateAtEachIteration ( bool  _arg)
virtual

Option to use ScalesEstimator for learning rate estimation at each iteration. The estimation overrides the learning rate set by SetLearningRate(). Default is false.

See Also
SetDoEstimateLearningRateOnce()
SetScalesEstimator()
virtual void itk::GradientDescentOptimizerv4::SetDoEstimateLearningRateOnce ( bool  _arg)
virtual

Option to use ScalesEstimator for learning rate estimation only once, during first iteration. The estimation overrides the learning rate set by SetLearningRate(). Default is true.

See Also
SetDoEstimateLearningRateAtEachIteration()
SetScalesEstimator()
virtual void itk::GradientDescentOptimizerv4::SetDoEstimateScales ( bool  _arg)
virtual

Option to use ScalesEstimator for scales estimation. The estimation is performed once, at the beginning of optimization, and overrides any scales set using SetScales(). Default is true.

virtual void itk::GradientDescentOptimizerv4::SetLearningRate ( InternalComputationValueType  _arg)
virtual

Set the learning rate.

virtual void itk::GradientDescentOptimizerv4::SetMaximumStepSizeInPhysicalUnits ( InternalComputationValueType  _arg)
virtual

Set the maximum step size, in physical space units.

Only relevant when m_ScalesEstimator is set by user, and automatic learning rate estimation is enabled. See main documentation.

virtual void itk::GradientDescentOptimizerv4::SetMinimumConvergenceValue ( InternalComputationValueType  _arg)
virtual

Minimum convergence value for convergence checking. The convergence checker computes the convergence value by fitting to a window of the energy profile. When the convergence value falls below this minimum, the optimization is treated as converged.

The default m_MinimumConvergenceValue is set to 1e-8 to pass all tests. A value of 1e-6 is suggested for less stringent convergence checking.

virtual void itk::GradientDescentOptimizerv4::SetReturnBestParametersAndValue ( bool  _arg)
virtual

Flag. Set to have the optimizer track and return the best metric value and the corresponding best parameters that were calculated during the optimization. This captures the best solution when the optimizer oversteps or oscillates near the end of an optimization. Results are stored in m_CurrentMetricValue and in the assigned metric's parameters, retrievable via optimizer->GetCurrentPosition(). This option requires additional memory to store the best parameters, which can be large when working with high-dimensional transforms such as DisplacementFieldTransform.

virtual void itk::GradientDescentOptimizerv4::SetScalesEstimator ( OptimizerParameterScalesEstimator _arg)
virtual

Set the scales estimator.

A ScalesEstimator is required for the scales and learning rate estimation options to work. See the main documentation.

See Also
SetDoEstimateScales()
SetDoEstimateLearningRateAtEachIteration()
SetDoEstimateLearningRateOnce()
virtual void itk::GradientDescentOptimizerv4::StartOptimization ( bool  doOnlyInitialization = false)
virtual
virtual void itk::GradientDescentOptimizerv4::StopOptimization ( void  )
virtual

Stop optimization. The object is left in a state so the optimization can be resumed by calling ResumeOptimization.

Reimplemented from itk::GradientDescentOptimizerBasev4.

Reimplemented in itk::MultiGradientOptimizerv4.

Member Data Documentation

ParametersType itk::GradientDescentOptimizerv4::m_BestParameters
protected

Definition at line 273 of file itkGradientDescentOptimizerv4.h.

ConvergenceMonitoringType::Pointer itk::GradientDescentOptimizerv4::m_ConvergenceMonitoring
protected

The convergence checker.

Definition at line 269 of file itkGradientDescentOptimizerv4.h.

InternalComputationValueType itk::GradientDescentOptimizerv4::m_ConvergenceValue
protected

Current convergence value.

Definition at line 266 of file itkGradientDescentOptimizerv4.h.

SizeValueType itk::GradientDescentOptimizerv4::m_ConvergenceWindowSize
protected

Window size for the convergence checker. The convergence checker computes the convergence value by fitting to a window of the energy (metric value) profile.

Definition at line 263 of file itkGradientDescentOptimizerv4.h.

MeasureType itk::GradientDescentOptimizerv4::m_CurrentBestValue
protected

Store the best value and related parameters

Definition at line 272 of file itkGradientDescentOptimizerv4.h.

bool itk::GradientDescentOptimizerv4::m_DoEstimateLearningRateAtEachIteration
private

Flag to control use of the ScalesEstimator (if set) for automatic learning step estimation at each iteration.

Definition at line 287 of file itkGradientDescentOptimizerv4.h.

bool itk::GradientDescentOptimizerv4::m_DoEstimateLearningRateOnce
private

Flag to control use of the ScalesEstimator (if set) for automatic learning step estimation only once, during first iteration.

Definition at line 292 of file itkGradientDescentOptimizerv4.h.

bool itk::GradientDescentOptimizerv4::m_DoEstimateScales
private

Flag to control use of the ScalesEstimator (if set) for automatic scale estimation during StartOptimization()

Definition at line 282 of file itkGradientDescentOptimizerv4.h.

InternalComputationValueType itk::GradientDescentOptimizerv4::m_LearningRate
protected

Manual learning rate to apply. It is overridden by automatic learning rate estimation if enabled. See main documentation.

Definition at line 235 of file itkGradientDescentOptimizerv4.h.

InternalComputationValueType itk::GradientDescentOptimizerv4::m_MaximumStepSizeInPhysicalUnits
protected

The maximum step size in physical units, to restrict learning rates. Only used with automatic learning rate estimation. See main documentation.

Definition at line 240 of file itkGradientDescentOptimizerv4.h.

InternalComputationValueType itk::GradientDescentOptimizerv4::m_MinimumConvergenceValue
protected

Minimum convergence value for convergence checking. The convergence checker computes the convergence value by fitting to a window of the energy profile. When the convergence value falls below a small threshold, such as 1e-8, the optimization is treated as converged.

Definition at line 257 of file itkGradientDescentOptimizerv4.h.

bool itk::GradientDescentOptimizerv4::m_ReturnBestParametersAndValue
protected

Flag to control returning of best value and parameters.

Definition at line 276 of file itkGradientDescentOptimizerv4.h.

OptimizerParameterScalesEstimator::Pointer itk::GradientDescentOptimizerv4::m_ScalesEstimator
protected

Definition at line 250 of file itkGradientDescentOptimizerv4.h.


The documentation for this class was generated from the following file:

itkGradientDescentOptimizerv4.h