ITK  5.0.0
Insight Segmentation and Registration Toolkit
itk::GradientDescentOptimizerv4Template< TInternalComputationValueType > Class Template Reference

#include <itkGradientDescentOptimizerv4.h>


Detailed Description

template<typename TInternalComputationValueType>
class itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >

Gradient descent optimizer.

GradientDescentOptimizer implements a simple gradient descent optimizer. At each iteration the current position is updated according to

\[ p_{n+1} = p_n + \mbox{learningRate} \, \frac{\partial f(p_n) }{\partial p_n} \]

Optionally, the best metric value and matching parameters can be stored and retrieved via GetValue() and GetCurrentPosition(). See SetReturnBestParametersAndValue().

Gradient scales can be manually set or automatically estimated, as documented in the base class. The learning rate defaults to 1.0 and can be set in two ways: 1) manually, via SetLearningRate(); or 2) automatically, either at each iteration or only at the first iteration, by assigning a ScalesEstimator via SetScalesEstimator(). When a ScalesEstimator is assigned, the optimizer is enabled by default to estimate the learning rate only once, during the first iteration. This behavior can be changed via SetDoEstimateLearningRateAtEachIteration() and SetDoEstimateLearningRateOnce(). For the learning rate to be estimated at each iteration, the user must call SetDoEstimateLearningRateAtEachIteration(true) and SetDoEstimateLearningRateOnce(false). When enabled, the optimizer computes the learning rate(s) such that at each step, each voxel's change in physical space will be less than m_MaximumStepSizeInPhysicalUnits.

 m_LearningRate =
   m_MaximumStepSizeInPhysicalUnits /
   m_ScalesEstimator->EstimateStepScale(scaledGradient)

where m_MaximumStepSizeInPhysicalUnits defaults to the voxel spacing returned by m_ScalesEstimator->EstimateMaximumStepSize() (which is typically 1 voxel), and can be set by the user via SetMaximumStepSizeInPhysicalUnits(). When SetDoEstimateLearningRateOnce is enabled, the voxel change may become greater than m_MaximumStepSizeInPhysicalUnits in later iterations.
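The relationship between the update rule and the step-size cap can be sketched in plain C++. This is a minimal stand-in, not ITK code: EstimateStepScale below is a hypothetical substitute for m_ScalesEstimator->EstimateStepScale() that simply uses the gradient's Euclidean norm.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical stand-in for m_ScalesEstimator->EstimateStepScale():
// here just the Euclidean norm of the scaled gradient.
double EstimateStepScale(const std::vector<double> & scaledGradient)
{
  double sumSq = 0.0;
  for (double g : scaledGradient) { sumSq += g * g; }
  return std::sqrt(sumSq);
}

// One gradient-descent step as documented above:
//   learningRate = maxStepSizeInPhysicalUnits / stepScale
//   p_{n+1}      = p_n + learningRate * dF/dp
void AdvanceOneStep(std::vector<double> & parameters,
                    const std::vector<double> & gradient,
                    double maxStepSizeInPhysicalUnits)
{
  const double learningRate =
    maxStepSizeInPhysicalUnits / EstimateStepScale(gradient);
  for (std::size_t i = 0; i < parameters.size(); ++i)
  {
    parameters[i] += learningRate * gradient[i];
  }
}
```

With this capping, the size of each step in the estimator's scale is bounded by maxStepSizeInPhysicalUnits regardless of the raw gradient magnitude.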

Note
Unlike the previous version of GradientDescentOptimizer, this version does not have a "maximize/minimize" option to modify the effect of the metric derivative. The assigned metric is assumed to return a parameter derivative result that "improves" the optimization when added to the current parameters via the metric::UpdateTransformParameters method, after the optimizer applies scales and a learning rate.
Examples:
Examples/RegistrationITKv4/MultiStageImageRegistration1.cxx, and Examples/RegistrationITKv4/MultiStageImageRegistration2.cxx.

Definition at line 77 of file itkGradientDescentOptimizerv4.h.

Public Types

using ConstPointer = SmartPointer< const Self >
 
using DerivativeType = typename Superclass::DerivativeType
 
using IndexRangeType = typename Superclass::IndexRangeType
 
using InternalComputationValueType = TInternalComputationValueType
 
using MeasureType = typename Superclass::MeasureType
 
using ParametersType = typename Superclass::ParametersType
 
using Pointer = SmartPointer< Self >
 
using ScalesType = typename Superclass::ScalesType
 
using Self = GradientDescentOptimizerv4Template
 
using StopConditionType = typename Superclass::StopConditionType
 
using Superclass = GradientDescentOptimizerBasev4Template< TInternalComputationValueType >
 
- Public Types inherited from itk::GradientDescentOptimizerBasev4Template< TInternalComputationValueType >
using ConstPointer = SmartPointer< const Self >
 
using ConvergenceMonitoringType = itk::Function::WindowConvergenceMonitoringFunction< TInternalComputationValueType >
 
using DerivativeType = typename Superclass::DerivativeType
 
using IndexRangeType = ThreadedIndexedContainerPartitioner::IndexRangeType
 
using InternalComputationValueType = TInternalComputationValueType
 
using MeasureType = typename Superclass::MeasureType
 
using MetricType = typename Superclass::MetricType
 
using MetricTypePointer = typename MetricType::Pointer
 
using ParametersType = typename Superclass::ParametersType
 
using Pointer = SmartPointer< Self >
 
using ScalesType = typename Superclass::ScalesType
 
using Self = GradientDescentOptimizerBasev4Template
 
using StopConditionDescriptionType = typename Superclass::StopConditionDescriptionType
 
using StopConditionReturnStringType = typename Superclass::StopConditionReturnStringType
 
enum  StopConditionType {
  MAXIMUM_NUMBER_OF_ITERATIONS,
  COSTFUNCTION_ERROR,
  UPDATE_PARAMETERS_ERROR,
  STEP_TOO_SMALL,
  CONVERGENCE_CHECKER_PASSED,
  GRADIENT_MAGNITUDE_TOLEARANCE,
  OTHER_ERROR
}
 
using Superclass = ObjectToObjectOptimizerBaseTemplate< TInternalComputationValueType >
 
- Public Types inherited from itk::ObjectToObjectOptimizerBaseTemplate< TInternalComputationValueType >
using ConstPointer = SmartPointer< const Self >
 
using DerivativeType = typename MetricType::DerivativeType
 
using MeasureType = typename MetricType::MeasureType
 
using MetricType = ObjectToObjectMetricBaseTemplate< TInternalComputationValueType >
 
using MetricTypePointer = typename MetricType::Pointer
 
using NumberOfParametersType = typename MetricType::NumberOfParametersType
 
using ParametersType = OptimizerParameters< TInternalComputationValueType >
 
using Pointer = SmartPointer< Self >
 
using ScalesEstimatorType = OptimizerParameterScalesEstimatorTemplate< TInternalComputationValueType >
 
using ScalesType = OptimizerParameters< TInternalComputationValueType >
 
using Self = ObjectToObjectOptimizerBaseTemplate
 
using StopConditionDescriptionType = std::ostringstream
 
using StopConditionReturnStringType = std::string
 
using Superclass = Object
 
- Public Types inherited from itk::Object
using ConstPointer = SmartPointer< const Self >
 
using Pointer = SmartPointer< Self >
 
using Self = Object
 
using Superclass = LightObject
 
- Public Types inherited from itk::LightObject
using ConstPointer = SmartPointer< const Self >
 
using Pointer = SmartPointer< Self >
 
using Self = LightObject
 

Public Member Functions

virtual ::itk::LightObject::Pointer CreateAnother () const
 
virtual void EstimateLearningRate ()
 
virtual const TInternalComputationValueType & GetConvergenceValue () const
 
virtual const char * GetNameOfClass () const
 
void ResumeOptimization () override
 
virtual void SetConvergenceWindowSize (SizeValueType _arg)
 
virtual void SetMinimumConvergenceValue (TInternalComputationValueType _arg)
 
void StartOptimization (bool doOnlyInitialization=false) override
 
void StopOptimization () override
 
virtual void SetLearningRate (TInternalComputationValueType _arg)
 
virtual const TInternalComputationValueType & GetLearningRate () const
 
virtual void SetMaximumStepSizeInPhysicalUnits (TInternalComputationValueType _arg)
 
virtual const TInternalComputationValueType & GetMaximumStepSizeInPhysicalUnits () const
 
virtual void SetDoEstimateLearningRateAtEachIteration (bool _arg)
 
virtual const bool & GetDoEstimateLearningRateAtEachIteration () const
 
virtual void DoEstimateLearningRateAtEachIterationOn ()
 
virtual void DoEstimateLearningRateAtEachIterationOff ()
 
virtual void SetDoEstimateLearningRateOnce (bool _arg)
 
virtual const bool & GetDoEstimateLearningRateOnce () const
 
virtual void DoEstimateLearningRateOnceOn ()
 
virtual void DoEstimateLearningRateOnceOff ()
 
virtual void SetReturnBestParametersAndValue (bool _arg)
 
virtual const bool & GetReturnBestParametersAndValue () const
 
virtual void ReturnBestParametersAndValueOn ()
 
virtual void ReturnBestParametersAndValueOff ()
 
- Public Member Functions inherited from itk::GradientDescentOptimizerBasev4Template< TInternalComputationValueType >
SizeValueType GetCurrentIteration () const override
 
virtual const DerivativeType & GetGradient () const
 
SizeValueType GetNumberOfIterations () const override
 
virtual const StopConditionType & GetStopCondition () const
 
const StopConditionReturnStringType GetStopConditionDescription () const override
 
void SetNumberOfIterations (const SizeValueType numberOfIterations) override
 
virtual void ModifyGradientByScales ()
 
virtual void ModifyGradientByLearningRate ()
 
- Public Member Functions inherited from itk::ObjectToObjectOptimizerBaseTemplate< TInternalComputationValueType >
virtual const MeasureType & GetCurrentMetricValue () const
 
virtual const ParametersType & GetCurrentPosition () const
 
virtual const ThreadIdType & GetNumberOfWorkUnits () const
 
virtual const ScalesType & GetScales () const
 
virtual const bool & GetScalesAreIdentity () const
 
bool GetScalesInitialized () const
 
virtual const MeasureType & GetValue () const
 
virtual const ScalesType & GetWeights () const
 
virtual const bool & GetWeightsAreIdentity () const
 
virtual void SetNumberOfWorkUnits (ThreadIdType number)
 
virtual void SetScalesEstimator (ScalesEstimatorType *_arg)
 
virtual void SetWeights (ScalesType _arg)
 
virtual void SetMetric (MetricType *_arg)
 
virtual MetricType * GetModifiableMetric ()
 
virtual const MetricType * GetMetric () const
 
virtual void SetScales (const ScalesType &scales)
 
virtual void SetDoEstimateScales (bool _arg)
 
virtual const bool & GetDoEstimateScales () const
 
virtual void DoEstimateScalesOn ()
 
virtual void DoEstimateScalesOff ()
 
- Public Member Functions inherited from itk::Object
unsigned long AddObserver (const EventObject &event, Command *)
 
unsigned long AddObserver (const EventObject &event, Command *) const
 
virtual void DebugOff () const
 
virtual void DebugOn () const
 
Command * GetCommand (unsigned long tag)
 
bool GetDebug () const
 
MetaDataDictionary & GetMetaDataDictionary ()
 
const MetaDataDictionary & GetMetaDataDictionary () const
 
virtual ModifiedTimeType GetMTime () const
 
virtual const TimeStamp & GetTimeStamp () const
 
bool HasObserver (const EventObject &event) const
 
void InvokeEvent (const EventObject &)
 
void InvokeEvent (const EventObject &) const
 
virtual void Modified () const
 
void Register () const override
 
void RemoveAllObservers ()
 
void RemoveObserver (unsigned long tag)
 
void SetDebug (bool debugFlag) const
 
void SetReferenceCount (int) override
 
void UnRegister () const noexcept override
 
void SetMetaDataDictionary (const MetaDataDictionary &rhs)
 
void SetMetaDataDictionary (MetaDataDictionary &&rrhs)
 
virtual void SetObjectName (std::string _arg)
 
virtual const std::string & GetObjectName () const
 
- Public Member Functions inherited from itk::LightObject
virtual void Delete ()
 
virtual int GetReferenceCount () const
 
 itkCloneMacro (Self)
 
void Print (std::ostream &os, Indent indent=0) const
 

Static Public Member Functions

static Pointer New ()
 
- Static Public Member Functions inherited from itk::Object
static bool GetGlobalWarningDisplay ()
 
static void GlobalWarningDisplayOff ()
 
static void GlobalWarningDisplayOn ()
 
static Pointer New ()
 
static void SetGlobalWarningDisplay (bool flag)
 
- Static Public Member Functions inherited from itk::LightObject
static void BreakOnError ()
 
static Pointer New ()
 

Protected Member Functions

virtual void AdvanceOneStep ()
 
 GradientDescentOptimizerv4Template ()
 
void ModifyGradientByLearningRateOverSubRange (const IndexRangeType &subrange) override
 
void ModifyGradientByScalesOverSubRange (const IndexRangeType &subrange) override
 
void PrintSelf (std::ostream &os, Indent indent) const override
 
 ~GradientDescentOptimizerv4Template () override=default
 
- Protected Member Functions inherited from itk::GradientDescentOptimizerBasev4Template< TInternalComputationValueType >
 GradientDescentOptimizerBasev4Template ()
 
 ~GradientDescentOptimizerBasev4Template () override=default
 
- Protected Member Functions inherited from itk::ObjectToObjectOptimizerBaseTemplate< TInternalComputationValueType >
void PrintSelf (std::ostream &os, Indent indent) const override
 
 ObjectToObjectOptimizerBaseTemplate ()
 
 ~ObjectToObjectOptimizerBaseTemplate () override
 
- Protected Member Functions inherited from itk::Object
 Object ()
 
bool PrintObservers (std::ostream &os, Indent indent) const
 
virtual void SetTimeStamp (const TimeStamp &time)
 
 ~Object () override
 
- Protected Member Functions inherited from itk::LightObject
virtual LightObject::Pointer InternalClone () const
 
 LightObject ()
 
virtual void PrintHeader (std::ostream &os, Indent indent) const
 
virtual void PrintTrailer (std::ostream &os, Indent indent) const
 
virtual ~LightObject ()
 

Protected Attributes

ParametersType m_BestParameters
 
TInternalComputationValueType m_ConvergenceValue
 
MeasureType m_CurrentBestValue
 
TInternalComputationValueType m_LearningRate
 
TInternalComputationValueType m_MinimumConvergenceValue
 
DerivativeType m_PreviousGradient
 
bool m_ReturnBestParametersAndValue { false }
 
- Protected Attributes inherited from itk::GradientDescentOptimizerBasev4Template< TInternalComputationValueType >
ConvergenceMonitoringType::Pointer m_ConvergenceMonitoring
 
SizeValueType m_ConvergenceWindowSize
 
bool m_DoEstimateLearningRateAtEachIteration
 
bool m_DoEstimateLearningRateOnce
 
DerivativeType m_Gradient
 
TInternalComputationValueType m_MaximumStepSizeInPhysicalUnits
 
DomainThreader< ThreadedIndexedContainerPartitioner, Self >::Pointer m_ModifyGradientByLearningRateThreader
 
DomainThreader< ThreadedIndexedContainerPartitioner, Self >::Pointer m_ModifyGradientByScalesThreader
 
bool m_Stop {false}
 
StopConditionType m_StopCondition
 
StopConditionDescriptionType m_StopConditionDescription
 
bool m_UseConvergenceMonitoring
 
- Protected Attributes inherited from itk::ObjectToObjectOptimizerBaseTemplate< TInternalComputationValueType >
SizeValueType m_CurrentIteration
 
MeasureType m_CurrentMetricValue
 
bool m_DoEstimateScales
 
MetricTypePointer m_Metric
 
SizeValueType m_NumberOfIterations
 
ThreadIdType m_NumberOfWorkUnits
 
ScalesType m_Scales
 
bool m_ScalesAreIdentity
 
ScalesEstimatorType::Pointer m_ScalesEstimator
 
ScalesType m_Weights
 
bool m_WeightsAreIdentity
 
- Protected Attributes inherited from itk::LightObject
std::atomic< int > m_ReferenceCount
 

Member Typedef Documentation

template<typename TInternalComputationValueType>
using itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::ConstPointer = SmartPointer< const Self >

Definition at line 87 of file itkGradientDescentOptimizerv4.h.

template<typename TInternalComputationValueType>
using itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::DerivativeType = typename Superclass::DerivativeType

Derivative type

Definition at line 100 of file itkGradientDescentOptimizerv4.h.

template<typename TInternalComputationValueType>
using itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::IndexRangeType = typename Superclass::IndexRangeType

Definition at line 104 of file itkGradientDescentOptimizerv4.h.

template<typename TInternalComputationValueType>
using itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::InternalComputationValueType = TInternalComputationValueType

It should be possible to derive the internal computation type from the class object.

Definition at line 97 of file itkGradientDescentOptimizerv4.h.

template<typename TInternalComputationValueType>
using itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::MeasureType = typename Superclass::MeasureType

Metric type over which this class is templated

Definition at line 103 of file itkGradientDescentOptimizerv4.h.

template<typename TInternalComputationValueType>
using itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::ParametersType = typename Superclass::ParametersType

Definition at line 106 of file itkGradientDescentOptimizerv4.h.

template<typename TInternalComputationValueType>
using itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::Pointer = SmartPointer< Self >

Definition at line 86 of file itkGradientDescentOptimizerv4.h.

template<typename TInternalComputationValueType>
using itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::ScalesType = typename Superclass::ScalesType

Definition at line 105 of file itkGradientDescentOptimizerv4.h.

template<typename TInternalComputationValueType>
using itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::Self = GradientDescentOptimizerv4Template

Standard class type aliases.

Definition at line 84 of file itkGradientDescentOptimizerv4.h.

template<typename TInternalComputationValueType>
using itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::StopConditionType = typename Superclass::StopConditionType

Definition at line 107 of file itkGradientDescentOptimizerv4.h.

template<typename TInternalComputationValueType>
using itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::Superclass = GradientDescentOptimizerBasev4Template<TInternalComputationValueType>

Definition at line 85 of file itkGradientDescentOptimizerv4.h.

Constructor & Destructor Documentation

template<typename TInternalComputationValueType>
itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::GradientDescentOptimizerv4Template ( )
protected

Default constructor

template<typename TInternalComputationValueType>
itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::~GradientDescentOptimizerv4Template ( )
override protected default

Destructor

Member Function Documentation

template<typename TInternalComputationValueType>
virtual void itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::AdvanceOneStep ( )
protected virtual
template<typename TInternalComputationValueType>
virtual ::itk::LightObject::Pointer itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::CreateAnother ( ) const
virtual

Create an object from an instance, potentially deferring to a factory. This method allows you to create an instance of an object that is exactly the same type as the referring object. This is useful in cases where an object has been cast back to a base class.

Reimplemented from itk::Object.

Reimplemented in itk::QuasiNewtonOptimizerv4Template< TInternalComputationValueType >, itk::MultiGradientOptimizerv4Template< TInternalComputationValueType >, and itk::RegularStepGradientDescentOptimizerv4< TInternalComputationValueType >.

template<typename TInternalComputationValueType>
virtual void itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::DoEstimateLearningRateAtEachIterationOff ( )
virtual

Option to use ScalesEstimator for learning rate estimation at each iteration. The estimation overrides the learning rate set by SetLearningRate(). Default is false.

See Also
SetDoEstimateLearningRateOnce()
SetScalesEstimator()
template<typename TInternalComputationValueType>
virtual void itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::DoEstimateLearningRateAtEachIterationOn ( )
virtual

Option to use ScalesEstimator for learning rate estimation at each iteration. The estimation overrides the learning rate set by SetLearningRate(). Default is false.

See Also
SetDoEstimateLearningRateOnce()
SetScalesEstimator()
template<typename TInternalComputationValueType>
virtual void itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::DoEstimateLearningRateOnceOff ( )
virtual

Option to use ScalesEstimator for learning rate estimation only once, during the first iteration. The estimation overrides the learning rate set by SetLearningRate(). Default is true.

See Also
SetDoEstimateLearningRateAtEachIteration()
SetScalesEstimator()
template<typename TInternalComputationValueType>
virtual void itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::DoEstimateLearningRateOnceOn ( )
virtual

Option to use ScalesEstimator for learning rate estimation only once, during the first iteration. The estimation overrides the learning rate set by SetLearningRate(). Default is true.

See Also
SetDoEstimateLearningRateAtEachIteration()
SetScalesEstimator()
template<typename TInternalComputationValueType>
virtual void itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::EstimateLearningRate ( )
virtual

Estimate the learning rate based on the current gradient.

Reimplemented in itk::RegularStepGradientDescentOptimizerv4< TInternalComputationValueType >.

template<typename TInternalComputationValueType>
virtual const TInternalComputationValueType& itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::GetConvergenceValue ( ) const
virtual

Get current convergence value. WindowConvergenceMonitoringFunction always returns output convergence value in 'TInternalComputationValueType' precision.

template<typename TInternalComputationValueType>
virtual const bool& itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::GetDoEstimateLearningRateAtEachIteration ( ) const
virtual

Option to use ScalesEstimator for learning rate estimation at each iteration. The estimation overrides the learning rate set by SetLearningRate(). Default is false.

See Also
SetDoEstimateLearningRateOnce()
SetScalesEstimator()
template<typename TInternalComputationValueType>
virtual const bool& itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::GetDoEstimateLearningRateOnce ( ) const
virtual

Option to use ScalesEstimator for learning rate estimation only once, during the first iteration. The estimation overrides the learning rate set by SetLearningRate(). Default is true.

See Also
SetDoEstimateLearningRateAtEachIteration()
SetScalesEstimator()
template<typename TInternalComputationValueType>
virtual const TInternalComputationValueType& itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::GetLearningRate ( ) const
virtual

Set/Get the learning rate to apply. It is overridden by automatic learning rate estimation if enabled. See main documentation.

template<typename TInternalComputationValueType>
virtual const TInternalComputationValueType& itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::GetMaximumStepSizeInPhysicalUnits ( ) const
virtual

Set/Get the maximum step size, in physical space units.

Only relevant when m_ScalesEstimator is set by the user and automatic learning rate estimation is enabled. See main documentation.
template<typename TInternalComputationValueType>
virtual const char* itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::GetNameOfClass ( ) const
virtual
template<typename TInternalComputationValueType>
virtual const bool& itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::GetReturnBestParametersAndValue ( ) const
virtual

Flag. Set to have the optimizer track and return the best metric value and corresponding best parameters that were calculated during the optimization. This captures the best solution when the optimizer oversteps or oscillates near the end of an optimization. Results are stored in m_CurrentMetricValue and in the assigned metric's parameters, retrievable via optimizer->GetCurrentPosition(). This option requires additional memory to store the best parameters, which can be large when working with high-dimensional transforms such as DisplacementFieldTransform.
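The bookkeeping described above can be sketched outside of ITK as follows. This is a minimal stand-in, not the actual implementation; BestValueTracker is a hypothetical helper, and it assumes lower metric values are better.

```cpp
#include <cassert>
#include <limits>
#include <vector>

// Minimal stand-in for the SetReturnBestParametersAndValue(true) bookkeeping:
// after each iteration, remember the best (lowest) metric value seen so far
// and a copy of the parameters that produced it.
struct BestValueTracker
{
  double bestValue = std::numeric_limits<double>::max();
  std::vector<double> bestParameters;

  void Observe(double metricValue, const std::vector<double> & parameters)
  {
    if (metricValue < bestValue)
    {
      bestValue = metricValue;
      bestParameters = parameters; // extra copy: the memory cost noted above
    }
  }
};
```

When optimization stops, the tracked pair would be copied back so that the getters return the best iterate rather than the final one; the copy in Observe() is where the extra memory for high-dimensional transforms comes from.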

template<typename TInternalComputationValueType>
void itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::ModifyGradientByLearningRateOverSubRange ( const IndexRangeType & subrange)
override protected virtual
template<typename TInternalComputationValueType>
void itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::ModifyGradientByScalesOverSubRange ( const IndexRangeType & subrange)
override protected virtual

Modify the gradient by scales and weights over a given index range.

Implements itk::GradientDescentOptimizerBasev4Template< TInternalComputationValueType >.

Reimplemented in itk::RegularStepGradientDescentOptimizerv4< TInternalComputationValueType >.

template<typename TInternalComputationValueType>
static Pointer itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::New ( )
static

New macro for creation of an object through a SmartPointer.

template<typename TInternalComputationValueType>
void itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::PrintSelf ( std::ostream & os, Indent indent ) const
override protected virtual

Methods invoked by Print() to print information about the object including superclasses. Typically not called by the user (use Print() instead) but used in the hierarchical print process to combine the output of several classes.

Reimplemented from itk::GradientDescentOptimizerBasev4Template< TInternalComputationValueType >.

Reimplemented in itk::QuasiNewtonOptimizerv4Template< TInternalComputationValueType >, itk::RegularStepGradientDescentOptimizerv4< TInternalComputationValueType >, and itk::MultiGradientOptimizerv4Template< TInternalComputationValueType >.

template<typename TInternalComputationValueType>
void itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::ResumeOptimization ( )
override virtual
template<typename TInternalComputationValueType>
virtual void itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::ReturnBestParametersAndValueOff ( )
virtual

Flag. Set to have the optimizer track and return the best metric value and corresponding best parameters that were calculated during the optimization. This captures the best solution when the optimizer oversteps or oscillates near the end of an optimization. Results are stored in m_CurrentMetricValue and in the assigned metric's parameters, retrievable via optimizer->GetCurrentPosition(). This option requires additional memory to store the best parameters, which can be large when working with high-dimensional transforms such as DisplacementFieldTransform.

template<typename TInternalComputationValueType>
virtual void itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::ReturnBestParametersAndValueOn ( )
virtual

Flag. Set to have the optimizer track and return the best metric value and corresponding best parameters that were calculated during the optimization. This captures the best solution when the optimizer oversteps or oscillates near the end of an optimization. Results are stored in m_CurrentMetricValue and in the assigned metric's parameters, retrievable via optimizer->GetCurrentPosition(). This option requires additional memory to store the best parameters, which can be large when working with high-dimensional transforms such as DisplacementFieldTransform.

template<typename TInternalComputationValueType>
virtual void itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::SetConvergenceWindowSize ( SizeValueType  _arg)
virtual

Window size for the convergence checker. The convergence checker calculates convergence value by fitting to a window of the energy (metric value) profile.

The default m_ConvergenceWindowSize is set to 50 to pass all tests. It is suggested to use 10 for less stringent convergence checking.

template<typename TInternalComputationValueType>
virtual void itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::SetDoEstimateLearningRateAtEachIteration ( bool  _arg)
virtual

Option to use ScalesEstimator for learning rate estimation at each iteration. The estimation overrides the learning rate set by SetLearningRate(). Default is false.

See Also
SetDoEstimateLearningRateOnce()
SetScalesEstimator()
template<typename TInternalComputationValueType>
virtual void itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::SetDoEstimateLearningRateOnce ( bool  _arg)
virtual

Option to use ScalesEstimator for learning rate estimation only once, during the first iteration. The estimation overrides the learning rate set by SetLearningRate(). Default is true.

See Also
SetDoEstimateLearningRateAtEachIteration()
SetScalesEstimator()
template<typename TInternalComputationValueType>
virtual void itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::SetLearningRate ( TInternalComputationValueType  _arg)
virtual

Set/Get the learning rate to apply. It is overridden by automatic learning rate estimation if enabled. See main documentation.

template<typename TInternalComputationValueType>
virtual void itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::SetMaximumStepSizeInPhysicalUnits ( TInternalComputationValueType  _arg)
virtual

Set/Get the maximum step size, in physical space units.

Only relevant when m_ScalesEstimator is set by the user and automatic learning rate estimation is enabled. See main documentation.
template<typename TInternalComputationValueType>
virtual void itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::SetMinimumConvergenceValue ( TInternalComputationValueType  _arg)
virtual

Minimum convergence value for convergence checking. The convergence checker calculates the convergence value by fitting to a window of the energy profile. When the convergence value falls below this minimum, the optimization is treated as converged.

The default m_MinimumConvergenceValue is set to 1e-8 to pass all tests. It is suggested to use 1e-6 for less stringent convergence checking.
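As a rough illustration of window-based convergence checking, a simplified stand-in (not the actual WindowConvergenceMonitoringFunction, which scales the energy profile before fitting) can fit a least-squares line to the last N metric values and use the magnitude of its slope as the convergence value:

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <deque>

// Simplified convergence value: |least-squares slope| of the metric values
// in the window (window.size() >= 2). The optimization would be treated as
// converged once this falls below the minimum convergence value.
double ConvergenceValue(const std::deque<double> & window)
{
  const std::size_t n = window.size();
  double meanX = 0.0, meanY = 0.0;
  for (std::size_t i = 0; i < n; ++i) { meanX += i; meanY += window[i]; }
  meanX /= n;
  meanY /= n;
  double num = 0.0, den = 0.0;
  for (std::size_t i = 0; i < n; ++i)
  {
    num += (i - meanX) * (window[i] - meanY);
    den += (i - meanX) * (i - meanX);
  }
  return std::fabs(num / den);
}
```

A flat energy profile yields a convergence value near zero, while a steadily decreasing profile yields a large one, which is why a larger window (e.g. the default of 50) gives a more stringent check than a small one.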

template<typename TInternalComputationValueType>
virtual void itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::SetReturnBestParametersAndValue ( bool  _arg)
virtual

Flag. Set to have the optimizer track and return the best metric value and corresponding best parameters that were calculated during the optimization. This captures the best solution when the optimizer oversteps or oscillates near the end of an optimization. Results are stored in m_CurrentMetricValue and in the assigned metric's parameters, retrievable via optimizer->GetCurrentPosition(). This option requires additional memory to store the best parameters, which can be large when working with high-dimensional transforms such as DisplacementFieldTransform.

template<typename TInternalComputationValueType>
void itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::StartOptimization ( bool  doOnlyInitialization = false)
override virtual
template<typename TInternalComputationValueType>
void itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::StopOptimization ( )
override virtual

Member Data Documentation

template<typename TInternalComputationValueType>
ParametersType itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::m_BestParameters
protected

Definition at line 231 of file itkGradientDescentOptimizerv4.h.

template<typename TInternalComputationValueType>
TInternalComputationValueType itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::m_ConvergenceValue
protected

Definition at line 227 of file itkGradientDescentOptimizerv4.h.

template<typename TInternalComputationValueType>
MeasureType itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::m_CurrentBestValue
protected

Store the best value and related parameters.

Definition at line 230 of file itkGradientDescentOptimizerv4.h.

template<typename TInternalComputationValueType>
TInternalComputationValueType itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::m_LearningRate
protected

Definition at line 225 of file itkGradientDescentOptimizerv4.h.

template<typename TInternalComputationValueType>
TInternalComputationValueType itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::m_MinimumConvergenceValue
protected

Definition at line 226 of file itkGradientDescentOptimizerv4.h.

template<typename TInternalComputationValueType>
DerivativeType itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::m_PreviousGradient
protected

Store the previous gradient value at each iteration, so we can detect the changes in gradient direction. This is needed by the regular step gradient descent and Quasi Newton optimizers.

Definition at line 240 of file itkGradientDescentOptimizerv4.h.

template<typename TInternalComputationValueType>
bool itk::GradientDescentOptimizerv4Template< TInternalComputationValueType >::m_ReturnBestParametersAndValue { false }
protected

Definition at line 233 of file itkGradientDescentOptimizerv4.h.


The documentation for this class was generated from the following file:
itkGradientDescentOptimizerv4.h