[Insight-users] Regular Step Gradient Descent optimizer

Kevin H. Hobbs kevin.hobbs.1 at ohiou.edu
Wed May 3 14:48:09 EDT 2006


On Wed, 2006-05-03 at 10:34 -0700, Ming Chao wrote:
> Hi Luis,
>  
> Actually my original question was quite simple. I didn't intend to
> compare A with B to decide which one is better. What concerned me is
> why when I used itkRegularStepGradientDescentOptimizer the metric
> values first decreased and then increased (isn't this abnormal?).
> While using LBFGSBOptimizer the metric values kept decreasing and
> stopped when the optimizer thought it reached the minimum. Unless I missed
> something, I haven't understood why.
>  
> Cheers,
> Ming
>  
> 

I don't think this is so odd. Try making the observer print the step length
of the optimizer, if it exposes one. I think what you're seeing is that the
optimizer finds a minimum and then probes around it. The points around the
minimum will be worse than the minimum itself, of course, so you'll see the
metric increase. The step length will be shrinking, though. The last value
the observer spits out is not necessarily the best one seen. In fact I
think it usually isn't...
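To see why, here is a minimal sketch (plain Python, not ITK) of the regular-step idea: march with a fixed step length along the negative gradient, and halve the step whenever the gradient direction flips, which means you overshot the minimum. The halving factor and stopping tolerance below are illustrative choices, not ITK's actual defaults.

```python
def regular_step_descent(f, grad, x, step=1.0, min_step=1e-4):
    """1-D regular-step gradient descent: fixed step, halved on overshoot."""
    history = []   # (x, f(x), step) at each iteration
    last_sign = 0
    while step > min_step:
        g = grad(x)
        sign = 1 if g > 0 else -1
        if last_sign and sign != last_sign:
            step *= 0.5          # direction flipped: we stepped past the minimum
        last_sign = sign
        x -= sign * step         # move downhill by the current step length
        history.append((x, f(x), step))
    return history

# Quadratic "metric" with its minimum at x = 0.3.
hist = regular_step_descent(lambda x: (x - 0.3) ** 2,
                            lambda x: 2 * (x - 0.3),
                            x=5.0)
values = [v for _, v, _ in hist]

# The metric is not monotone: steps taken around the minimum land on worse
# points, so the value rises again even as the step length keeps shrinking.
assert any(values[i + 1] > values[i] for i in range(len(values) - 1))
```

The same effect is what the observer shows with itkRegularStepGradientDescentOptimizer: the printed metric bounces up near convergence while the step length decays, so you should track the best value seen rather than the last one printed.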

At least I've seen this before...
