[Insight-users] Small problem using LBFGSB

Tom Vercauteren tom.vercauteren at m4x.org
Wed Apr 30 05:01:01 EDT 2008


Hi Rupert,

Thanks for the information.

> I've messed around with the LBFGSB optimizer a fair bit.  Usually it's
> pretty reliable, but I've seen it occasionally make a wild jump during
> its line search which sends the parameters completely outside any sort
> of appropriate range.  This happened when I was using a function that
> had an approximate gradient, and what seemed to be happening is that
> the mismatch between the gradient and the function values made the
> line search divide by a very small number, leading to a very large,
> erroneous step.

I do find it quite reliable too. The problem I have seems very close
to what you were experiencing: in a few cases, the line search gets
trapped and then produces a wild step...
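For the record, here is a minimal sketch of that failure mode in C++.
It uses the secant (quadratic) interpolation step that such line
searches apply to the directional derivative; the cubic step you
mention has the same kind of denominator. This is only an
illustration, not the actual netlib code that ITK wraps:

#include <cstdio>

// Secant step used by line searches like the one in L-BFGS-B: fit a
// quadratic to two directional-derivative samples g0 = phi'(a0) and
// g1 = phi'(a1) and jump to its critical point. A sketch of the
// failure mode, not the exact netlib dcstep routine.
double SecantStep(double a0, double g0, double a1, double g1)
{
  // When a noisy or mismatched gradient makes g1 nearly equal to g0,
  // this denominator is tiny and the trial step explodes.
  return a1 + (a1 - a0) * g1 / (g0 - g1);
}

int main()
{
  // Consistent samples of phi(a) = (a - 1)^2, phi'(a) = 2(a - 1):
  // the step lands exactly on the minimizer a = 1.
  std::printf("consistent:   %g\n", SecantStep(0.0, -2.0, 0.5, -1.0));
  // An approximate gradient that barely changes between the samples:
  // the secant model is almost flat and the step is wildly large (1000).
  std::printf("inconsistent: %g\n", SecantStep(0.0, -2.0, 0.5, -1.999));
}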

> The line search makes a cubic polynomial fit to the function, and
> polynomial fits can be sensitive to erroneous data.  I got around the
> problem by making my gradient better behaved.

I understand that, but how did you manage to make your gradient better
behaved? As in many ITK applications, I use an image similarity metric
with linear interpolation for the image itself and nearest-neighbor
interpolation for the image gradient. I tried using both a smoothed
gradient image and a central-difference gradient image to compute the
gradient of my cost function, but I still hit the above problem in
some cases. Maybe I should try using linear interpolation for the
image gradient interpolation as well...
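For concreteness, here is roughly what I have in mind, using standard
ITK classes. The wiring into a particular metric's GetDerivative() is
omitted and would be application-specific:

#include "itkImage.h"
#include "itkCovariantVector.h"
#include "itkGradientRecursiveGaussianImageFilter.h"
#include "itkVectorLinearInterpolateImageFunction.h"

// Precompute a smoothed gradient image and sample it with linear
// interpolation instead of a nearest-neighbor lookup. A sketch of the
// idea above, not existing metric code.
const unsigned int Dimension = 3;
typedef itk::Image< float, Dimension >              ImageType;
typedef itk::CovariantVector< float, Dimension >    GradientPixelType;
typedef itk::Image< GradientPixelType, Dimension >  GradientImageType;
typedef itk::GradientRecursiveGaussianImageFilter<
          ImageType, GradientImageType >            GradientFilterType;
typedef itk::VectorLinearInterpolateImageFunction<
          GradientImageType, double >               GradientInterpolatorType;

GradientInterpolatorType::Pointer
MakeGradientInterpolator( ImageType::Pointer image, double sigma )
{
  GradientFilterType::Pointer gradientFilter = GradientFilterType::New();
  gradientFilter->SetInput( image );
  gradientFilter->SetSigma( sigma );  // smoothing tames the gradient
  gradientFilter->Update();

  GradientInterpolatorType::Pointer interpolator =
    GradientInterpolatorType::New();
  interpolator->SetInputImage( gradientFilter->GetOutput() );
  return interpolator;
}

// Usage inside a metric derivative, at a mapped physical point:
//   GradientInterpolatorType::OutputType g =
//     interpolator->Evaluate( mappedPoint );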

> I also noticed your comments about scaling on the other VNL LBFGS
> optimizer the other day.  I'd be very supportive of having a general
> scaling framework for the ITK optimizers.  As it is now, the scale
> factors have vastly different effects between, say, Powell and
> RegularStepGradientDescent.  I have a big nasty switch statement in my
> code to manage it.

Thanks for the support ;)
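For readers following along, the issue is that the same SetScales()
call on the common itk::Optimizer base class means different things to
different optimizers. A hypothetical illustration (the parameter
layout is just an example for a rigid 3D transform):

#include "itkOptimizer.h"

// Every optimizer accepts the same SetScales() array through the
// common itk::Optimizer base class, but e.g. itk::PowellOptimizer and
// itk::RegularStepGradientDescentOptimizer interpret the values
// differently, so one array cannot simply be reused across them.
void ConfigureScales( itk::Optimizer * optimizer )
{
  // Example layout for a rigid 3D transform: 3 rotations, 3 translations.
  itk::Optimizer::ScalesType scales( 6 );
  scales.Fill( 1.0 );
  scales[3] = scales[4] = scales[5] = 1.0 / 1000.0;
  // The numeric meaning of these entries is optimizer-dependent, which
  // is what forces per-optimizer switch statements in user code today.
  optimizer->SetScales( scales );
}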

> I'd be curious about what you're working on for optimizer projects.
> I've been working on a number of optimizer-related projects in ITK.  I
> have developed some Hessian-based optimizers, and a trust-region
> version of gradient descent which I'm kind of fond of.

Hessian-based optimizers are definitely something I would like to
explore. In particular, I would strongly argue for getting
Gauss-Newton-like optimizers (e.g. Levenberg-Marquardt, Powell's dog
leg, ESM, etc.) to work with least-squares-like image similarity
criteria in ITK (mean squared error, cross-correlation, etc.). These
are not, strictly speaking, Hessian-based optimizers, but they can be
seen as pseudo-Hessian-based and could be developed within a
Hessian-based optimizer API.
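To make the pseudo-Hessian point concrete: for a least-squares
criterion C(p) = ||r(p)||^2, the gradient is 2 J^T r and J^T J plays
the role of the Hessian (the factors of 2 cancel), so the Gauss-Newton
step solves (J^T J) dp = -J^T r. A minimal sketch with VNL, which ITK
already ships; the function name is mine, not an existing API:

#include <vnl/vnl_matrix.h>
#include <vnl/vnl_vector.h>
#include <vnl/vnl_svd.h>

// Gauss-Newton step for C(p) = ||r(p)||^2, given the Jacobian J of the
// residual vector r at the current parameters p.
vnl_vector<double> GaussNewtonStep( const vnl_matrix<double> & J,
                                    const vnl_vector<double> & r )
{
  const vnl_matrix<double> pseudoHessian = J.transpose() * J;
  const vnl_vector<double> halfGradient  = J.transpose() * r;
  // An SVD-based solve tolerates a rank-deficient pseudo-Hessian; a
  // Levenberg-Marquardt variant would add lambda * I before solving.
  vnl_svd<double> svd( pseudoHessian );
  return -svd.solve( halfGradient );
}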

The problem is that I don't have enough time to do all that...

Best,
Tom

