[Insight-users] Gradient of the sum of square differences

Anja Ende anja.ende at googlemail.com
Sun Nov 14 08:28:01 EST 2010


Just one follow-up question:

So, the gradient of the SSD is given by:

D(SSD)/Du(t) = 2*(S(u(t)) - T(t)) * {D(S(u(t)))/Du(t)}

Now, say I want to compute the second term analytically, so that my
optimisation is more accurate. I have resampled my image with an
interpolating spline. I know people have done this, but how does one
take the spline into account when calculating such an analytical
derivative?
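
To make the question concrete, here is the kind of thing I mean, as a
small scipy sketch (the image, grid and warp below are made up; a
constant shift stands in for u(t)). The spline interpolant of S has
analytic partial derivatives, and my question is whether evaluating
them at the warped positions u(t) is the right way to get the
{D(S(u(t)))/Du(t)} term:

import numpy as np
from scipy.interpolate import RectBivariateSpline

# made-up source image S on a regular grid (stand-in for my real volume)
ny, nx = 64, 64
y = np.arange(ny, dtype=float)
x = np.arange(nx, dtype=float)
S = np.random.rand(ny, nx)

# cubic interpolating spline through the samples of S
spline = RectBivariateSpline(y, x, S, kx=3, ky=3, s=0)

# warped sample positions u(t) for every grid point t (here just a constant shift)
ty, tx = np.meshgrid(y, x, indexing="ij")
uy = np.clip(ty + 0.3, 0, ny - 1)
ux = np.clip(tx - 0.7, 0, nx - 1)

# S(u(t)) and the analytic partial derivatives of the spline, evaluated at u(t)
S_warped = spline.ev(uy, ux)
dS_dy = spline.ev(uy, ux, dx=1)   # derivative along the first (row) coordinate
dS_dx = spline.ev(uy, ux, dy=1)   # derivative along the second (column) coordinate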

Thanks,
Anja




On 14 November 2010 10:39, Anja Ende <anja.ende at googlemail.com> wrote:
> Very helpful and clear explanation.
>
> Many thanks!
>
> On 14 November 2010 04:49, David R. Haynor <haynor at u.washington.edu> wrote:
>> hi anja,
>>
>> you've oversimplified a bit, which makes things confusing.
>>
>> assume that fixed coordinate systems underlie S and T; they are presumably euclidean, but don't have to have the
>> same domain or even dimension.  call these systems "S-space" and "T-space".
>>
>> call u(t) the map (displacement is a good term, but confusing unless the domains are the same) that maps a coordinate location t in T-space to the candidate corresponding location in S-space, so that T(t) should be close to S(u(t)).
>>
>> then SSD = sum-over-t (S(u(t)) - T(t))^2.
>>
>> note that SSD is a function of u(), not of t.  as a function of a function, it could be called a functional.  at any rate, it has a derivative (the "Gateaux derivative") with respect to u(), which is (in the usual discretization, where u is just identified with its grid values) also a vector of the same size as u, defined on the T-space grid.  the value of the derivative at a point t (think of it as the partial derivative of SSD with respect to u(t)) involves only one term (the t'th) in the sum.  this derivative is
>>
>> D(SSD)/Du(t) = 2*(S(u(t)) - T(t)) * {D(S(u(t)))/Du(t)},
>>
>> since T(t) doesn't depend on u, and this last term in curly braces is exactly the gradient of S, evaluated at the point u(t) in S-space.
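>>
>> to make that concrete, here is a throwaway 1-D numpy sketch (array names, sizes and images are made up, and the gradient of S is approximated numerically) that evaluates this per-voxel derivative and checks one component against a finite difference of SSD:
>>
>> import numpy as np
>>
>> # made-up 1-D fixed grid, source S, target T, and map u(t) (T-space -> S-space)
>> n = 32
>> t = np.arange(n, dtype=float)
>> S = np.sin(0.3 * t)                      # source image on the S-space grid
>> T = np.sin(0.3 * (t + 0.5))              # target image on the T-space grid
>> u = t + 0.4                              # candidate map u(t)
>>
>> def S_interp(x):
>>     return np.interp(x, t, S)            # S resampled at S-space positions x
>>
>> def ssd(u):
>>     return np.sum((S_interp(u) - T) ** 2)
>>
>> # per-voxel derivative 2*(S(u(t)) - T(t)) * gradient of S at u(t),
>> # with the gradient of S approximated by a central difference of the interpolant
>> h = 1e-5
>> dS_at_u = (S_interp(u + h) - S_interp(u - h)) / (2 * h)
>> dSSD_du = 2.0 * (S_interp(u) - T) * dS_at_u
>>
>> # finite-difference check of the k-th component of the derivative
>> k = 10
>> u_pert = u.copy()
>> u_pert[k] += h
>> print(dSSD_du[k], (ssd(u_pert) - ssd(u)) / h)   # these should agree closely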
>>
>> in many cases (other than demons) the map u() is parameterized (for example, it might be a rigid body rotation): u(t) = u(t, A).  in that case, SSD becomes a function of the parameter(s).  if A is one of those parameters, one is interested in the derivative of SSD with respect to A.  A influences all of the terms in the sum-over-t, so we get a formula like
>>
>> D(SSD)/D(A) = sum-over-t [D(SSD)/Du(t, A) * Du(t, A)/DA ],
>>
>> where the first term in the brackets is given by the expression just above and the other term is just the conventional partial derivative of u(t, A) with respect to A, evaluated at the point (t, A).
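>>
>> as a sketch of that chain rule (again 1-D and made up), take a single translation parameter A, so that u(t, A) = t + A and Du(t, A)/DA = 1 for every t:
>>
>> import numpy as np
>>
>> n = 32
>> t = np.arange(n, dtype=float)
>> S = np.sin(0.3 * t)                          # source image on the S-space grid
>> T = np.sin(0.3 * (t + 0.5))                  # target image on the T-space grid
>>
>> def S_interp(x):
>>     return np.interp(x, t, S)                # S resampled at S-space positions x
>>
>> def ssd(A):
>>     return np.sum((S_interp(t + A) - T) ** 2)
>>
>> A, h = 0.4, 1e-5
>> u = t + A
>>
>> # per-voxel derivative D(SSD)/Du(t), as in the expression above
>> dS_at_u = (S_interp(u + h) - S_interp(u - h)) / (2 * h)
>> dSSD_du = 2.0 * (S_interp(u) - T) * dS_at_u
>>
>> # chain rule: D(SSD)/DA = sum-over-t [ D(SSD)/Du(t) * Du(t,A)/DA ], with Du/DA = 1
>> dSSD_dA = np.sum(dSSD_du * 1.0)
>>
>> print(dSSD_dA, (ssd(A + h) - ssd(A)) / h)    # these should agree closely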
>>
>> HTH.
>>
>> -dh
>> David Haynor
>> University of Washington
>>
>>
>>
>>
>> On Sun, 14 Nov 2010, Anja Ende wrote:
>>
>>> Hello,
>>>
>>> I have a simple question about the gradient of the sum of square
>>> differences. The expression for this can be seen here in this paper
>>> (equation 1)
>>>
>>> http://books.google.co.uk/books?id=NlXMTARquPkC&pg=PA248&lpg=PA248&dq=gradient+of+sum+of+squared+difference&source=bl&ots=8z5dICJy7D&sig=8ly4QTVMrr_fHa_paOhSt6hbD8A&hl=en&ei=si7fTKPrKsaXhQfEr4yTDQ&sa=X&oi=book_result&ct=result&resnum=3&ved=0CB0Q6AEwAjge#v=onepage&q&f=false
>>>
>>> So, I am just having trouble understanding the derivation and I was
>>> wondering if one can help with it.
>>>
>>> So, we are differentiating with respect to the displacements of the
>>> voxels. If S(t) is the transformed source image and T is the target,
>>>
>>> SSD = sum (S(t) - T)^2
>>>
>>> Now we need to compute the gradient with respect to the voxel displacements. So, we have:
>>>
>>> grad(SSD) = grad((S(t) - T)^2)
>>>
>>> = 2(S(t) - T) * grad(S(t) - T)
>>>
>>> Now, in the second term, the target voxels are not displaced, so
>>> grad(T) is 0. So we have:
>>>
>>> = 2(S(t) - T) * grad(S(t))
>>>
>>> Is the reasoning correct so far?
>>>
>>> Now, the gradient in the second term is with respect to the
>>> displacement of each voxel. How would this be calculated? Surely it
>>> would not just be the spatial gradient of S(t)? Yet that is what the
>>> papers seem to suggest.
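>>>
>>> To make my confusion concrete: in a 1-D toy example like the one
>>> below (made-up arrays, linear interpolation for S, and a map that is
>>> not a pure shift), which candidate gives the right per-voxel
>>> derivative?
>>>
>>> import numpy as np
>>>
>>> # made-up 1-D example: which "gradient" enters the per-voxel derivative?
>>> n = 32
>>> t = np.arange(n, dtype=float)
>>> S_samples = np.sin(0.3 * t)                  # original source image samples
>>> T = np.sin(0.3 * (t + 0.5))                  # target image
>>> u = 1.3 * t + 0.4                            # map with du/dt = 1.3 (not a pure shift)
>>>
>>> def S(p):
>>>     return np.interp(p, t, S_samples)        # linearly interpolated source
>>>
>>> h = 1e-5
>>> k = 10
>>>
>>> def term(uk):
>>>     return (S(uk) - T[k]) ** 2               # the k-th term of the SSD
>>>
>>> # direct finite difference of the k-th SSD term with respect to u[k]
>>> d_direct = (term(u[k] + h) - term(u[k] - h)) / (2 * h)
>>>
>>> # candidate 1: 2*(S(u[k]) - T[k]) * spatial gradient of the ORIGINAL S at u[k]
>>> grad_S_at_u = (S(u[k] + h) - S(u[k] - h)) / (2 * h)
>>> d_candidate1 = 2 * (S(u[k]) - T[k]) * grad_S_at_u
>>>
>>> # candidate 2: same, but with the spatial gradient of the WARPED image S(u(t)) at t = k
>>> Su = S(u)
>>> grad_Su_at_t = (Su[k + 1] - Su[k - 1]) / 2.0
>>> d_candidate2 = 2 * (S(u[k]) - T[k]) * grad_Su_at_t
>>>
>>> print(d_direct, d_candidate1, d_candidate2)  # only one candidate should match d_direct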
>>>
>>> Would really appreciate if someone could help me understand this.
>>>
>>> Many thanks,
>>>
>>> Anja
>>
>>
>>
>
>
>
> --
> Cheers,
>
> Anja
>



-- 
Cheers,

Anja

