[Insight-users] Gradient of the sum of square differences

Anja Ende anja.ende at googlemail.com
Sat Nov 13 20:57:32 EST 2010


Hello,

I have a simple question about the gradient of the sum of square
differences. The expression for this can be seen here in this paper
(equation 1)

http://books.google.co.uk/books?id=NlXMTARquPkC&pg=PA248&lpg=PA248&dq=gradient+of+sum+of+squared+difference&source=bl&ots=8z5dICJy7D&sig=8ly4QTVMrr_fHa_paOhSt6hbD8A&hl=en&ei=si7fTKPrKsaXhQfEr4yTDQ&sa=X&oi=book_result&ct=result&resnum=3&ved=0CB0Q6AEwAjge#v=onepage&q&f=false

I am having trouble understanding the derivation and was wondering
if someone could help with it.

We are differentiating with respect to the displacement of the
voxels. If S(t) is the transformed source image and T is the target
image:

SSD = sum (S(t) - T)^2

Now we need to compute the gradient with respect to the voxel
displacements. So we have:

grad(SSD) = sum grad((S(t) - T)^2)

          = sum 2(S(t) - T) * grad(S(t) - T)

Now, in the second term, the target voxels do not depend on the
displacement, so grad(T) = 0. That leaves:

grad(SSD) = sum 2(S(t) - T) * grad(S(t))

Is the reasoning correct so far?

Now, the gradient in the last expression is taken with respect to the
displacement of each voxel. How would this be calculated? Surely it
would not just be the spatial gradient of S(t)? That is what the
papers seem to suggest.
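For concreteness, the per-voxel expression derived above can be sketched numerically. This is a minimal NumPy illustration that assumes the spatial-gradient interpretation the papers suggest (i.e. grad(S(t)) approximated by finite differences); the array names and sizes are made up for the example:

```python
import numpy as np

# Toy 2-D "images": S_t stands for the transformed source S(t),
# T for the target. Placeholder data, not from any real registration.
rng = np.random.default_rng(0)
T = rng.random((5, 5))
S_t = rng.random((5, 5))

# SSD = sum over voxels of (S(t) - T)^2
ssd = np.sum((S_t - T) ** 2)

# Per-voxel gradient term: 2 (S(t) - T) * grad(S(t)),
# with grad(S(t)) taken as the spatial gradient of S(t)
# (central differences via np.gradient, one component per axis).
gy, gx = np.gradient(S_t)
spatial_grad = np.stack([gy, gx], axis=-1)        # shape (5, 5, 2)
grad_ssd = 2.0 * (S_t - T)[..., None] * spatial_grad

print(ssd)              # scalar metric value
print(grad_ssd.shape)   # one 2-vector per voxel: (5, 5, 2)
```

In a registration loop each voxel's 2-vector (3-vector in 3-D) would drive the update of that voxel's displacement.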

I would really appreciate it if someone could help me understand this.

Many thanks,

Anja
