[Insight-users] Quaternions and Versors

Luis Ibanez luis.ibanez at kitware.com
Wed Jun 18 16:28:53 EDT 2008


Hi Rupert,


Thanks a lot for the detailed comments.


Indeed, you seem to be right.


The composition of Versors:

 >   VersorType newRotation = currentRotation * gradientRotation;


Should instead have composed the gradient Versor from the left, like:

 >   VersorType newRotation =  gradientRotation * currentRotation;
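
For the record, here is a minimal sketch (not the actual optimizer code) of
this ordering check, assuming ITK's itk::Versor interface (Set(axis, angle),
Transform(), operator*): applying the current rotation first and the gradient
step second matches the left-composed product.

  // Minimal sketch: check that composing the gradient versor from the
  // left reproduces "apply the current rotation first, then the
  // gradient step".
  #include <iostream>
  #include "itkVersor.h"
  #include "itkVector.h"

  int main()
  {
    typedef itk::Versor<double>    VersorType;
    typedef itk::Vector<double, 3> VectorType;

    const double pi = 3.14159265358979323846;

    VectorType zAxis; zAxis[0] = 0.0; zAxis[1] = 0.0; zAxis[2] = 1.0;
    VectorType xAxis; xAxis[0] = 1.0; xAxis[1] = 0.0; xAxis[2] = 0.0;

    VersorType currentRotation;            // e.g. 90 degrees about Z
    currentRotation.Set( zAxis, pi / 2.0 );

    VersorType gradientRotation;           // e.g. 30 degrees about X
    gradientRotation.Set( xAxis, pi / 6.0 );

    VectorType p; p[0] = 1.0; p[1] = 0.0; p[2] = 0.0;

    // Apply the current rotation first, then the gradient rotation.
    VectorType sequential =
      gradientRotation.Transform( currentRotation.Transform( p ) );

    // The single versor reproducing that sequence is the left-composed one.
    VersorType newRotation = gradientRotation * currentRotation;
    VectorType composed    = newRotation.Transform( p );

    std::cout << "sequential: " << sequential << std::endl;
    std::cout << "composed  : " << composed   << std::endl;  // should match
    return 0;
  }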


I'll run an experimental build with this change,
and if it goes well, I'll commit it.




     Thanks


        Luis



---------------------
Rupert Brooks wrote:
> Hi Tom,  Luis,
> 
> This is an interesting conversation.  It led me to spend the morning
> scribbling quaternions in my notebook :-)
> 
> I worked through a lot of the QuaternionsII.pdf, and there's one thing
> left that troubles me.  Versor multiplication (i.e., composition of 3D
> rotations) is non-commutative.  In the notes there is
> q_(n+1)=(p/q_(n))^a * q_(n)
> 
> where q_(n+1) and q_(n) are the new and current versors, and p/q
> is the update step.  This agrees with my intuition.  We have a current
> position in the parameter space, q, and now we are taking an
> additional step.  However, in the code there is
> 
>   //
>   // Composing the currentRotation with the gradientRotation
>   // produces the new Rotation versor
>   //
>   VersorType newRotation = currentRotation * gradientRotation;
> 
> To me, this composition seems to be in the wrong order.  (I did a
> quick check, and the versors compose in the same order as their
> associated rotation matrices do.)  Did I miss something?
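
A self-contained version of that quick check (plain C++, hypothetical helper
names, no ITK dependency): the Hamilton product q2 * q1 reproduces the matrix
product R(q2) * R(q1), so versors do compose in the same order as their
rotation matrices, and swapping the factors gives a different result.

  // Quick check: the Hamilton product q2 * q1 and the matrix product
  // R(q2) * R(q1) describe the same rotation.
  #include <cmath>
  #include <cstdio>

  struct Quat { double w, x, y, z; };
  struct Mat  { double m[3][3]; };

  // Hamilton product a * b (apply b first, then a).
  Quat mul(const Quat& a, const Quat& b)
  {
    return { a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
             a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
             a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
             a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w };
  }

  // Rotation matrix of a unit quaternion (w, x, y, z).
  Mat matrix(const Quat& q)
  {
    Mat R;
    R.m[0][0] = 1 - 2*(q.y*q.y + q.z*q.z);
    R.m[0][1] = 2*(q.x*q.y - q.w*q.z);
    R.m[0][2] = 2*(q.x*q.z + q.w*q.y);
    R.m[1][0] = 2*(q.x*q.y + q.w*q.z);
    R.m[1][1] = 1 - 2*(q.x*q.x + q.z*q.z);
    R.m[1][2] = 2*(q.y*q.z - q.w*q.x);
    R.m[2][0] = 2*(q.x*q.z - q.w*q.y);
    R.m[2][1] = 2*(q.y*q.z + q.w*q.x);
    R.m[2][2] = 1 - 2*(q.x*q.x + q.y*q.y);
    return R;
  }

  Quat fromAxisAngle(double ax, double ay, double az, double angle)
  {
    const double s = std::sin(angle / 2), c = std::cos(angle / 2);
    return { c, s*ax, s*ay, s*az };
  }

  int main()
  {
    const double pi = 3.14159265358979323846;
    const Quat q1 = fromAxisAngle(0, 0, 1, pi/2);   // current rotation
    const Quat q2 = fromAxisAngle(1, 0, 0, pi/6);   // update step

    const Mat lhs = matrix(mul(q2, q1));            // R(q2*q1)
    const Mat r2 = matrix(q2), r1 = matrix(q1);

    for (int i = 0; i < 3; ++i)
      for (int j = 0; j < 3; ++j)
      {
        double rhs = 0.0;                           // (R(q2)*R(q1))[i][j]
        for (int k = 0; k < 3; ++k) rhs += r2.m[i][k] * r1.m[k][j];
        std::printf("%+.4f %+.4f\n", lhs.m[i][j], rhs);  // entries agree
      }
    return 0;
  }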
> 
> 
>>In order to minimize with respect to q, we form the differential
>>according to all the rules in Hamilton's book:
>>
>>  dfq =  df(q,dq) = dq . v . q^(-1)  -  q . v . q^(-1) . dq . q^(-1)
>>
>>So how do I get from this expression to the Jacobian implemented in the
>>itk::VersorTransform?
> 
> 
> I believe you can get there by expressing the fourth component of the
> quaternion as a function of the first three, writing out the matrix
> representation, and taking the derivative in the usual way.  I did
> this once, and I convinced myself that the Jacobian was correct.  This
> considers the function M(q), which generates the rotation matrix from
> the quaternion q.  Then you can write
> 
> Xnew=M(q) x Xold+T
> 
> M(q) can be written in closed form, and the derivative taken in the
> usual way.  However, you could write
> 
> Xnew=M(q) x M(dq) x Xold +T
> 
> and take the derivative with respect to dq.  This would be a truly
> compositional step - and because the derivative with respect to dq
> would be taken around zero, it would be valid everywhere.  (At zero
> rotation, the w component is at a maximum, so its derivative is zero -
> only the x, y, z components will have derivative values.)
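
A small numerical sketch of that derivative (plain C++, hypothetical helper
names): with w = sqrt(1 - x^2 - y^2 - z^2), central differences of M(dq)*X
around dq = 0 give 2*(e_j x X) for the j-th Jacobian column, i.e. only the
x, y, z components contribute at the identity.

  // Differentiate M(dq)*X numerically around dq = 0.  At the identity
  // dw/dx = dw/dy = dw/dz = 0, and the j-th Jacobian column is
  // 2*(e_j x X), matching the small-angle expansion M(dq) ~ I + 2*[dq]_x.
  #include <cmath>
  #include <cstdio>

  // Rotate X by the unit quaternion (w, x, y, z), with w derived from (x, y, z).
  void rotate(double x, double y, double z, const double X[3], double out[3])
  {
    const double w = std::sqrt(1.0 - x*x - y*y - z*z);
    const double R[3][3] = {
      {1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)},
      {2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)},
      {2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)} };
    for (int i = 0; i < 3; ++i)
      out[i] = R[i][0]*X[0] + R[i][1]*X[1] + R[i][2]*X[2];
  }

  int main()
  {
    const double X[3] = {1.0, 2.0, 3.0};   // an arbitrary fixed point
    const double h = 1e-6;                 // finite-difference step

    for (int j = 0; j < 3; ++j)            // derivative w.r.t. dq_x, dq_y, dq_z
    {
      double u[3] = {0.0, 0.0, 0.0};
      double plus[3], minus[3];
      u[j] =  h; rotate(u[0], u[1], u[2], X, plus);
      u[j] = -h; rotate(u[0], u[1], u[2], X, minus);
      std::printf("d(M*X)/dq_%d ~ [% .4f % .4f % .4f]\n", j,
                  (plus[0] - minus[0]) / (2*h),
                  (plus[1] - minus[1]) / (2*h),
                  (plus[2] - minus[2]) / (2*h));
    }

    // Analytic column for j = 0: 2*(e_x x X) = (0, -2*X[2], 2*X[1]); the
    // other columns follow the same cross-product pattern.
    std::printf("analytic  d/dq_0 = [ 0.0000 % .4f % .4f]\n", -2*X[2], 2*X[1]);
    return 0;
  }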
> 
> 
>>>A.3) "Is there any literature on the subject from this century,
>>>      preferably focusing on the given problem of image registration?"
>>>
>>>
>>>     Modern books on Quaternions are actually of very inferior
>>>     quality compared to Hamilton's. They are more focused on
>>>     notation details, and are usually handicapped by the obsessive
>>>     compulsion to describe quaternions as "even harder than
>>>     complex numbers".
> 
> 
> I have not read Hamilton's original work, so I cannot comment, but I
> found Andrew Hanson's "Visualizing Quaternions" to be a helpful book.
> 
> Another way to think about it is that the space of versors / 3D
> rotations is a Lie group.  The derivative at a point is an element of
> its tangent space - that is, it is a logarithm of a rotation.  To move
> along a geodesic of the group, we move along the exponential map
> induced by this tangent.  This is another way to get to the
> exponentiation in the update step.
> 
> q_(n+1)=dq^a . q_(n)
> 
> There is definitely work out there on optimization on Lie groups.  I'm
> no expert, so I'll stop writing here before I embarrass myself.
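
As an illustration of that update rule (plain C++, hypothetical helper names):
dq^a is obtained by scaling the rotation angle of dq by the step length a
(the exponential of a times its logarithm), and the result is composed from
the left, as in q_(n+1) = dq^a * q_n above.

  // Geodesic update q_(n+1) = dq^a * q_n for unit quaternions.
  #include <cmath>
  #include <cstdio>

  struct Quat { double w, x, y, z; };

  Quat fromAxisAngle(double ax, double ay, double az, double angle)
  {
    const double s = std::sin(angle / 2), c = std::cos(angle / 2);
    return { c, s*ax, s*ay, s*az };
  }

  // dq^a for a unit quaternion: same axis, angle scaled by a.
  Quat power(const Quat& q, double a)
  {
    const double angle = 2.0 * std::acos(q.w);
    const double s = std::sqrt(q.x*q.x + q.y*q.y + q.z*q.z);
    if (s < 1e-12) return {1.0, 0.0, 0.0, 0.0};   // identity: nothing to scale
    return fromAxisAngle(q.x/s, q.y/s, q.z/s, a * angle);
  }

  // Hamilton product a * b (apply b first, then a).
  Quat mul(const Quat& a, const Quat& b)
  {
    return { a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
             a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
             a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
             a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w };
  }

  int main()
  {
    const double pi = 3.14159265358979323846;
    Quat q  = fromAxisAngle(0, 0, 1, pi/2);       // current estimate q_n
    Quat dq = fromAxisAngle(1, 0, 0, pi/6);       // full update step
    const double stepLength = 0.25;               // optimizer step length a

    q = mul(power(dq, stepLength), q);            // q_(n+1) = dq^a * q_n
    std::printf("q_(n+1) = (%.4f, %.4f, %.4f, %.4f)\n", q.w, q.x, q.y, q.z);
    return 0;
  }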
> 
> Anyway, coming back to the practical side, if the angles you are
> optimizing over do not approach 90 degrees, you can quite happily use
> Euler angles with additive optimizers.  It's an easy way to get a
> general additive optimizer to work on the problem, without worrying
> about compositional update steps.  However, if your problem does
> approach the region where Euler angles break down - watch out.
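
A short demonstration of that breakdown (plain C++, hypothetical helper names,
Z-Y-X convention assumed): once the middle angle reaches 90 degrees, different
(yaw, roll) pairs give the same rotation matrix, so an additive optimizer loses
a direction to move in.

  // Gimbal lock with a Z-Y-X Euler parameterization: at pitch = 90 degrees
  // only the difference (yaw - roll) matters.
  #include <cmath>
  #include <cstdio>

  void eulerZYX(double yaw, double pitch, double roll, double R[3][3])
  {
    const double cy = std::cos(yaw),   sy = std::sin(yaw);
    const double cp = std::cos(pitch), sp = std::sin(pitch);
    const double cr = std::cos(roll),  sr = std::sin(roll);
    // R = Rz(yaw) * Ry(pitch) * Rx(roll), written out in closed form.
    R[0][0] = cy*cp;  R[0][1] = cy*sp*sr - sy*cr;  R[0][2] = cy*sp*cr + sy*sr;
    R[1][0] = sy*cp;  R[1][1] = sy*sp*sr + cy*cr;  R[1][2] = sy*sp*cr - cy*sr;
    R[2][0] = -sp;    R[2][1] = cp*sr;             R[2][2] = cp*cr;
  }

  int main()
  {
    const double deg = 3.14159265358979323846 / 180.0;
    double A[3][3], B[3][3];

    // Two different (yaw, roll) pairs with pitch = 90 degrees ...
    eulerZYX(10*deg, 90*deg,  0*deg, A);
    eulerZYX(40*deg, 90*deg, 30*deg, B);

    // ... produce the same rotation matrix (gimbal lock).
    for (int i = 0; i < 3; ++i)
      std::printf("% .4f % .4f % .4f   |  % .4f % .4f % .4f\n",
                  A[i][0], A[i][1], A[i][2], B[i][0], B[i][1], B[i][2]);
    return 0;
  }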
> 
> Alternatively, like you already said, if you are using similarity
> transforms, then you can use a quaternion transform with an additive
> optimizer.
> 
> Cheers,
> rupert

