[Insight-users] Narrow band to image registration

Luis Ibanez luis.ibanez at kitware.com
Thu May 27 18:05:58 EDT 2004



Hi Eduard,


Why is it that "being published"
is a drawback for an algorithm?     :-)


--

If you want to try BSplines, probably the best
thing for you to do is to get familiar with the
example:


   Insight/Examples/Registration/
            DeformableRegistration4.cxx




It has a companion example

   Insight/Examples/Registration/
                 BSplineWarping1.cxx


that you can actually use for (at least) two
different purposes:

1) Generate synthetic deformed images
    (e.g. for validation)

2) Manually attempt to deform one of your images
    just to see if a BSpline has the potential for
    correcting the real deformation.
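

For purpose 2), the core of the operation is just an
itk::ResampleImageFilter driven by a BSpline transform whose
coefficients you fill in by hand. A rough, untested sketch follows;
it is written against the itk::BSplineTransform interface, whereas
the example uses itk::BSplineDeformableTransform (where the grid is
configured with SetGridRegion(), SetGridOrigin() and SetGridSpacing()
instead), and the mesh size and the coefficient value are arbitrary
placeholders:

// Rough sketch: synthetically deform an image with a hand-filled
// BSpline transform.  Mesh size and coefficient value are placeholders.
#include "itkImage.h"
#include "itkImageFileReader.h"
#include "itkImageFileWriter.h"
#include "itkBSplineTransform.h"
#include "itkResampleImageFilter.h"
#include "itkLinearInterpolateImageFunction.h"

int main( int argc, char * argv[] )
{
  const unsigned int Dimension = 3;
  typedef short                                             PixelType;
  typedef itk::Image< PixelType, Dimension >                ImageType;
  typedef itk::ImageFileReader< ImageType >                 ReaderType;
  typedef itk::ImageFileWriter< ImageType >                 WriterType;
  typedef itk::BSplineTransform< double, Dimension, 3 >     TransformType;
  typedef itk::ResampleImageFilter< ImageType, ImageType >  ResampleType;
  typedef itk::LinearInterpolateImageFunction< ImageType, double >
                                                            InterpolatorType;

  // Read the image to be synthetically deformed.
  ReaderType::Pointer reader = ReaderType::New();
  reader->SetFileName( argv[1] );
  reader->Update();
  ImageType::ConstPointer input = reader->GetOutput();

  // Define the BSpline grid over the physical extent of the image.
  TransformType::Pointer transform = TransformType::New();
  TransformType::PhysicalDimensionsType physicalExtent;
  TransformType::MeshSizeType           meshSize;
  for ( unsigned int d = 0; d < Dimension; d++ )
    {
    physicalExtent[d] = input->GetSpacing()[d] *
      ( input->GetLargestPossibleRegion().GetSize()[d] - 1 );
    }
  meshSize.Fill( 5 );   // placeholder: 5x5x5 patches
  transform->SetTransformDomainOrigin( input->GetOrigin() );
  transform->SetTransformDomainDirection( input->GetDirection() );
  transform->SetTransformDomainPhysicalDimensions( physicalExtent );
  transform->SetTransformDomainMeshSize( meshSize );

  // Hand-pick the deformation: all coefficients zero except one, which
  // produces a smooth, localized displacement (placeholder value).
  TransformType::ParametersType coefficients( transform->GetNumberOfParameters() );
  coefficients.Fill( 0.0 );
  coefficients[ coefficients.size() / 2 ] = 10.0;
  transform->SetParameters( coefficients );

  // Resample the input image through the transform onto its own grid.
  ResampleType::Pointer resampler = ResampleType::New();
  resampler->SetInput( input );
  resampler->SetTransform( transform );
  resampler->SetInterpolator( InterpolatorType::New() );
  resampler->SetSize( input->GetLargestPossibleRegion().GetSize() );
  resampler->SetOutputOrigin( input->GetOrigin() );
  resampler->SetOutputSpacing( input->GetSpacing() );
  resampler->SetOutputDirection( input->GetDirection() );
  resampler->SetDefaultPixelValue( 0 );

  // Write the deformed image; compare it against the original to see
  // what the BSpline warp does.
  WriterType::Pointer writer = WriterType::New();
  writer->SetInput( resampler->GetOutput() );
  writer->SetFileName( argv[2] );
  writer->Update();

  return EXIT_SUCCESS;
}

Once you can produce a deformation you like by hand, the same kind of
transform plugged into DeformableRegistration4.cxx is what the
optimizer will be adjusting automatically.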



Note that Lydia just polished these examples
last week, so you may have to update your CVS
checkout.



Please let us know if you find any problem
using these examples,


   Thanks


     Luis



--------------------------
Eduard Schreibmann wrote:

> Hi Luis,
> 
> This approach is nice in general, but it has one major drawback: it was
> already published.
> 
> I'm looking for a more automated approach, where I do not have to save
> landmarks. That's why I was looking at BSplines. I mean, even if some user has
> to delineate one organ for the narrow band, this is OK, as delineation is
> needed anyway at later stages of the general treatment procedure.
> 
> I tried MRI/CT direct registration with BSplines and FEM, without great
> results. I suppose it is because the images are fuzzy. I was thinking that
> the narrow band would help.
> 
> What other non-landmark registration options are available as alternatives?
> 
> Thank you in advance
> Edi
>  
> 
> Quoting Luis Ibanez <luis.ibanez at kitware.com>:
> 
> 
>>Hi Eduard,
>>
>>
>>In order to use the KernelSpline transform
>>you need to provide two sets of landmarks.
>>
>>One set will be associated with the undeformed
>>image while the other set is associated with
>>the deformed image.
>>
>>Currently this transform does not provide
>>Jacobians, so you will have to use it with an
>>optimizer that does not require derivatives of
>>the cost function. Probably the best option is
>>the OnePlusOne evolutionary algorithm.
>>
>>
>>You may find it useful to take a look at the
>>morphing example:
>>
>>
>>    Insight/Examples/Registration/
>>                 LandmarkWarping2.cxx
>>
>>that will give you some guidance on how to
>>set up a KernelTransform.
>>
>>
>>
>>    Regards,
>>
>>
>>       Luis
>>
>>
>>
>>--------------------------
>>Eduard Schreibmann wrote:
>>
>>
>>>The translation transform works fine, and it can now find the correct
>>>displacement. It seems that the affine transform also works fine, but I
>>>would like to test the kernel splines as well, as you suggested. The
>>>deformation is quite high because the image is taken with an inserted
>>>probe that deforms all the organs around it.
>>>
>>>What steps should I take to connect a kernel spline transform, and what
>>>is the scenario?
>>>
>>>Thank you in advance,
>>>Edi
>>>
>>>
>>>Quoting Luis Ibanez <luis.ibanez at kitware.com>:
>>>
>>>
>>>
>>>>Hi Eduard,
>>>>
>>>>
>>>>1) We won't be able to help you with the code
>>>>   that doesn't work... unless you tell us
>>>>   what happens when you run it:
>>>>
>>>>   - does it crash?
>>>>   - does it abort?
>>>>   - does it run indefinitely?
>>>>
>>>>   A bit more information may be useful  :-)
>>>>
>>>>
>>>>2)  The Kernel Splines simply use the point
>>>>    pairs in order to define vectors, then
>>>>    interpolate those vectors in space.
>>>>    You can think of it just like the BSpline
>>>>    deformable transform: a mechanism for
>>>>    representing a deformation field.
>>>>
>>>>    More details on how this transform works
>>>>    are available at:
>>>>http://www.itk.org/Insight/Doxygen/html/classitk_1_1KernelTransform.html
>>>>
>>>>    and of course in the original paper
>>>>
>>>>     Davis, Khotanzad, Flamig, and Harms,
>>>>     IEEE TMI, Vol. 16, No. 3, June 1997
>>>>
>>>>
>>>>
>>>>3)  If your MRI image is highly inhomogeneous, you
>>>>    should first apply the correction method in
>>>>
>>>>        InsightApplications/MRIBiasCorrection
>>>>
>>>>    You will find instructions in the README
>>>>    file in that directory.
>>>>
>>>>
>>>>
>>>>4)  You can choose any image as the fixed
>>>>    or moving image. That shouldn't be a
>>>>    problem. The fundamental fact is that
>>>>    whatever you choose as the fixed image,
>>>>    that's the one you will have to segment
>>>>    in order to produce a PointSet representing
>>>>    the narrow band.
>>>>
>>>>
>>>>5)  For quantifying the deformation field and
>>>>    the quality of the registration, the easiest
>>>>    thing to do is to load all of them in ParaView.
>>>>
>>>>    You can download sources and binaries of
>>>>    ParaView for free at:
>>>>
>>>>          http://www.paraview.org
>>>>
>>>>    By overlapping the deformation field with the
>>>>    fixed and moving images you can visually
>>>>    verify the correctness of the deformation.
>>>>
>>>>    Once the results are visually satisfactory,
>>>>    you could move to other ways of quantifying
>>>>    registration quality.
>>>>
>>>>
>>>>
>>>>
>>>>Regards,
>>>>
>>>>
>>>>
>>>>   Luis
>>>>
>>>>
>>>>
>>>>--------------------------
>>>>Eduard Schreibmann wrote:
>>>>
>>>>
>>>>
>>>>>First of all, thank you for your fast reply and hints. I made the
>>>>>changes as you said; the code seems to be OK, but it does not work in
>>>>>the tests for some reason.
>>>>>
>>>>>There are two theoretical things that are not clear to me:
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>>  I'd strongly suggest you start with an AffineTransform
>>>>>>  and if that proves to be insufficient then try the KernelSplines.
>>>>>
>>>>>
>>>>>The kernel splines are not supposed to use two sets of points, or am I
>>>>>understanding this wrong? If indeed two sets of points are needed, what
>>>>>would be the second set?
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>>b)
>>>>>>
>>>>>>There is more information about this method in a paper that
>>>>>>was submitted to a Journal last year, and you probably will
>>>>>>be able to read it next year...
>>>>>>
>>>>>>Sorry, the InsightJournal wasn't there at the time  :-)
>>>>>>
>>>>>
>>>>>
>>>>>I suppose there isn't any possibility to read the manuscript. I "moved"
>>>>>recently to the registration stuff, so any theoretical info would help
>>>>>a lot. Or, could I have the reference list of the manuscript, or some
>>>>>references to better understand the mathematics behind the code?
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>>----------------
>>>>>>
>>>>>>d)
>>>>>>
>>>>>>Yes, the narrow band will work for fuzzy images, because
>>>>>>*you* will segment them first   :-)
>>>>>
>>>>>
>>>>>The "problem" is that I prefer to have the mask/segmentation on the
>>
>>MRI
>>
>>>>>image and the fuzzy CT image as moving, otherwise we have to have 2 
>>>>>segmentations :) And the MRI image is also strange, lot of high
>>>>
>>>>intensity 
>>>>
>>>>
>>>>>pixels near the rectum falling of almost exponentially to some
>>>>
>>>>background 
>>>>
>>>>
>>>>>like intensities.
>>>>>
>>>>>What is better to have as moving image, a fuzzy image or a 
>>>>>highly "inhomogeneous" (as pixel values) image.
>>>>>
>>>>>How is it best to measure how good this algorithm is?
>>>>>I'm trying to warp the CT image and see how well the registration can
>>>>>deform it back. Is there a better way to "quantify" how good the
>>>>>deformation is, other than plotting the convergence or a checkerboard?
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>>Please let us know if you have further questions.
>>>>>>
>>>>>
>>>>>
>>>>>Thank you very much for your replies, they really help.
>>>>>Edi
>>>>>
>>>>>
>>>>
>>>>
>>>>
>>>>
>>>
>>>
>>
>>
>>
> 
> 
> 




