[Insight-users] Image similarity
George Iordanescu
giordanescu at cmr.nunet.net
Fri Jul 16 19:43:13 EDT 2004
Hi Luis,
Thank you for the refresher course in Information theory. :)
You are right, MI has nothing to do with a [-1,1] range. For some reason I
was thinking about the metric as a cross-correlation coefficient, which
of course is wrong.
The reason why the bit as the unit of information and the bit as a symbol
of binary encoding are essentially the same thing comes from information
transmission rather than storage. When you transmit a bit over a
channel and read 0 or 1 on the other side, you find out that an event
with probability 0.5 just happened. The quantity of information you get
when an event takes place is given by -log2(p), so you actually get
1 bit of info. When you have a hard drive of 10 GB, it means you can
store/read 10 gigabytes, or 80 gigabits, i.e. 80 billion events that
each have a 0.5 probability of being 1 or 0.
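As a quick standalone sanity check of that arithmetic (plain C++, nothing
ITK-specific; the probabilities are just illustrative):

#include <cmath>
#include <iostream>

int main()
{
  // Information content, in bits, of an event with probability p: -log2(p).
  const double p[3] = { 0.5, 0.25, 0.125 };
  for (unsigned int i = 0; i < 3; ++i)
    {
    const double bits = -std::log( p[i] ) / std::log( 2.0 );  // log2 via natural log
    std::cout << "p = " << p[i] << "  ->  " << bits << " bits" << std::endl;
    }
  return 0;
}
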
Thank you for the new application that computes the entropy.
George
On Fri, 2004-07-16 at 16:38, Luis Ibanez wrote:
> Hi George,
>
>
> A) About the bounds for Mutual Information:
>
> Mutual Information is *not* supposed to
> be in the range [-1:1]!
>
>
> When you consider the pixel values of
> images A and B to be random variables,
> "a" and "b"; and estimate the entropy of
> their distributions you get
>
> H(A) = - Sum( p(a)* log2( p(a) ) )
> H(B) = - Sum( p(b)* log2( p(b) ) )
>
> Note that log2 is the logarithm in base
> two, not the natural logarithm that
> unfortunately is commonly used.
>
> When you use log2(), the units of entropy
> are "bits". It is again unfortunate that
> the usage of "bit" as a unit of information
> measurement has been distorted into
> a symbol for binary encoding,
> or a unit of measurement for raw capacity
> of memory storage (along with its derivatives:
> the byte, KiloByte, MegaByte...).
>
> A digital image whose pixels are encoded
> with M bits can have 2^M different grayscale
> values per pixel, and therefore its entropy can
> go up to the theoretical maximum of log2(2^M),
> which, not coincidentally, is equal to M.
>
> In other words, if you compute the entropy of
> an image with PixelType unsigned char, whose
> pixels have grayscale values following a uniform
> distribution, the maximum value that you can get
> is "8", and if you want to be formal, you should
> mention the units and say:
>
> The Entropy of this image is: "8 bits"
> In practice, of course, you get lower values.
> For example, the Entropy of the well-known Lena
> image (the cropped version that is politically
> correct) is
>
> Lena Entropy = 7.44 bits
>
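A minimal sketch of this entropy computation in plain C++, separate from the
ITK example mentioned further down; the histogram is taken as a raw array of
bin counts, and the function name is illustrative:

#include <cmath>
#include <vector>

// Entropy, in bits, of the distribution estimated from histogram bin counts:
// H = -Sum( p(a) * log2( p(a) ) ).  Empty bins are skipped since p*log(p) -> 0.
double EntropyInBits( const std::vector<unsigned long> & binCounts )
{
  double total = 0.0;
  for (unsigned int i = 0; i < binCounts.size(); ++i)
    {
    total += static_cast<double>( binCounts[i] );
    }

  double entropy = 0.0;
  for (unsigned int i = 0; i < binCounts.size(); ++i)
    {
    if (binCounts[i] > 0)
      {
      const double p = binCounts[i] / total;
      entropy -= p * std::log( p ) / std::log( 2.0 );  // log2(p) via natural log
      }
    }
  return entropy;
}

For a 256-bin histogram of an 8-bit image this can reach at most 8 bits
(uniform distribution); with fewer bins the maximum drops to log2 of the
number of bins.
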
>
> Now if you consider the mutual information measure,
> you have the following situation:
>
>
> Mutual Information = H(A) + H(B) - H(A,B)
>
> MI(A,B) = H(A) + H(B) - H(A,B)
>
> In general, both H(A) and H(B) are bounded between
> [0:M] where "M" is the number of bits used for
> encoding their pixels.
>
> H(A,B) is in theory bounded in [0:2M], since the
> joint distribution has at most 2^M x 2^M outcomes.
>
>
> Note that if you use histograms with *fewer* bins
> than the actual digital encoding of the image,
> then your estimate of the Entropy is bounded by
> log2 of the number of bins in your histogram!
>
> For example if you use a histogram with 20 bins
> instead of 256 in order to estimate Lena's Entropy
> you will not get 7.44 bits, but only
>
> 3.99 bits
>
> This reflects the fact that by quantizing the
> grayscale values into larger bins you lose information
> from the original image.
>
>
>
> For the particular case of Self-similarity that you
> are considering, the entropies H(A) and H(B) are
> expected to be pretty much the same. Their differences
> arise only from interpolation errors and from the
> possible effect of one image having corners outside
> the extent of the other (e.g. if the image is rotated).
>
> So, in general Mutual Information will give you
>
> MI(A,T(A)) = H(A) + H(T(A)) - H(A,T(A))
>
> Where T(A) is the Transformed version of A. E.g. under
> a translation, or rotation, or affine transform.
>
> If T = Identity and the interpolator is not approximating,
> your measure of Mutual Information becomes
>
> MI(A,A) = 2 H(A) - H(A,A)
>
> and the joint entropy H(A,A) happens to be equal to
> the entropy of the single image H(A), therefore the
> expected value of Mutual Information is equal to the
> image Entropy (of course measured in bits).
>
> MI(A,A) = H(A) bits
>
> That means that if you evaluate the Mutual Information
> measure between Lena and itself, you should get
>
> 7.44 bits
>
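A minimal sketch of evaluating MI(A,B) = H(A) + H(B) - H(A,B) from a joint
histogram, again in plain C++ rather than through the ITK metric classes; the
marginals are obtained by summing the joint histogram along each axis, and all
names here are illustrative:

#include <cmath>
#include <vector>

// joint[a][b] = number of pixel pairs whose intensities fall into bin "a"
// for image A and bin "b" for image B (both images traversed in the same order).
double MutualInformationInBits( const std::vector< std::vector<unsigned long> > & joint )
{
  const unsigned int nA = joint.size();
  const unsigned int nB = joint[0].size();
  const double invLog2 = 1.0 / std::log( 2.0 );

  std::vector<double> pA( nA, 0.0 );
  std::vector<double> pB( nB, 0.0 );
  double total = 0.0;

  for (unsigned int a = 0; a < nA; ++a)
    {
    for (unsigned int b = 0; b < nB; ++b)
      {
      pA[a] += joint[a][b];   // marginal of A
      pB[b] += joint[a][b];   // marginal of B
      total += joint[a][b];
      }
    }

  double HA = 0.0, HB = 0.0, HAB = 0.0;
  for (unsigned int a = 0; a < nA; ++a)
    {
    if (pA[a] > 0.0) { const double p = pA[a] / total; HA -= p * std::log( p ) * invLog2; }
    }
  for (unsigned int b = 0; b < nB; ++b)
    {
    if (pB[b] > 0.0) { const double p = pB[b] / total; HB -= p * std::log( p ) * invLog2; }
    }
  for (unsigned int a = 0; a < nA; ++a)
    {
    for (unsigned int b = 0; b < nB; ++b)
      {
      if (joint[a][b] > 0)
        {
        const double p = joint[a][b] / total;
        HAB -= p * std::log( p ) * invLog2;
        }
      }
    }

  return HA + HB - HAB;   // in bits, since every log is divided by log(2)
}

Feeding the same image twice (same traversal order, no interpolation) makes the
joint histogram diagonal, so H(A,B) = H(A) and the returned value is simply
H(A), matching the MI(A,A) = H(A) identity above.
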
>
> Note that the reason why the measure of Mutual Information
> is reported as a negative number in ITK is that it has
> traditionally been used as a cost function for minimization.
>
> However, in principle, Mutual information should be
> reported in the range [0,H(A)], where zero corresponds
> to two totally uncorrelated images and H(A) corresponds
> to perfectly correlated images, in which case H(A)=H(B).
>
>
> To summarize, note that ITK is not using log2() but
> just log(), and note that the measure is reported as
> a negative number.
>
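Assuming the value reported by the ITK metric is indeed just the negated,
natural-log version of the quantity above (as described here), converting it
to bits is a one-liner; the function name is mine:

#include <cmath>

// metricValue: what GetValue() returned (negative, computed with log()).
// Returns the corresponding Mutual Information expressed in bits.
double MetricValueToBits( double metricValue )
{
  return -metricValue / std::log( 2.0 );   // flip the sign and convert nats to bits
}

Under that reading, the self-similarity of -1.37055 quoted further down in
this thread would correspond to roughly 1.37055 / 0.6931, about 1.98 bits.
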
>
> We just added a simple example for computing the Entropy
> of the pixel value distribution of an image to the
> directory:
>
> Insight/Examples/Statistics/ImageEntropy1.cxx
>
> You may find it interesting to play with this example,
> e.g. by trying the effect of changing the number
> of bins in the histogram.
>
> Note that you will have to update your CVS checkout
> in order to get this new file.
>
>
>
> B) About Interpolation:
>
> Cubic BSplines are approximating splines.
> The interpolated values do not pass through
> the nodes.
>
> Linear interpolation should give you values
> that pass through the nodes. In general linear
> interpolation provides a good trade-off between
> computation load and interpolation quality.
>
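One practical way to see the difference is to evaluate an interpolator exactly
at a voxel position and compare the result with the stored pixel value. A rough
sketch, using the same ImageType as the code below and the standard ITK
interpolator classes; the helper function itself is only illustrative:

#include <iostream>
#include "itkImage.h"
#include "itkLinearInterpolateImageFunction.h"
#include "itkBSplineInterpolateImageFunction.h"

typedef itk::Image< signed long, 3 > ImageType;

// Compare interpolated values at a voxel position with the stored pixel value.
void CheckInterpolatorsAtNode( ImageType::Pointer image,
                               const ImageType::IndexType & index )
{
  typedef itk::LinearInterpolateImageFunction< ImageType, double >  LinearType;
  typedef itk::BSplineInterpolateImageFunction< ImageType, double > BSplineType;

  LinearType::Pointer  linear  = LinearType::New();
  BSplineType::Pointer bspline = BSplineType::New();
  linear->SetInputImage( image );
  bspline->SetInputImage( image );

  // Linear interpolation should reproduce the stored value exactly at the node;
  // whether the spline does is precisely the question discussed above.
  std::cout << "stored  : " << image->GetPixel( index )          << std::endl;
  std::cout << "linear  : " << linear->EvaluateAtIndex( index )  << std::endl;
  std::cout << "bspline : " << bspline->EvaluateAtIndex( index ) << std::endl;
}
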
>
>
> Please let us know if you have further questions,
>
>
>
> Thanks,
>
>
>
> Luis
>
>
>
> ---------------------------
> George Iordanescu wrote:
>
> > Hi Luis,
> >
> > Thank you for your reply.
> >
> > Some of the spline interpolators compute a curve that passes near the
> > nodes; others build a curve that passes exactly through the nodes. If the
> > ITK spline interpolators are of the first type then we should indeed
> > avoid them when computing self-similarity. A linear interpolator should
> > be the best in this case since it will provide the same values at the
> > nodes but better approximations between the nodes.
> >
> > My function seems to work: the similarity values increase as the
> > registered image gets closer to the moving image. I can now evaluate
> > the parameters for FEM registration much more easily.
> >
> > I still do not know if self-similarity values of -1.37055 are good.
> > Aren't they supposed to be between -1 and 1?
> >
> > Is there another metric that is more precise?
> >
> > Thank you very much for your help.
> >
> > George
> >
> >
> > On Fri, 2004-07-16 at 09:51, Luis Ibanez wrote:
> >
> >>Hi George,
> >>
> >>1) NearestNeighbor interpolation *does*
> >> take origin and spacing into account.
> >>
> >>2) The choice of the interpolator matters
> >>   in self-similarity because some interpolators
> >>   are approximating. For example, BSplines
> >>   *do not* return the same value as the node
> >>   when you evaluate them at the position of
> >>   the node.
> >>
> >>3) From your message it wasn't clear if you
> >> are still experiencing any problem, or
> >> if your current approach is working for
> >> your application.
> >>
> >>
> >>Please let us know if you still find any
> >>difficulties.
> >>
> >>
> >> Regards,
> >>
> >>
> >> Luis
> >>
> >>
> >>---------------------------
> >>George Iordanescu wrote:
> >>
> >>
> >>>Hi Luis,
> >>>
> >>>Thank you for your reply. I think in general the interpolator should be
> >>>smarter than nearest neighbor, since the two images may have different
> >>>spacings/origins... Also, in the case of self-similarity, nearest
> >>>neighbor, linear interpolation and in general any other interpolation
> >>>should not make any difference, right?
> >>>
> >>>In the meantime I found a possible error in my function
> >>>(SetFixedImageRegion() should use the region of the fixed image). The
> >>>new code (including the interpolator you suggested) is this:
> >>>
> >>>double Compute_3DImages_Similarity(DataManager::ImagePointer image1,
> >>>DataManager::ImagePointer image2)
> >>>{
> >>> /** Type of the metric. */
> >>> //typedef itk::ImageToImageMetric< ImageType, ImageType >
> >>>MetricType;
> >>> typedef itk::MattesMutualInformationImageToImageMetric< ImageType,
> >>>ImageType > MetricType;
> >>> typedef itk::NearestNeighborInterpolateImageFunction< ImageType,
> >>>double > InterpolatorType ;
> >>> //typedef itk::LinearInterpolateImageFunction< ImageType, double >
> >>>InterpolatorType ;
> >>> typedef itk::IdentityTransform< double, 3 > TransformType;
> >>> typedef MetricType::Pointer MetricPointer;
> >>>
> >>> /** Type of the Transform . */
> >>> typedef TransformType::Pointer TransformPointer;
> >>> typedef MetricType::ParametersType ParametersType;
> >>> typedef InterpolatorType::Pointer InterpolatorPointer;
> >>>
> >>> TransformPointer m_Transform = TransformType::New();
> >>> MetricType::Pointer m_Metric = MetricType::New();
> >>> InterpolatorPointer m_Interpolator = InterpolatorType::New();
> >>> m_Metric->SetFixedImage( image1 );
> >>> m_Metric->SetMovingImage( image2 );
> >>> m_Metric->SetTransform( m_Transform );
> >>> m_Metric->SetInterpolator( m_Interpolator );
> >>> const unsigned int dummyspaceDimension =
> >>> m_Metric->GetNumberOfParameters();
> >>> ParametersType dummyPosition(dummyspaceDimension) ;
> >>> for( unsigned int i=0; i<dummyspaceDimension; i++){
> >>> dummyPosition[i] = 0;
> >>> }
> >>> m_Metric->SetFixedImageRegion( image1->GetLargestPossibleRegion() );
> >>>
> >>> m_Metric->Initialize();
> >>>
> >>> double simi = m_Metric->GetValue(dummyPosition);
> >>> return simi;
> >>>}
> >>>
> >>>These are the new results:
> >>>
> >>>Before the registration:
> >>>Dbg: Similarity between fixed image and itself = -1.32591
> >>>Dbg: Similarity between moving image and itself = -1.37055
> >>>Dbg: Similarity between moving and fixed image = -0.561616
> >>>
> >>>Registration evolution:
> >>>Dbg: Rigid registration INITIAL centered versor transform:
> >>>[0, 0, 0, 0, 0, 0]
> >>>
> >>>Dbg: 1 = -0.435078 : [0, 0, 0, 0, 0, 0]
> >>>Dbg: 4 = -0.455245 : [-0.000998133, -0.00459843, -0.00371794,
> >>>-0.0260141, -0.312529, 0.38784]
> >>>Dbg: 8 = -0.522174 : [0.00233522, -0.00242162, 0.000783394, -1.412,
> >>>0.610578, 0.187857]
> >>>Dbg: 13 = -0.554342 : [-4.22208e-05, 0.000137454, 0.0032345, -1.26255,
> >>>-0.195143, -0.307137]
> >>>Dbg: 244 = -0.559978 : [-0.00272003, 0.00371769, 0.00131278, -1.32268,
> >>>-0.105536, -0.0930287]
> >>>Dbg: 426 = -0.562566 : [-0.00261726, 0.011476, 0.00557111, -1.07907,
> >>>-0.185965, 0.139911]
> >>>Dbg: Rigid registration FINAL centered versor transform:
> >>>[-0.00261726, 0.011476, 0.00557111, -1.07907, -0.185965, 0.139911]
> >>>
> >>>Dbg: Rigid registration FINAL affine transform:
> >>>[0.999675, -0.0112013, 0.0229208, 0.0110812, 0.999924, 0.00536195,
> >>>-0.0229791, -0.00510621, 0.999723, -0.904836, -0.624775, 1.12567]
> >>>
> >>>After registration:
> >>>Dbg: Simi moving image - registered image = -0.587385;
> >>>Simi fixed image - registered image = -0.659573
> >>>
> >>>In the lines above, "426 = -0.562566 : [-0.00261726, 0.011476,
> >>>0.00557111, -1.07907, -0.185965, 0.139911]"
> >>>means that at iteration 426 the cost function was -0.562566 and the current
> >>>transform was [-0.00261726, 0.011476, 0.00557111, -1.07907, -0.185965,
> >>>0.139911].
> >>>
> >>>My images (MRI) are 256*256*20, their spacing is [0.273438 0.273438 1],
> >>>and one image is obtained by flipping the other on the X axis. There is
> >>>a 4 to 5 pixel horizontal misalignment between them, which I am able to
> >>>fix using the rigid registration. After this, I was hoping to highlight
> >>>the activation area by FEM registration followed by subtraction and
> >>>thresholding (to keep only, let's say, the negative values).
> >>>
> >>>I am using the RigidRegistrator from the
> >>>LandmarkInitializedMutualInformationRegistration example from ITK.
> >>>
> >>>George
> >>>
> >>>On Tue, 2004-07-13 at 16:44, Luis Ibanez wrote:
> >>>
> >>>
> >>>>Hi George,
> >>>>
> >>>>Please try using the NearestNeighbor interpolator instead of
> >>>>the LinearInterpolator. That way you should be able to get
> >>>>closer to the theoretical values for self-similarity.
> >>>>
> >>>>Note that you will not get the perfect -1, because the Mutual
> >>>>Information Metrics can only produce "estimates" of the joint
> >>>>probability distributions, so even when computing the metric
> >>>>for two identical images, the estimation process will result in
> >>>>values that do not match the theoretical value.
> >>>>
> >>>>
> >>>>Could you be more specific regarding the "rigid registration app"
> >>>>that you tried? There are actually several of them in ITK.
> >>>>
> >>>>
> >>>>Thanks
> >>>>
> >>>>
> >>>> Luis
> >>>>
> >>>>
> >>>>-------------------------
> >>>>George Iordanescu wrote:
> >>>>
> >>>>
> >>>>
> >>>>>Hi Luis,
> >>>>>
> >>>>>Thank you for your quick reply. Below is the function I wrote, and it
> >>>>>seems to work, more or less... I started with two 3D images whose
> >>>>>similarity was -0.51411. Also:
> >>>>>Similarity between fixed image and itself = -1.32591
> >>>>>Similarity between moving image and itself = -1.37055
> >>>>>
> >>>>>After rigid registration :
> >>>>>Similarity between moving image and registered image = -0.782605;
> >>>>>Similarity between fixed image and registered image = -0.651837 ;
> >>>>>
> >>>>>I am a bit surprised by the self-similarity values for the fixed and
> >>>>>moving image. Aren't they supposed to be between -1 and 1?
> >>>>>
> >>>>>Also, the rigid registration app (that uses the same type of metric and
> >>>>>interpolator as my function) reports a metric value of -0.435078 for the
> >>>>>first iteration (transform parameters are [0, 0, 0, 0, 0, 0]). This
> >>>>>value is close to the -0.51411 computed by my function but still a bit
> >>>>>different. I guess there is some variation in the results provided by
> >>>>>the metric.
> >>>>>
> >>>>>I do not know as much as I would like about template programming but I
> >>>>>will try to write a class templated over the Metric type and the
> >>>>>Interpolator type.
> >>>>>
> >>>>>Thank you.
> >>>>>
> >>>>>George
> >>>>>
> >>>>>-------------defined in DataManager class-----------
> >>>>> typedef signed long ImagePixelType;
> >>>>> typedef itk::Image<ImagePixelType,3> ImageType;
> >>>>> typedef ImageType::Pointer ImagePointer ;
> >>>>>----------------------------------------------------
> >>>>>
> >>>>>double Compute_3DImages_Similarity(DataManager::ImagePointer image1,
> >>>>>DataManager::ImagePointer image2)
> >>>>>{
> >>>>>
> >>>>> /** Type of the metric. */
> >>>>> //typedef itk::ImageToImageMetric< ImageType, ImageType >
> >>>>>MetricType;
> >>>>> typedef itk::MattesMutualInformationImageToImageMetric< ImageType,
> >>>>>ImageType > MetricType;
> >>>>> typedef itk::LinearInterpolateImageFunction< ImageType, double >
> >>>>>InterpolatorType ;
> >>>>> typedef itk::IdentityTransform< double, 3 > TransformType;
> >>>>> typedef MetricType::Pointer MetricPointer;
> >>>>>
> >>>>> /** Type of the Transform . */
> >>>>> typedef TransformType::Pointer TransformPointer;
> >>>>> typedef MetricType::ParametersType ParametersType;
> >>>>> typedef InterpolatorType::Pointer InterpolatorPointer;
> >>>>>
> >>>>> TransformPointer m_Transform = TransformType::New();
> >>>>> MetricType::Pointer m_Metric = MetricType::New();
> >>>>> InterpolatorPointer m_Interpolator = InterpolatorType::New();
> >>>>> m_Metric->SetMovingImage( image1 );
> >>>>> m_Metric->SetFixedImage( image2 );
> >>>>> m_Metric->SetTransform( m_Transform );
> >>>>> m_Metric->SetInterpolator( m_Interpolator );
> >>>>> m_Metric->SetFixedImageRegion( image1->GetLargestPossibleRegion() );
> >>>>>
> >>>>> const unsigned int dummyspaceDimension =
> >>>>> m_Metric->GetNumberOfParameters();
> >>>>> ParametersType dummyPosition(dummyspaceDimension) ;
> >>>>> for( unsigned int i=0; i<dummyspaceDimension; i++){
> >>>>> dummyPosition[i] = 0;
> >>>>> }
> >>>>>
> >>>>> m_Metric->Initialize();
> >>>>>
> >>>>> double simi = m_Metric->GetValue(dummyPosition);
> >>>>> return simi;
> >>>>>}
> >>>>>
> >>>>>On Mon, 2004-07-12 at 19:17, Luis Ibanez wrote:
> >>>>>
> >>>>>
> >>>>>
> >>>>>>>Hi George,
> >>>>>>>
> >>>>>>>You can use any of the current ImageToImage metrics,
> >>>>>>>
> >>>>>>>http://www.itk.org/Insight/Doxygen/html/group__RegistrationMetrics.html
> >>>>>>>
> >>>>>>>and simply set
> >>>>>>>
> >>>>>>> Interpolator = NearestNeighborhoodInterpolator
> >>>>>>> Transform = IdentityTransform
> >>>>>>>
> >>>>>>>If you find it annoying to have to do this every time,
> >>>>>>>you could write a class templated over the Metric
> >>>>>>>type that will do these initializations for you.
> >>>>>>>
> >>>>>>>This new class will have a GetValue() method without
> >>>>>>>parameters that will simply invoke the GetValue() method
> >>>>>>>of the real metric with a dummy array of parameters.
> >>>>>>>You can use a dummy array because the IdentityTransform
> >>>>>>>does not require parameters.
> >>>>>>>
> >>>>>>>In this way you can easily switch among any of the
> >>>>>>>twelve or so image metrics in ITK and don't have to
> >>>>>>>worry about the extra components required for integration
> >>>>>>>with the registration framework.
> >>>>>>>
> >>>>>>>
> >>>>>>>You may want to contribute such a class to the toolkit
> >>>>>>>since it may prove to be useful to other ITK users.
> >>>>>>>
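A rough sketch of the wrapper class described above, templated over the metric
type; the class name and its methods are illustrative, not part of ITK, and it
assumes the fixed and moving images share the same type, as in the code below:

#include "itkIdentityTransform.h"
#include "itkNearestNeighborInterpolateImageFunction.h"

template < typename TMetric >
class ImageSimilarityCalculator
{
public:
  typedef typename TMetric::FixedImageType                          ImageType;
  typedef itk::IdentityTransform< double,
                                  ImageType::ImageDimension >       TransformType;
  typedef itk::NearestNeighborInterpolateImageFunction< ImageType,
                                                         double >   InterpolatorType;

  void SetImage1( const ImageType * image ) { m_Image1 = image; }
  void SetImage2( const ImageType * image ) { m_Image2 = image; }

  // Parameterless GetValue(): wires in the IdentityTransform and a
  // nearest-neighbor interpolator, then calls the real metric with a
  // dummy (ignored) parameter array.
  double GetValue()
  {
    typename TMetric::Pointer metric = TMetric::New();

    metric->SetFixedImage( m_Image1 );
    metric->SetMovingImage( m_Image2 );
    metric->SetTransform( TransformType::New() );
    metric->SetInterpolator( InterpolatorType::New() );
    metric->SetFixedImageRegion( m_Image1->GetLargestPossibleRegion() );
    metric->Initialize();

    typename TMetric::ParametersType dummy( metric->GetNumberOfParameters() );
    dummy.Fill( 0.0 );

    return metric->GetValue( dummy );
  }

private:
  typename ImageType::ConstPointer m_Image1;
  typename ImageType::ConstPointer m_Image2;
};

Instantiated, for example, as
ImageSimilarityCalculator< itk::MattesMutualInformationImageToImageMetric< ImageType, ImageType > >,
it would reduce the Compute_3DImages_Similarity function below to a couple of calls.
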
> >>>>>>>
> >>>>>>>Please let us know if you need any assistance
> >>>>>>>writing this new class.
> >>>>>>>
> >>>>>>>
> >>>>>>> Thanks
> >>>>>>>
> >>>>>>>
> >>>>>>> Luis
> >>>>>>>
> >>>>>>>
> >>>>>>>
> >>>>>>>--------------------------
> >>>>>>>George Iordanescu wrote:
> >>>>>>>
> >>>>>>>
> >>>>>>>
> >>>>>>>
> >>>>>>>>Hello everybody,
> >>>>>>>>
> >>>>>>>>I would like to compute a number that describes the similarity degree
> >>>>>>>>between two images. I am thinking about a simple class that should have
> >>>>>>>>two methods SetImage1/2 and then ComputeSimilarity().
> >>>>>>>>
> >>>>>>>>I was hoping a child of ImageToImageMetric would do, but it seems to
> >>>>>>>>need more inputs than just two images. Apparently, after setting the
> >>>>>>>>images (and the interpolator and the (identity) transform), I cannot
> >>>>>>>>call GetValue without passing some parameters that make little sense in
> >>>>>>>>the context of just two images. Is there any other class (filter) that
> >>>>>>>>will compute the similarity between two images? Is there any example
> >>>>>>>>of how to use an ImageToImageMetric for such a purpose?
> >>>>>>>>
> >>>>>>>>Such a class could be very useful when trying to find the best set of
> >>>>>>>>registration parameters. One could register an image using multiple sets
> >>>>>>>>of parameters and then select the best set by looking at the
> >>>>>>>>similarity between the registered image and the fixed image.
> >>>>>>>>
> >>>>>>>>Any suggestion will be highly appreciated.
> >>>>>>>>
> >>>>>>>>Thank you.
> >>>>>>>>
> >>>>>>>>George
> >>>>>>>>
> >>>>>>>
> >>>>>>>
> >>>>>>>
> >>>>>
> >>>>>
> >>>>
> >>>>
> >>>>
> >>>
> >>>
> >>
> >>
> >>
> >>
> >
>
>
>
> _______________________________________________
> Insight-users mailing list
> Insight-users at itk.org
> http://www.itk.org/mailman/listinfo/insight-users
>