[Insight-users] Re: itkExpectationMaximizationMixtureModelEstimator

Jisung Kim bahrahm@yahoo.com
Thu, 24 Oct 2002 11:04:47 -0700 (PDT)


Hi Luis,

Your description is correct. 

Users of the EM framework should instantiate as many
"Components" as the number of classes they want to
classify, then plug those components into the EM
estimator along with the initial proportions (an
array of doubles, e.g. 0.19002, 0.7800, ...). The
"estimator" computes new weights over the whole
sample population for each component (class), and
each component updates its distribution parameters
(e.g. mean and covariance for a Gaussian component)
using those weights. The estimator repeats this
process until the changes in the distribution
parameters fall below some small threshold, or the
number of iterations reaches the maximum.

The current EM framework assumes that each
"component" has its own analytical solution for
updating its parameters. This is true for
exponential distribution families such as the
Gaussian.


--- Luis Ibanez <luis.ibanez@kitware.com> wrote:
> 
> Hi digvijay,
> 
> I may be wrong in my interpretation...
> (Jisung, please correct me if this is not
> the right description)
> 
> --
> 
> The MixtureModelEstimator takes a population
> of samples and classifies them into groups
> (classes).  The samples are partitioned by
> assigning each one of them to one specific
> class.
> 
> The array of Proportions indicates the
> percentage of samples belonging to each class.
> 
> Let's say that you are classifying brain
> tissues into
> 
>      1 - white matter
>      2 - gray matter
>      3 - CSF
> 
> You will have three classes, plus one for
> rejection. If your image has N pixels and
> you anticipate that:
> 
>      n1 pixels will belong to white matter
>      n2 pixels will belong to gray matter
>      n3 pixels will belong to CSF
>      n4 pixels are in a rejection class
>               (e.g. bone, skin, background )
> 
> with     N = n1 + n2 + n3 + n4
> 
> This is a priori knowledge based on a typical
> brain.
> 
> In your case, for the Histogram you can
> assume that the gray-level distribution
> of each tissue is a Gaussian. The total
> histogram is then the composition of four
> Gaussians.
> 
> You can load the initial proportions array
> (which is of size = 4) with the values:
> 
>     {  n1/N ,  n2/N , n3/N , n4/N }
> 
> 
> Then, run the estimator. When the estimator
> is done, the samples are classified into
> the four groups and new final proportions
> are computed by the estimator.
> 
> The TrueProportions array (size=4) will contain
> the final percentages of samples (pixels in this
> case) belonging to each class.
> 
> Note that the Estimator lets you select the
> type of distribution for each class. This is
> done with the ComponentType. In the case of
> Gaussian distributions you may use:
> 
>     itk::stat::GaussianMixtureModelComponent< DataSampleType >
> 
> -----
> 
> You may see an example of this class at
> 
>       Insight/Testing/Code/Numerics/Statistics/
>       itkExpectationMaximizationMixtureModelEstimatorTest.cxx
> 
> 
> Please let us know if you have further questions.
> 
> 
> Thanks
> 
> 
>      Luis
> 
> 
> ==================================================
> 
> digvijay singh wrote:
>  >
>  > hi Luis!
>  > I had a look at the
>  > itkExpectationMaximizationMixtureModelEstimator.
>  > Could you please give me some more info on the
>  > use of trueproportions[] and the initialproportions[].
>  > Thanks,
>  > Digvijay
>  >
> 
> 
> 
> 
> _______________________________________________
> Insight-users mailing list
> Insight-users@public.kitware.com
> http://public.kitware.com/mailman/listinfo/insight-users


=====
Jisung Kim
bahrahm@yahoo.com
106 Mason Farm Rd.
129 Radiology Research Lab., CB# 7515
Univ. of North Carolina at Chapel Hill
Chapel Hill, NC 27599-7515
