algorithms.clustering.ggmixture

Module: algorithms.clustering.ggmixture

Inheritance diagram for nipy.algorithms.clustering.ggmixture: Gamma, GGM and GGGM are three independent classes (no inheritance relationships among them); each derives directly from object.

One-dimensional Gamma-Gaussian mixture density classes: given a set of points, the algorithm provides approximate maximum likelihood estimates of the mixture distribution using an EM algorithm.

Author: Bertrand Thirion and Merlin Keller 2005-2008
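
A minimal end-to-end sketch (the synthetic data, seed, and sample sizes below are illustrative assumptions, not taken from nipy):

    import numpy as np
    from nipy.algorithms.clustering.ggmixture import GGGM

    # Synthetic signal: a Gaussian null component plus positive and
    # negative gamma tails.
    rng = np.random.RandomState(0)
    x = np.concatenate([rng.normal(0., 1., 1000),
                        rng.gamma(3., 1., 100),      # positive tail
                        -rng.gamma(3., 1., 100)])    # negative tail

    model = GGGM()
    z = model.estimate(x)           # EM fit; z has shape (nbitem, 3)
    labels = np.argmax(z, axis=1)   # 0: negative gamma, 1: gaussian, 2: positive gamma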

Classes

GGGM

class nipy.algorithms.clustering.ggmixture.GGGM(shape_n=1, scale_n=1, mean=0, var=1, shape_p=1, scale_p=1, mixt=array([0.33333333, 0.33333333, 0.33333333]))[source]

Bases: object

The basic one dimensional Gamma-Gaussian-Gamma Mixture estimation class, where the first gamma has a negative sign, while the second one has a positive sign.

7 parameters are used:

- shape_n: negative gamma shape
- scale_n: negative gamma scale
- mean: gaussian mean
- var: gaussian variance
- shape_p: positive gamma shape
- scale_p: positive gamma scale
- mixt: array of mixture parameters (weights of the negative gamma, the gaussian, and the positive gamma)

__init__(shape_n=1, scale_n=1, mean=0, var=1, shape_p=1, scale_p=1, mixt=array([0.33333333, 0.33333333, 0.33333333]))[source]

Constructor

Parameters

shape_n : float, optional

scale_n: float, optional

parameters of the negative gamma; must be positive

mean : float, optional

var : float, optional

parameters of the gaussian; var must be positive

shape_p : float, optional

scale_p : float, optional

parameters of the positive gamma; must be positive

mixt : array of shape (3,), optional

the mixing proportions; they should be positive and sum to 1

parameters()[source]

Print the parameters

init(x, mixt=None)[source]

Initialization of the different parameters

Parameters

x: array of shape (nbitems,)

the data to be processed

mixt : None or array of shape(3), optional

prior mixing proportions. If None, the classes have equal weight

init_fdr(x, dof=-1, copy=True)[source]

Initialization of the class based on an FDR heuristic: the probability of belonging to the positive component is proportional to the 'positive fdr' of the data, and likewise for the negative component. The point is that the gamma parts should model nothing more than the tails of the distribution.

Parameters

x: array of shape (nbitem,)

the data under consideration

dof: integer, optional

number of degrees of freedom if x is thought to be a Student variate. By default, it is handled as a normal variate.

copy: boolean, optional

If True, copy the data.

Mstep(x, z)[source]

Mstep of the estimation: maximum likelihood update of the parameters of the three components

Parameters

x: array of shape (nbitem,)

input data

z: array of shape (nbitems,3)

probabilistic membership

Estep(x)[source]

Update probabilistic memberships of the three components

Parameters

x: array of shape (nbitems,)

the input data

Returns

z: ndarray of shape (nbitems, 3)

probabilistic membership

Notes

z[:, 0] is the membership of the negative gamma, z[:, 1] is the membership of the gaussian, and z[:, 2] is the membership of the positive gamma
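
Estep and Mstep can also be alternated by hand; below is a sketch in the spirit of estimate() (the fixed iteration count is an assumption; estimate() adds a log-likelihood convergence test):

    import numpy as np
    from nipy.algorithms.clustering.ggmixture import GGGM

    rng = np.random.RandomState(0)
    x = np.concatenate([rng.normal(0., 1., 1000),
                        rng.gamma(3., 1., 100),
                        -rng.gamma(3., 1., 100)])

    model = GGGM()
    model.init(x)                # or model.init_fdr(x) for the FDR heuristic
    for _ in range(100):
        z = model.Estep(x)       # E step: update the (nbitem, 3) memberships
        model.Mstep(x, z)        # M step: ML update of the 7 parameters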

estimate(x, niter=100, delta=0.0001, bias=0, verbose=0, gaussian_mix=0)[source]

Whole EM estimation procedure:

Parameters

x: array of shape (nbitem)

input data

niter: integer, optional

max number of iterations

delta: float, optional

increment in LL at which convergence is declared

bias: float, optional

lower bound on the gaussian variance (to avoid shrinkage)

gaussian_mix: float, optional

if nonzero, lower bound on the gaussian mixing weight (to avoid shrinkage)

verbose: 0, 1 or 2

verbosity level

Returns

z: array of shape (nbitem, 3)

the membership matrix

posterior(x)[source]

Compute the posterior probability of the three components given the data

Parameters

x: array of shape (nbitem,)

the data under evaluation

Returns

ng, y, pg : three arrays of shape (nbitem,)

the posterior probabilities of the 3 components given the data

Notes

ng + y + pg = np.ones(nbitem)
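
A quick check of this identity (the synthetic data is illustrative, as in the sketch at the top of the page):

    import numpy as np
    from nipy.algorithms.clustering.ggmixture import GGGM

    rng = np.random.RandomState(0)
    x = np.concatenate([rng.normal(0., 1., 1000),
                        rng.gamma(3., 1., 100),
                        -rng.gamma(3., 1., 100)])

    model = GGGM()
    model.estimate(x)
    ng, y, pg = model.posterior(x)
    assert np.allclose(ng + y + pg, 1.0)   # memberships sum to one pointwise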

component_likelihood(x)[source]

Compute the likelihood of the data x under the three components: negative gamma, gaussian, and positive gamma

Parameters

x: array of shape (nbitem,)

the data under evaluation

Returns

ng, y, pg : three arrays of shape (nbitem,)

The likelihood of the data under the 3 components

show(x, mpaxes=None)[source]

Visualization of the mixture overlaid on the empirical histogram of x

Parameters

x: ndarray of shape (nbitem,)

data

mpaxes: matplotlib axes, optional

axes handle used for the plot; if None, new axes are created
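
For example (matplotlib is required; the model fitting and synthetic data follow the earlier sketches):

    import numpy as np
    import matplotlib.pyplot as plt
    from nipy.algorithms.clustering.ggmixture import GGGM

    rng = np.random.RandomState(0)
    x = np.concatenate([rng.normal(0., 1., 1000),
                        rng.gamma(3., 1., 100),
                        -rng.gamma(3., 1., 100)])
    model = GGGM()
    model.estimate(x)

    fig, ax = plt.subplots()
    model.show(x, mpaxes=ax)   # fitted components over the histogram of x
    plt.show()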

GGM

class nipy.algorithms.clustering.ggmixture.GGM(shape=1, scale=1, mean=0, var=1, mixt=0.5)[source]

Bases: object

This is the basic one dimensional Gaussian-Gamma Mixture estimation class. Note that it can work with positive or negative values, as long as there is at least one positive value. NB: the gamma distribution is defined only on positive values.

5 scalar members are used:

- mean: gaussian mean
- var: gaussian variance (non-negative)
- shape: gamma shape (non-negative)
- scale: gamma scale (non-negative)
- mixt: mixture parameter (non-negative, weight of the gamma)
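
A usage sketch for GGM (synthetic data with a Gaussian null part and a positive gamma tail; illustrative only):

    import numpy as np
    from nipy.algorithms.clustering.ggmixture import GGM

    rng = np.random.RandomState(0)
    x = np.concatenate([rng.normal(0., 1., 1000),   # gaussian (null) part
                        rng.gamma(3., 1., 200)])    # positive gamma part

    model = GGM()
    ll = model.estimate(x)       # average final log-likelihood
    y, pg = model.posterior(x)   # gaussian and gamma posteriors per point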

__init__(shape=1, scale=1, mean=0, var=1, mixt=0.5)[source]

Initialize self. See help(type(self)) for accurate signature.

parameters()[source]

Print the parameters of self

Mstep(x, z)[source]

Mstep of the model: maximum likelihood estimation of the parameters of the model

Parameters

x : array of shape (nbitems,)

input data

z : array of shape (nbitems, 2)

the membership matrix

Estep(x)[source]

E step of the estimation: estimation of data membership

Parameters

x: array of shape (nbitems,)

input data

Returns

z: array of shape (nbitems, 2)

the membership matrix

estimate(x, niter=10, delta=0.0001, verbose=False)[source]

Complete EM estimation procedure

Parameters

x : array of shape (nbitems,)

the data to be processed

niter : int, optional

max number of iterations

delta : float, optional

criterion for convergence

verbose : bool, optional

If True, print values during iterations

Returns

LL : float

average final log-likelihood

show(x)[source]

Visualization of the mixture model based on the empirical histogram of x

Parameters

x : array of shape (nbitems,)

the data to be processed

posterior(x)[source]

Posterior probability of observing the data x for each component

Parameters

x: array of shape (nbitems,)

the data to be processed

Returns

y, pg : arrays of shape (nbitem)

the posterior probability

Gamma

class nipy.algorithms.clustering.ggmixture.Gamma(shape=1, scale=1)[source]

Bases: object

Basic one dimensional Gamma density estimation class

NB: the gamma distribution is defined only on positive values. 2 parameters are used:

- shape: gamma shape
- scale: gamma scale

__init__(shape=1, scale=1)[source]

Initialize self. See help(type(self)) for accurate signature.

parameters()[source]

check(x)[source]

estimate(x, eps=1e-07)[source]

ML estimation of the Gamma parameters
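
A sketch of fitting a Gamma and cross-checking against scipy (scipy is used here only as an independent reference, not by the module itself; the shape and scale attribute names are taken from the constructor signature):

    import numpy as np
    from scipy import stats
    from nipy.algorithms.clustering.ggmixture import Gamma

    rng = np.random.RandomState(0)
    x = rng.gamma(2.0, 3.0, size=5000)   # true shape 2, true scale 3

    g = Gamma()
    g.estimate(x)                        # ML fit of shape and scale
    print(g.shape, g.scale)

    # Independent ML fit with scipy, location fixed at zero
    shape, loc, scale = stats.gamma.fit(x, floc=0)
    print(shape, scale)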