
Extension to unsupervised classification

If the number $n$ of classes is given but their values $C_i$ are not, the values can be determined in an optimal way. The classification problem is then defined as

$\displaystyle \min_{(\Omega_i),(C_i)} j((\Omega_i),(C_i)) = \sum_{i=1}^n \int_{\Omega_i} \vert v(x)-C_i\vert^2\, dx + \alpha \sum_{i\ne j} \vert\Gamma_{ij}\vert,$ (2.38)

where $\Gamma_{ij}$ denotes the interface between the classes $\Omega_i$ and $\Omega_j$, and $\vert\Gamma_{ij}\vert$ its length.
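As a concrete illustration (not from the source), a minimal discrete version of the cost (2.38) for a 1-D signal can approximate the interface term by counting label changes between neighboring pixels; the function name and this finite-difference approximation are our own:

    import numpy as np

    def cost(v, labels, C, alpha):
        """Discrete analogue of (2.38) for a 1-D signal.

        v      : image values, shape (N,)
        labels : class index of each pixel, shape (N,), values in 0..n-1
        C      : class values, shape (n,)
        alpha  : weight of the interface term
        """
        fidelity = np.sum((v - C[labels]) ** 2)         # sum_i int_{Omega_i} |v - C_i|^2
        interfaces = np.sum(labels[1:] != labels[:-1])  # crude 1-D stand-in for sum |Gamma_ij|
        return fidelity + alpha * interfaces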

The idea is to minimize the cost function $j((\Omega_i),(C_i))$ alternately with respect to $(\Omega_i)$ and with respect to $(C_i)$. The minimization with respect to $(\Omega_i)$ amounts to classifying the image with the current values $C_i$, while the minimization with respect to $C_i$ is obtained in closed form as the mean of the image over each class:

$\displaystyle C_i = \frac{1}{\vert\Omega_i\vert} \int_{\Omega_i} v(x)\,dx.$ (2.39)
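Indeed, since the interface term in (2.38) does not depend on $C_i$, cancelling the derivative of the fidelity term gives (a one-line check, restated here for completeness):

$\displaystyle \frac{d}{dC_i} \int_{\Omega_i} \vert v(x)-C_i\vert^2\, dx = -2 \int_{\Omega_i} (v(x)-C_i)\, dx = 0 \quad\Longrightarrow\quad C_i = \frac{1}{\vert\Omega_i\vert} \int_{\Omega_i} v(x)\, dx.$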

The unsupervised classification algorithm is then as follows:

$\bullet$ Initialization: define an initial guess $C_1,\dots,C_n$ (e.g. equi-distributed class values).
$\bullet$ Repeat until convergence:
  1. minimization with respect to $(\Omega_i)$: classify the image with the current values $(C_i)$;
  2. minimization with respect to $(C_i)$: update each $C_i$ as the mean of $v(x)$ over $\Omega_i$, according to (2.39); a sketch of this loop in code is given below.
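A minimal sketch of this alternating loop, assuming for simplicity that the interface term is dropped ($\alpha = 0$), in which case the classification step reduces to assigning each pixel to the nearest class value (essentially a one-dimensional k-means); all names are our own:

    import numpy as np

    def unsupervised_classification(v, n, n_iter=100, tol=1e-8):
        """Alternating minimization of j((Omega_i),(C_i)), here with alpha = 0.

        v : flattened image values, shape (N,)
        n : number of classes
        """
        # Initial guess: class values equi-distributed over the range of v.
        C = np.linspace(v.min(), v.max(), n)
        for _ in range(n_iter):
            # Step 1: minimize over (Omega_i) -- assign each pixel to the
            # nearest class value (classification step).
            labels = np.argmin(np.abs(v[:, None] - C[None, :]), axis=1)
            # Step 2: minimize over (C_i) -- mean of v in each class, eq. (2.39).
            C_new = np.array([v[labels == i].mean() if np.any(labels == i) else C[i]
                              for i in range(n)])
            if np.max(np.abs(C_new - C)) < tol:
                C = C_new
                break
            C = C_new
        return labels, C

With $\alpha > 0$, the minimization with respect to $(\Omega_i)$ is instead the supervised classification problem discussed above, while the update of $(C_i)$ is unchanged.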

If the number $n$ of classes is not given, we can add a penalization term ``$+\beta n$'' to the cost function (2.38), penalizing the number of classes. The minimization with respect to $n$ then provides the optimal number of classes, which clearly depends on the choice of the weighting coefficient $\beta$: the larger $\beta$, the fewer classes are retained.
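In practice, this minimization with respect to $n$ can be carried out by running the previous algorithm for a range of candidate values and keeping the one with the smallest penalized cost. A rough sketch, reusing the hypothetical helpers above (note that the inner loop here ignores $\alpha$, so the interface term is only evaluated, not minimized):

    def select_n_classes(v, alpha, beta, n_max=10):
        """Pick the n minimizing j((Omega_i),(C_i)) + beta * n."""
        best = None
        for n in range(1, n_max + 1):
            labels, C = unsupervised_classification(v, n)
            penalized = cost(v, labels, C, alpha) + beta * n
            if best is None or penalized < best[0]:
                best = (penalized, n, labels, C)
        return best[1], best[2], best[3]  # optimal n, labels, class values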

