
General conclusions and perspectives

In this work, we presented several algorithms for solving image processing and data assimilation problems. All these algorithms are robust, fast and easy to implement. This work has been essentially motivated by the applications of such problems. In image processing, a typical constraint is to process movies in real time, or large images in a negligible time. In data assimilation, the goal is to assimilate a huge amount of data within a time bounded by operational constraints (e.g. providing short or medium-range weather forecasts within a given delay).

It seemed crucial to us to develop algorithms that depart from the current state of the art in both image processing and data assimilation. For instance, the topological gradient has been introduced in the image processing field, providing more global information than the standard gradient of the image. Also, the data assimilation community is currently split between two families of methods: variational and sequential. The former (e.g. the 4D-VAR algorithm) require a large human effort to implement the adjoint code, while the latter (e.g. Kalman filters) rely on a very precise knowledge of the error statistics. We therefore chose to introduce an algorithm at the interface between these two categories, in order to combine their advantages without their main drawbacks.

There are still many perspectives in these research fields, both because some problems have not been studied yet and because our algorithms can still be improved. For instance, all the algorithms introduced for image processing problems rely on edge detection by the topological gradient. It would be interesting to define more than two conductivity values, in order to identify several edge sets, as all edges do not correspond to the same level of discontinuity. In data assimilation, the back and forth nudging algorithm can also be improved, for instance by automatically decreasing or increasing the gain coefficients along the iterations, in order to keep a relative equilibrium between the physical model and the feedback to the observations.
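As a rough illustration of this last idea, the sketch below runs a back and forth nudging loop on a toy linear model dx/dt = Ax, observed at every time step, with a purely hypothetical gain schedule K_k = K_0 / (1 + k) that decreases along the outer iterations; the model, the observation setting and the schedule are illustrative assumptions, not the configurations studied in this work.

import numpy as np

def bfn(y, A, K0, n_iter=10, dt=0.01):
    """Hedged sketch of back and forth nudging on the toy model dx/dt = A x.

    y: observed trajectory, array of shape (n_steps, dim), full state observed.
    A: model operator, array of shape (dim, dim).
    K0: initial nudging gain (hypothetical value, to be tuned).
    Returns an estimate of the initial state.
    """
    n_steps, dim = y.shape
    x0 = np.zeros(dim)                      # first guess of the initial condition
    for k in range(n_iter):
        K = K0 / (1.0 + k)                  # hypothetical decreasing gain schedule
        x = x0.copy()
        # forward sweep: model dynamics plus feedback towards the observations
        for n in range(n_steps):
            x = x + dt * (A @ x + K * (y[n] - x))
        # backward sweep: same model, opposite feedback sign, integrated backwards in time
        for n in reversed(range(n_steps)):
            x = x - dt * (A @ x - K * (y[n] - x))
        x0 = x                              # updated estimate of the initial state
    return x0

Other schedules (increasing gains, or gains adapted to the observation misfit) would fit in the same loop; the only requirement is that the feedback term does not overwhelm the model dynamics.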

As long-term perspectives in image processing, we can cite for instance the compression and deblurring problems, for which it should also be possible to define an approach based on topological asymptotic analysis. An interesting challenge in data assimilation is to test the back and forth nudging algorithm on a primitive equation model with real data, in order to study its behaviour in realistic conditions.

