Next: Theoretical convergence results Up: Numerical experiments Previous: Layered quasi-geostrophic ocean model   Contents

Conclusions emerging from the numerical experiments

The BFN algorithm appears to be a very promising data assimilation method. It is extremely easy to implement: no linearization of the model equations, no computation of the adjoint state, no optimization algorithm. The only work required is to add a relaxation term to the model equations. The key point of the backward integration is that the nudging term, whose sign is opposite to that of the forward integration, makes the backward integration numerically stable. Hence the nudging (or relaxation) term plays a double role: it forces the model towards the observations and it stabilizes the numerical integration; it acts simultaneously as a penalization term and a regularization term.
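The forward and backward relaxation described above can be sketched on a toy scalar model. This is an illustrative sketch only, not the code used in the experiments: the model dx/dt = -a·x, the gain K, the function name and all parameter values are assumptions. Note how the opposite-sign nudging term turns the (otherwise unstable) backward integration into a contraction.

```python
import numpy as np

def bfn_scalar(y_obs, t, x0_guess, a=1.0, K=5.0, n_iter=10):
    """Back and Forth Nudging on the toy model dx/dt = -a*x.

    Forward sweep:  dx/dt = -a*x + K*(y_obs - x)   (relaxation towards the data)
    Backward sweep: dx/dt = -a*x - K*(y_obs - x)   (opposite-sign nudging;
                    integrated in reverse time, the term is again stabilizing)

    Returns the identified initial condition and the last forward trajectory.
    """
    dt = t[1] - t[0]
    x0 = x0_guess
    for _ in range(n_iter):
        # forward Euler integration with nudging towards the observations
        x = np.empty_like(t)
        x[0] = x0
        for k in range(len(t) - 1):
            x[k + 1] = x[k] + dt * (-a * x[k] + K * (y_obs[k] - x[k]))
        # backward integration from the final forward state; the sign flip
        # makes the scheme contractive in reverse time (factor 1 + a*dt - K*dt)
        xb = np.empty_like(t)
        xb[-1] = x[-1]
        for k in range(len(t) - 1, 0, -1):
            xb[k - 1] = xb[k] - dt * (-a * xb[k] - K * (y_obs[k] - xb[k]))
        x0 = xb[0]  # updated initial condition for the next sweep
    return x0, x
```

With perfect observations of a decaying truth, a handful of sweeps recovers the true initial condition from a poor first guess, which is exactly the behaviour the experiments rely on.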

The BFN algorithm has been compared with the variational method on several types of non-linear systems: the Lorenz system (a chaotic ODE system), the Burgers equation (1D PDE), a shallow water model (2D) and a quasi-geostrophic model (3D). The conclusion of these various experiments is that the BFN algorithm outperforms the variational method for the same number of iterations (and hence for the same computing time), and that it converges in a small number of iterations. The initial condition is, admittedly, usually identified less accurately by the BFN scheme; on the other hand, the final state of the assimilation period is identified much more accurately by the BFN algorithm than by the variational algorithm, which is a key point for the prediction phase that starts at the end of the assimilation period. Hence the prediction is usually better when it follows an assimilation period treated by the BFN algorithm rather than by a variational assimilation method.

The two algorithms can also be combined: we have introduced a new hybrid scheme in which a very small number of BFN iterations (2 or 3, for instance) are performed before the identified initial condition is handed to the standard 4D-VAR algorithm. With this preprocessing, the 4D-VAR converges more quickly, sometimes requiring only half as many iterations. Moreover, for a fixed number of iterations (or a given computation time), the quality of the identified solution is significantly improved (in this comparison, the number of 4D-VAR iterations in the hybrid scheme is reduced by the number of BFN iterations, so that the standard 4D-VAR and the hybrid scheme perform the same total number of iterations).
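The hybrid idea can be sketched on the same toy scalar model dx/dt = -a·x. This is only an illustration of the pipeline, not the thesis implementation: on a real model the variational step would use the adjoint, whereas here the linear toy model makes the gradient of the cost explicit; all names, gains and iteration counts are assumptions.

```python
import numpy as np

def hybrid_bfn_4dvar(y_obs, t, x0, a=1.0, K=5.0, n_bfn=3, n_var=20):
    """Hybrid scheme sketch: a few BFN sweeps produce a warm-start initial
    condition, which is then polished by a variational (4D-VAR-like) descent
    on J(x0) = sum_k (x0 * exp(-a*t_k) - y_k)^2."""
    dt = t[1] - t[0]
    # --- BFN preprocessing: n_bfn forward/backward nudging sweeps ---
    for _ in range(n_bfn):
        x = np.empty_like(t)
        x[0] = x0
        for k in range(len(t) - 1):          # forward, nudged towards the data
            x[k + 1] = x[k] + dt * (-a * x[k] + K * (y_obs[k] - x[k]))
        xb = np.empty_like(t)
        xb[-1] = x[-1]
        for k in range(len(t) - 1, 0, -1):   # backward, opposite-sign nudging
            xb[k - 1] = xb[k] - dt * (-a * xb[k] - K * (y_obs[k] - xb[k]))
        x0 = xb[0]
    # --- variational polishing started from the BFN initial condition ---
    g = np.exp(-a * t)                # sensitivity of x(t_k) to x0 (linear model)
    lr = 0.2 / np.sum(g * g)          # safe step size for this quadratic cost
    for _ in range(n_var):
        grad = 2.0 * np.sum(g * (x0 * g - y_obs))
        x0 = x0 - lr * grad
    return x0
```

The gradient loop starts from an initial condition already close to the truth, which is why, in the experiments, the 4D-VAR needs noticeably fewer iterations after the BFN preprocessing.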

Finally, the BFN algorithm makes it possible to handle imperfect models at no additional cost: the model equations are not strong constraints in this nudging method (whereas they usually are in a variational method), and the relaxation term can be seen as a model error term.
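This weak-constraint role of the relaxation term can be illustrated on the same toy scalar model: even when the model uses a wrong parameter, the nudging term absorbs the model error and keeps the trajectory close to the observations. The model dx/dt = -a·x, the gain K and the parameter values are illustrative assumptions, not the thesis configuration.

```python
import numpy as np

def nudged_run(y_obs, t, x0, a_model, K):
    """Forward run of an *imperfect* model dx/dt = -a_model*x with a
    relaxation term K*(y_obs - x) acting as a model error term."""
    dt = t[1] - t[0]
    x = np.empty_like(t)
    x[0] = x0
    for k in range(len(t) - 1):
        x[k + 1] = x[k] + dt * (-a_model * x[k] + K * (y_obs[k] - x[k]))
    return x
```

Running the wrong model freely (K = 0) drifts far from the truth, while the nudged run stays close: the relaxation term is doing the work that an explicit model error term would do in a weak-constraint variational formulation.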

