
Remark:

For the special case $K(t,x)=K(x)=K\mathbbm{1}_{[a,b]}(x)$, where $K$ is a constant and $[a,b]$ is a sub-interval of $[0,1]$, we have

$\displaystyle w(T,\psi(T,x)) = w(0,x) \exp \left( -K \chi(x)-\displaystyle\int_0^T \partial_x u_{obs}(\sigma,\psi(\sigma,x))d\sigma \right),$ (3.45)

where

$\displaystyle \chi(x) = \displaystyle\int_0^T \mathbbm{1}_{Supp(K)}(\psi(\sigma,x)) d\sigma$ (3.46)

is the time spent in the support of $K$ by the characteristic curve $\psi(\sigma,x)$ with foot $x$ of equation (3.39-F) with $K=0$.
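
For context, formula (3.45) follows by integrating along characteristics: a sketch, assuming (as (3.45) expresses once integrated) that along $\sigma \mapsto \psi(\sigma,x)$ the error $w$ solves the linear equation

$\displaystyle \frac{d}{d\sigma}\, w(\sigma,\psi(\sigma,x)) = -\left( K\mathbbm{1}_{[a,b]}(\psi(\sigma,x)) + \partial_x u_{obs}(\sigma,\psi(\sigma,x)) \right) w(\sigma,\psi(\sigma,x)).$

Integrating this ODE from $0$ to $T$ and using $\displaystyle\int_0^T K\mathbbm{1}_{[a,b]}(\psi(\sigma,x))\,d\sigma = K\chi(x)$ gives exactly (3.45).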

The system is then observable if and only if the function $\chi$ has a positive lower bound, i.e. $m := \displaystyle \min_{x} \chi(x) > 0$, observability being defined by (see e.g. [93]):

$\displaystyle \exists C, \forall u \textrm{ solution of (3.39-F) with } K=0,\quad \Vert u(T,.)\Vert^2 \le C\int_0^T \Vert K(.)u(s,.)\Vert^2\,ds.$ (3.47)

In this case, Proposition 3.2 proves the global exponential decrease of the error, provided $K$ is larger than $\displaystyle \frac{MT}{m}$, where $M$ is defined by equation (3.41).
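
To make the threshold explicit (assuming, as the condition $K > MT/m$ suggests, that $M$ bounds $\vert \partial_x u_{obs} \vert$), the exponent in (3.45) satisfies

$\displaystyle -K \chi(x) - \displaystyle\int_0^T \partial_x u_{obs}(\sigma,\psi(\sigma,x))\,d\sigma \le -Km + MT,$

so that $\vert w(T,\psi(T,x)) \vert \le e^{-(Km - MT)} \vert w(0,x) \vert$, and the contraction factor $e^{-(Km-MT)}$ is strictly smaller than $1$ exactly when $K > \displaystyle \frac{MT}{m}$.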

From this remark, we can deduce that if the observability condition is satisfied at each iteration, in both the forward and backward integrations, then the algorithm converges and the error decreases exponentially to $0$. Note that this is not a necessary condition: even if $\chi(x)=0$, the remaining exponential factor in equation (3.45) stays bounded.
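
Indeed, under the same assumption on $M$, even when the characteristic curve never meets the support of $K$ (i.e. $\chi(x)=0$), the factor in (3.45) obeys

$\displaystyle \exp \left( -\displaystyle\int_0^T \partial_x u_{obs}(\sigma,\psi(\sigma,x))\,d\sigma \right) \le e^{MT},$

so on such characteristics the error may grow by at most the fixed factor $e^{MT}$ over one integration, but cannot blow up.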

