
Topological gradient

From Theorem 2.1, we can derive the following asymptotic expansion of the cost function (2.24):

$\displaystyle j(\rho)-j(0) = \rho^2 G(x_0,n) + o(\rho^2),$ (2.25)

where

$\displaystyle G(x_0,n) = -\pi c (\nabla u_0(x_0)\cdot n)(\nabla p_0(x_0)\cdot n)-\pi \vert\nabla u_0(x_0)\cdot n\vert^2,$ (2.26)
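As a concrete illustration, the pointwise value of $ G(x_0,n)$ in (2.26) can be evaluated directly from the two gradients. The following sketch (not part of the original text; all names are illustrative) assumes $ c$ , $ \nabla u_0(x_0)$ , $ \nabla p_0(x_0)$ and a unit direction $ n$ are already available:

```python
import numpy as np

def topological_gradient(c, grad_u0, grad_p0, n):
    """Evaluate G(x0, n) from equation (2.26).

    c       : diffusion coefficient (scalar)
    grad_u0 : gradient of the direct solution u0 at x0, shape (2,)
    grad_p0 : gradient of the adjoint solution p0 at x0, shape (2,)
    n       : unit direction of the candidate perturbation, shape (2,)
    """
    un = np.dot(grad_u0, n)   # (grad u0)(x0) . n
    pn = np.dot(grad_p0, n)   # (grad p0)(x0) . n
    return -np.pi * c * un * pn - np.pi * un**2

# Example: both gradients aligned with n
n = np.array([1.0, 0.0])
g = topological_gradient(1.0, np.array([2.0, 0.0]),
                         np.array([1.0, 0.0]), n)
# g = -pi*2*1 - pi*4 = -6*pi
```

The more negative $ G$ is at a point, the stronger the indication that inserting a perturbation there (with normal $ n$ ) decreases the cost.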

and $ p_0$ is the solution of the unperturbed adjoint problem:

$\displaystyle \left\{ \begin{array}{lll} -div(c\nabla p_0)+p_0=-\partial_u J(\Omega,u_0) & in & \Omega,\\ \partial_n p_0 = 0 & on & \partial\Omega. \end{array} \right.$ (2.27)

As previously seen, the topological gradient can be rewritten as $ G(x,n)=\langle M(x)n,n\rangle$ , where $ M(x)$ is the following $ 2\times 2$ symmetric matrix:

$\displaystyle M(x) = -\pi c \frac{\nabla u_0(x)\nabla p_0(x)^T+\nabla p_0(x)\nabla u_0(x)^T}{2}-\pi\nabla u_0(x)\nabla u_0(x)^T.$ (2.28)
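Since $ M(x)$ is symmetric, the most negative value of $ G(x,n)=\langle M(x)n,n\rangle$ over unit directions $ n$ is the smallest eigenvalue of $ M(x)$ , attained at the associated unit eigenvector. A minimal NumPy sketch of this computation (function and variable names are illustrative, not from the original text):

```python
import numpy as np

def min_topological_gradient(c, grad_u0, grad_p0):
    """Build M(x) from equation (2.28) and return (lambda_min, n_min):
    the smallest eigenvalue of M(x) and a unit eigenvector for it,
    i.e. the most negative value of G(x, n) over unit directions n."""
    gu = np.asarray(grad_u0, dtype=float)
    gp = np.asarray(grad_p0, dtype=float)
    # Symmetrized outer product (grad u0 grad p0^T + grad p0 grad u0^T)/2
    sym = 0.5 * (np.outer(gu, gp) + np.outer(gp, gu))
    M = -np.pi * c * sym - np.pi * np.outer(gu, gu)
    w, v = np.linalg.eigh(M)   # eigenvalues in ascending order
    return w[0], v[:, 0]

lam, n = min_topological_gradient(1.0, [2.0, 0.0], [1.0, 0.0])
# lam = -6*pi, with n aligned with the gradients
```

In practice one evaluates $ \lambda_{\min}(M(x))$ at every point $ x$ of the image and selects the locations where it is most negative as edge candidates; the eigenvector gives the associated orientation $ n$ .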


