### Gradient of a tensor in variational form

Hi all,

I'm having trouble computing the gradient of a tensor in the variational form.

$\frac{d}{dx_j}S_{ij}$

where $S_{ij}$ is the strain rate tensor.


```
def Sij(u):
    return 0.5*(nabla_grad(u) + transpose(nabla_grad(u)))

V = VectorFunctionSpace(mesh, 'CG', 1, dim=3)
u = TrialFunction(V)
v = TestFunction(V)
u_n = Function(V)
u_sol = Function(V)

a1 = dot(u, v)*dx
L1 = dot(u_n, v)*dx + dot(nabla_grad(Sij(u_n)), v)*dx
solve(a1 == L1, u_sol)
```

This is a shortened version of my variational form, but this is the error I get:

`ufl.log.UFLException: Can only integrate scalar expressions. The integrand is a tensor expression with value shape (3, 3) and free indices with labels ().`
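For anyone hitting the same error, the shape bookkeeping behind it can be reproduced with plain NumPy (a sketch, not FEniCS: `G` is a made-up stand-in for the rank-three array $\frac{\partial S_{ij}}{\partial x_k}$ at a single point, and `v` stands in for the test function values):

```python
import numpy as np

# Hypothetical values standing in for quantities at one spatial point:
# G[i, j, k] plays the role of d(S_ij)/d(x_k), i.e. nabla_grad of a tensor.
G = np.arange(27, dtype=float).reshape(3, 3, 3)
v = np.array([1.0, 2.0, 3.0])

# dot(G, v) contracts only G's LAST index with v, leaving two free indices:
integrand = np.einsum('ijk,k->ij', G, v)
print(integrand.shape)  # (3, 3) -- a tensor, not a scalar
```

Because `dot` contracts only one index pair, two indices remain free and the integrand has value shape (3, 3), which is exactly what the UFL exception reports.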

Community: FEniCS Project

### 1 Answer


Hi

The second summand in your linear form `L1` translates to

$\int\frac{\partial S_{ij}}{\partial x_k}v_k\,\mathrm{d}x$

or symbolically

$\int\left(S\otimes\nabla\right)\cdot v\,\mathrm{d}x$

and is thus a tensorial expression (two free indices), but every integrand has to be scalar.

What you want to achieve, i.e.

$\int\frac{\partial S_{ji}}{\partial x_i}v_j\,\mathrm{d}x$

is the dot product of the divergence of $S_{ij}$ with $v$. You can write this as


`Sij[i,j].dx(j)*v[i]*dx`

But be careful. If Sij comes from a piecewise constant function space this won't work. Consider integration by parts.
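For reference, the integration by parts hinted at here is the standard identity (with $n_j$ the outward unit normal; if the test function $v$ vanishes on the boundary, the last term drops):

$\int_\Omega\frac{\partial S_{ij}}{\partial x_j}v_i\,\mathrm{d}x=-\int_\Omega S_{ij}\frac{\partial v_i}{\partial x_j}\,\mathrm{d}x+\int_{\partial\Omega}S_{ij}n_j v_i\,\mathrm{d}s$

The right-hand side needs only first derivatives of $v$, so it remains well defined even when $S_{ij}$ is piecewise constant.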


What you wrote is the divergence of the tensor $S$, which is a vector. The expression you gave is $\frac{\partial S_{ij}}{\partial x_j}$, which has one free index; the dot product with your test function $v$ then yields a scalar expression you can integrate. Only your implementation is wrong, since `nabla_grad` gives $\frac{\partial S_{ij}}{\partial x_k}$, which is a tensor of rank three. You'll want to write

`Sij(u_n)[i,j].dx(j)*v[i]*dx`
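As a plain-NumPy sanity check of that index pattern (again a sketch with made-up values; `G` stands in for $\frac{\partial S_{ij}}{\partial x_k}$ at a single point): contracting `j` with the derivative index leaves one free index, and the dot with `v` is a scalar.

```python
import numpy as np

# G[i, j, k] stands in for d(S_ij)/d(x_k) at one point (made-up values).
G = np.arange(27, dtype=float).reshape(3, 3, 3)
v = np.array([1.0, 2.0, 3.0])

# Sij[i, j].dx(j): contract the second index with the derivative index.
div_S = np.einsum('ijj->i', G)             # one free index -> a vector
integrand = np.einsum('i,i->', div_S, v)   # dot with v -> a scalar
print(div_S.shape, float(integrand))
```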

written 3 months ago by klunkean

I see it now, sorry! Thank you for your help!

written 3 months ago by Luz Imelda Pacheco


No need to be sorry! Glad I could help :)

Oh, and symbolically you can also write

`dot(nabla_div(Sij(u_n)),v)*dx`

written 3 months ago by klunkean


```
S = [ S11 S12 S13 ;
      S21 S22 S23 ;
      S31 S32 S33 ]

grad = [ d/dx  d/dy  d/dz ]

----> [ d/dx(S11) + d/dy(S12) + d/dz(S13) ;
        d/dx(S21) + d/dy(S22) + d/dz(S23) ;
        d/dx(S31) + d/dy(S32) + d/dz(S33) ]
```

This is what I'm trying to achieve, since I'm assuming there's an implicit summation over the repeated index.
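That row-wise contraction can also be checked numerically; the following is a hedged NumPy sketch (the grid and the tensor field `S` are invented purely for illustration; nothing here is FEniCS):

```python
import numpy as np

# A made-up smooth tensor field S[i, j] on a 3-D grid, for illustration only.
n = 32
x = y = z = np.linspace(0.0, 1.0, n)
X, Y, Z = np.meshgrid(x, y, z, indexing='ij')
S = np.empty((3, 3) + X.shape)
for i in range(3):
    for j in range(3):
        S[i, j] = np.sin((i + 1) * X) * np.cos((j + 1) * Y) + Z

# Row-wise contraction: (div S)_i = d/dx(S_i1) + d/dy(S_i2) + d/dz(S_i3)
h = x[1] - x[0]
div_S = sum(np.gradient(S[:, j], h, axis=1 + j) for j in range(3))
print(div_S.shape)  # one vector component per grid point: (3, n, n, n)
```

Each component of `div_S` is exactly one row of the bracketed vector above, so the divergence of a rank-two tensor is indeed a vector field.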

Thanks!