On 02/06/16 11:06, Stephan Kramer wrote:
On 01/06/16 15:59, Floriane Gidel [RPG] wrote:
Dear all,
I am trying to understand how the gradient operator works in Firedrake, and there is something I misunderstand.
As far as I understood, if I define (in the HORIZONTAL plane):
V = FunctionSpace(mesh,"CG", 1) and
W = VectorFunctionSpace(mesh,"CG", 1, dim=N)
such that :
psi_1 = Function(V)
psi_N = Function(W) (a vector-valued function of dim (N,))
then,
grad(psi_1) is a rank 1 tensor of 2 rows or columns (does not matter),
grad(psi_N) is a rank 2 tensor of dim (N, 2)
1) Is that correct ?
Assuming you're in 2D: yes
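A quick way to sanity-check these shapes outside Firedrake is with a NumPy analogue (a sketch only: the arrays below merely stand in for the UFL expressions, and N = 3 is chosen arbitrarily):

```python
import numpy as np

N = 3  # stand-in for the vector dimension of W

# In 2D, grad of a scalar field has shape (2,) at each point,
# and grad of an (N,)-vector field has shape (N, 2).
grad_psi_1 = np.ones(2)        # models grad(psi_1): a rank 1 tensor
grad_psi_N = np.ones((N, 2))   # models grad(psi_N): a rank 2 tensor

print(grad_psi_1.shape)  # (2,)
print(grad_psi_N.shape)  # (3, 2)
```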
2) Then, I understand that we can compute grad(psi_N)*grad(psi_1) but not grad(psi_1)*grad(psi_N), because the dimensions agree in the first case but not in the second one.
However, when taking the transpose of grad(psi_N), I don't understand what happens.
The trouble with rank >= 1 tensors is that there are many possible products: dot products, inner products, outer products, etc. So when you say you understand grad(psi_N)*grad(psi_1), you probably mean the dot product. Have a look at chapter 4 of the FEniCS book: http://launchpad.net/fenics-book/trunk/final/+download/fenics-manual-2011-10... which describes the UFL language and explains the different tensor products.
Basically, you should only use * if one of the tensors is rank 0 (a scalar). UFL does, for some reason, allow rank 2 * rank 1 - presumably because people tend to be familiar with the matrix-vector product. Note that UFL checks only rank, not dimensional agreement, since the dimensions aren't always available at that level - a mismatch will only show up when you actually evaluate the expression.
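As a NumPy analogue (an illustration of the linear algebra only, not Firedrake code): the rank 2 * rank 1 product is the matrix-vector product, which is why the (N, 2)-with-(2,) combination contracts while the reverse order does not, and why transposing swaps which side agrees. Shapes here assume 2D and an arbitrary N = 3:

```python
import numpy as np

N = 3
A = np.arange(N * 2, dtype=float).reshape(N, 2)  # models grad(psi_N), shape (N, 2)
v = np.array([1.0, 2.0])                         # models grad(psi_1), shape (2,)

# dot(grad(psi_N), grad(psi_1)): (N, 2) . (2,) -> (N,)
w = A @ v
print(w.shape)  # (3,)

# With the transpose the contraction runs the other way round:
# dot(grad(psi_1), transpose(grad(psi_N))): (2,) . (2, N) -> (N,)
u = v @ A.T
print(u.shape)  # (3,)

# The two agree because (A v)_j = A_{ji} v_i = v_i (A^T)_{ij}
assert np.allclose(w, u)
```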
If you like, you can always use summation convention to indicate the product you wish to take:

    i, j = indices(2)
    grad(psi_1)[i] * grad(psi_N)[j, i]

This contracts the index of the (2,)-tensor grad(psi_1) against the second index of the (N, 2)-tensor grad(psi_N), forming an (N,)-tensor that is then indexed with the free index j.

Lawrence
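That summation-convention expression has a direct NumPy counterpart via einsum (again just a sketch with stand-in arrays and an arbitrary N = 3, not Firedrake code):

```python
import numpy as np

N = 3
grad_psi_1 = np.array([1.0, 2.0])                         # (2,)-tensor, index i
grad_psi_N = np.arange(N * 2, dtype=float).reshape(N, 2)  # (N, 2)-tensor, indices j, i

# grad(psi_1)[i] * grad(psi_N)[j, i]: contract over i, leave j free -> (N,)
result = np.einsum("i,ji->j", grad_psi_1, grad_psi_N)
print(result.shape)  # (3,)

# Equivalent to the matrix-vector product dot(grad(psi_N), grad(psi_1))
assert np.allclose(result, grad_psi_N @ grad_psi_1)
```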