### How to apply BC after using df.assemble_local (element level assembly routine)?

307 views · asked 8 months ago by AJ
I am using dolfin::assemble_local to build the global assembly matrix (A) from local element matrices (A_local); for reference, the code is pasted below from another thread.

I am having difficulty applying boundary conditions (BCs) to the final global assembly matrix (A).

I tried "bc.apply(A)" but got the following error:
TypeError: in method 'DirichletBC_apply', argument 2 of type 'dolfin::GenericVector &'

Any suggestions?

##################################################
# Define model problem
import numpy as np
import dolfin as df

mesh = df.RectangleMesh(df.Point(0,0), df.Point(1,1), 4, 4)
V = df.FunctionSpace(mesh, 'Nedelec 1st kind H(curl)', 2)
u = df.TrialFunction(V)
v = df.TestFunction(V)

# Bilinear form (not defined in the original snippet; assuming a
# curl-curl form appropriate for the H(curl) space)
form = df.inner(df.curl(u), df.curl(v))*df.dx

# Assemble global matrix from local matrices
A = np.zeros([V.dim(), V.dim()])

for cell in df.cells(V.mesh()):
    A_local = df.assemble_local(form, cell)
    local_to_global_map = V.dofmap().cell_dofs(cell.index())

    # Copy local matrix into global matrix
    for (idx_loc_1, idx_glob_1) in enumerate(local_to_global_map):
        for (idx_loc_2, idx_glob_2) in enumerate(local_to_global_map):
            A[idx_glob_1, idx_glob_2] += A_local[idx_loc_1, idx_loc_2]

Thanks,
-AJ

Community: FEniCS Project
NumPy arrays are not part of dolfin's linear algebra backends:

python3 -c "from dolfin import *; print(linear_algebra_backends())"

You will have to modify your (dense) numpy array yourself to apply the boundary conditions.
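For reference, a minimal pure-NumPy sketch of the row-replacement approach (the helper name `apply_dirichlet_dense` and the tiny example system are made up for illustration; in dolfin the constrained dof indices and values can be obtained from `bc.get_boundary_values()`):

```python
import numpy as np

def apply_dirichlet_dense(A, b, bc_dofs, bc_vals):
    """Impose Dirichlet conditions on a dense system in place:
    zero each constrained row, put 1 on the diagonal, and set
    the corresponding right-hand-side entry to the boundary value."""
    for dof, val in zip(bc_dofs, bc_vals):
        A[dof, :] = 0.0
        A[dof, dof] = 1.0
        b[dof] = val
    return A, b

# Tiny 3x3 example: constrain dof 0 to the value 2.0
A = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])
b = np.zeros(3)
apply_dirichlet_dense(A, b, bc_dofs=[0], bc_vals=[2.0])
```

With dolfin, the inputs would come from `bvals = bc.get_boundary_values()`, i.e. `bc_dofs = list(bvals.keys())` and `bc_vals = list(bvals.values())`. Note that modifying only the rows breaks symmetry; a symmetric elimination would also zero the columns and move the known values to the right-hand side.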

written 8 months ago by Nate
Hi Nate,
Thanks for your comment.

By modifying my (dense) numpy array, do you mean converting it back to a PETScMatrix type, or something else?

-AJ
written 8 months ago by AJ

answered 8 months ago by Jakob Maljaars
Hi,

here is an MWE (based on the demo_poisson.py example) combining assemble_local with the application of DirichletBCs. Please note that a dummy problem is introduced in order to initialize the layout of the global system, thus enabling manual assembly into GenericMatrix/GenericVector objects.
Hope this MWE features the desired functionality.

"""
Illustrating assemble_local using demo_poisson.py demo
"""

from dolfin import *
import numpy as np

# Set ghost mode for correct assembling in parallel
parameters['ghost_mode'] = 'shared_facet'

# Create mesh and define function space
mesh = UnitSquareMesh(32, 32)
V = FunctionSpace(mesh, "Lagrange", 1)
uh = Function(V)

# Define Dirichlet boundary (x = 0 or x = 1)
def boundary(x):
    return x[0] < DOLFIN_EPS or x[0] > 1.0 - DOLFIN_EPS

# Define boundary condition
u0 = Constant(0.0)
bc = DirichletBC(V, u0, boundary)

# Define variational problem
u, v = TrialFunction(V), TestFunction(V)
f = Expression("10*exp(-(pow(x[0] - 0.5, 2) + pow(x[1] - 0.5, 2)) / 0.02)", degree=2)
g = Expression("sin(5*x[0])", degree=2)

# Define forms to be assembled (crucial to turn them into Forms in parallel)
a = Form(inner(grad(u), grad(v))*dx)
L = Form(f*v*dx + g*v*ds)

# Dummy problems to define LAYOUT of global problem (the easy way)
A_g = assemble( Constant(0.)*u*v*dx )
f_g = assemble( Constant(0.)*v*dx )

# Get dofmap to construct cell-to-dof connectivity
dofmap = V.dofmap()

# Perform assembling
for cell in cells(mesh):
    dof_idx = dofmap.cell_dofs(cell.index())

    # Assemble local rhs and lhs
    a_local = assemble_local(a, cell)
    L_local = assemble_local(L, cell)

    # Assemble into global system (these add_local calls were
    # missing from the pasted snippet)
    A_g.add_local(a_local, dof_idx, dof_idx)
    f_g.add_local(L_local, dof_idx)

# Finalize assembling
A_g.apply('add')
f_g.apply('add')

# APPLY BC
bc.apply(A_g, f_g)

# Compute solution
solve(A_g, uh.vector(), f_g)

# Save solution in VTK format
File("poisson.pvd") << uh

Hi Jakob,

The code is working great!

I will test the procedure in my application, I think it will solve my issues.

Why is this step necessary?
# Dummy problem to define LAYOUT of global problem (the easy way)
f_g = assemble( Constant(0.)*v*dx )

Thanks again!

AJ

written 7 months ago by AJ
Perfect!
Those dummy problems essentially create a blueprint for your global matrix (and vector), i.e. they describe the size and the sparsity pattern of the global system. This step is necessary in order to send the entries of the local matrices to the corresponding indices in the global (sparse) matrix during the assemble_local step.

Now, to be complete:
1) I guess you can skip the dummy assembly of f_g and simply initialize f_g as a Vector() of size N.
2) You can initialize an empty Matrix with the correct sparsity pattern etc. using Matrix().init(TensorLayout()), but specifying the correct TensorLayout is rather cumbersome.
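The blueprint idea can be illustrated outside dolfin with a toy 1D assembly (an entirely made-up example: three linear elements for -u'' on the unit interval). Local contributions are collected as (row, col, value) triplets, and summing the duplicate triplets is exactly the 'add' finalization a backend performs, which is why a sparse backend must know the set of (row, col) pairs, the pattern, up front:

```python
import numpy as np

# Toy 1D assembly: three linear elements for -u'' = f on [0, 1],
# giving 4 global dofs; element stiffness is (1/h) * [[1, -1], [-1, 1]]
k_local = 3.0 * np.array([[1.0, -1.0], [-1.0, 1.0]])  # 1/h with h = 1/3
cell_dofs = [(0, 1), (1, 2), (2, 3)]                  # cell -> global dofs

# Collect each local entry as a (row, col, value) triplet
rows, cols, vals = [], [], []
for dofs in cell_dofs:
    for i_loc, i_glob in enumerate(dofs):
        for j_loc, j_glob in enumerate(dofs):
            rows.append(i_glob)
            cols.append(j_glob)
            vals.append(k_local[i_loc, j_loc])

# Summing duplicate triplets is the 'add' finalization step; interior
# dofs shared by two cells accumulate contributions from both
A = np.zeros((4, 4))
np.add.at(A, (rows, cols), vals)
```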
written 7 months ago by Jakob Maljaars
answered 8 months ago by Lukas
The matrix has the wrong type. You can try to convert it back to a dolfin matrix by

B = Matrix(PETScMatrix(A))

as suggested by pf4d in your old thread

Hi Lukas,
Thanks for your suggestion.

B = Matrix(PETScMatrix(A)) didn't work;
it gives the following error:

Traceback (most recent call last):
  File "demo_localAssembly_poissonModified2.py", line 53, in <module>
    C = Matrix(PETScMatrix(A))
  File "/home/fenics/build/lib/python2.7/site-packages/dolfin/cpp/la.py", line 2324, in __init__
    _la.PETScMatrix_swiginit(self, _la.new_PETScMatrix(*args))
TypeError: in method 'new_PETScMatrix', argument 1 of type 'MPI_Comm'

---------------------------------

Also, the code (suggested by pf4d in my old thread):

# assemble global contribution of local tensors :
print "LOCAL STIFFNESS:\n"
for i in range(mesh.num_cells()):
    print A_i.array(), "\n"
    A_m = as_backend_type(A_i).mat()  # add local petsc matrix
    A_a.append(A_m)

A = np.sum(np.array(A_a))

# convert back to dolfin matrix :
C = Matrix(PETScMatrix(A))

Here, I was able to build B = Matrix(PETScMatrix(A)), but applying the BC on B didn't work either.
written 8 months ago by AJ
My approach in such situations is to assemble the local matrices directly into a GenericMatrix object (of course with the proper layout). Applying Dirichlet boundary conditions is trivial in that case.

If you are interested, I can make you a MWE somewhere next week.
written 8 months ago by Jakob Maljaars
Hi Jakob,

If there is a way to assemble directly from local to global into GenericMatrix objects, that would really help me a lot.

I would appreciate an MWE for that.

Thanks,
AJ
written 8 months ago by AJ