MPI deadlock when creating FunctionSpace with periodic boundary conditions


I'm trying to solve a nonlinear PDE system on a square 2D domain. Here is some of the relevant code:

import petsc4py, sys, os
#
# petsc4py.init() needs to be called before importing PETSc;
# commandlineArguments holds options parsed from the command line
# (its definition is elided here)
#
petsc4py.init(args=(sys.argv[0:1] + commandlineArguments.petsc))
from KSDG import *
import fenics as fe
import numpy as np
from petsc4py import PETSc
fe.parameters.parse(sys.argv[0:1] + commandlineArguments.fenics)
fe.parameters['ghost_mode'] = 'shared_facet'
from mpi4py import MPI

...
def main():
    ...
    periodic = commandlineArguments.periodic
    t0 = 0.0
    domain = pbdomain(periodic=periodic, dim=dim, width=width)
    logMAIN('domain', domain)
    Cd1 = fe.FunctionSpace(mesh, 'CG', degree,
                           constrained_domain=domain)
    logMAIN('Cd1', Cd1)
...

pbdomain is imported by 'from KSDG import *'. The relevant part of that file is:
class PeriodicDomain1D(SubDomain):
...

class PeriodicDomain2D(SubDomain):
    def __init__(self, dim=None, left=0.0, right=1.0, bottom=0.0,
                 top=1.0, tol=fe.DOLFIN_EPS):
        """dim argument allowed for convenience, but ignored"""
        super().__init__()
        self.tol = tol
        self.left = left
        self.right = right
        self.width = right - left
        self.bottom = bottom
        self.top = top
        self.height = top-bottom

    def inside(self, x, on_boundary):
        # True for points on the left or bottom edge, excluding points on
        # the top or right edge (i.e. the points that map() sends here)
        return bool(
            on_boundary and
            (fe.near(x[0], self.left, eps=self.tol) or
             fe.near(x[1], self.bottom, eps=self.tol)) and
            (not (fe.near(x[1], self.top, eps=self.tol) or
                  fe.near(x[0], self.right, eps=self.tol)))
        )

    def map(self, x, y):
        if (fe.near(x[0], self.right, eps=self.tol) and
            fe.near(x[1], self.top, eps=self.tol)):
            # top-right corner maps to the bottom-left corner
            y[0] = self.left
            y[1] = self.bottom
        elif fe.near(x[0], self.right, eps=self.tol):
            # right edge maps to the left edge
            y[0] = self.left
            y[1] = x[1]
        elif fe.near(x[1], self.top, eps=self.tol):
            # top edge maps to the bottom edge
            y[0] = x[0]
            y[1] = self.bottom
        else:
            # not on the mapped boundary: send to a point off the domain
            y[0] = self.left - self.width
            y[1] = self.bottom - self.height
        logPERIODIC('x, y', x, y)
        if any(np.isnan(y)):
            logPERIODIC('map returns nan: x, y', x, y)
            raise KSDGException('PeriodicDomain2D.map returns nan')
        return

class PeriodicDomain3D(SubDomain):
...

def pbdomain(
    periodic=False,
    dim=1,
    width=1.0
    ):
    """pbdomain -- constrained_domain for periodic boundary conditions
    Arguments:
    periodic=False: whether to use periodic boundary
        conditions. Returns None if periodic is False
    dim=1: Number of dimensions
    width=1.0: linear dimension of domain
    """
    if periodic:
        if dim == 1:
            domain = PeriodicDomain1D(dim=dim, left=0.0, right=width)
        elif dim == 2:
            domain = PeriodicDomain2D(dim=dim,
                                      left=0.0, right=width,
                                      bottom=0.0, top=width)
        elif dim == 3:
            domain = PeriodicDomain3D(dim=dim,
                                      left=0.0, right=width,
                                      bottom=0.0, top=width,
                                      below=0.0, above=width)
        else:
            raise KSDGException('pbdomain: dim must be 1, 2, or 3')
    else:
        domain = None
    return domain


The logMAIN and logPERIODIC calls log their arguments when an environment variable is appropriately set. From this logging I know that, when periodic is True and I run with mpiexec -n 4, the call that creates Cd1 never returns. The logging also shows that PeriodicDomain2D.map is called many times during this call and always sets y to what you would think it ought to (or at least, what *I* would think it ought to).

If I run this as a single sequential process, it seems to work as expected. Also, if I run it with periodic=False, it likewise behaves as expected. Only when run under MPI with periodic boundary conditions does it deadlock.
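
For reference, here is a stripped-down sketch of the same pattern on a unit square with a CG1 space (the mesh size, degree, and file name are just placeholders for illustration; the KSDG-specific code is omitted):

# periodic_mwe.py -- run with, e.g.: mpiexec -n 4 python3 periodic_mwe.py
import fenics as fe

class PeriodicBoundary(fe.SubDomain):
    """Identify the right edge with the left and the top with the bottom."""

    def inside(self, x, on_boundary):
        # the left and bottom edges are the target boundary; exclude
        # points that are images of the right or top edges
        return bool(on_boundary and
                    (fe.near(x[0], 0.0) or fe.near(x[1], 0.0)) and
                    not (fe.near(x[0], 1.0) or fe.near(x[1], 1.0)))

    def map(self, x, y):
        if fe.near(x[0], 1.0) and fe.near(x[1], 1.0):
            y[0] = x[0] - 1.0       # top-right corner -> bottom-left corner
            y[1] = x[1] - 1.0
        elif fe.near(x[0], 1.0):
            y[0] = x[0] - 1.0       # right edge -> left edge
            y[1] = x[1]
        elif fe.near(x[1], 1.0):
            y[0] = x[0]             # top edge -> bottom edge
            y[1] = x[1] - 1.0
        else:
            y[0] = -1000.0          # anything else: map far outside the domain
            y[1] = -1000.0

mesh = fe.UnitSquareMesh(32, 32)
V = fe.FunctionSpace(mesh, 'CG', 1,
                     constrained_domain=PeriodicBoundary())
print('global dofs:', V.dim())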

Any insights would be appreciated. Thanks.

@Leon: I have had problems with MPI and periodic BCs for a while, and I posted a question a few months ago:
https://www.allanswered.com/post/xnqop/mpi-segmentation-fault-with-c-demo-periodic/

There has been no response to my question on this board to date (nor to others on this topic). That may mean a fix is in the works for a coming release, but no one seems to have any information about it.
written 9 weeks ago by jwinkle  
Thanks. I saw your question, and I also saw this issue, which looks unrelated but perhaps is not: periodic-boundary-conditions-for-dg.

In the meantime, I think I have figured out an alternative way to handle PBCs: express my function as the sum of an even function (with Neumann BCs, solved on half the domain, or a quarter of it in 2D) and an odd function (with Dirichlet BCs).
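Sketching the idea in 1D, assuming the domain is [0, W] with period W (the 2D case applies the same argument in each coordinate): write u = u_e + u_o with u_e(x) = (u(x) + u(-x))/2 and u_o(x) = (u(x) - u(-x))/2, using the periodic extension of u for x < 0. Periodicity then forces u_e'(0) = u_e'(W/2) = 0 (homogeneous Neumann) and u_o(0) = u_o(W/2) = 0 (homogeneous Dirichlet), so the even and odd parts live naturally on [0, W/2] with Neumann and Dirichlet conditions respectively.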
written 9 weeks ago by Leon Avery  