Simulation with Mesh Partitioning using SCOTCH

I am trying to use PyFR with SCOTCH on my university's compute cluster. Since I do not have sudo rights, I cannot use sudo apt install, so I have to build SCOTCH from the source in their GitHub repo.

I followed the instructions in the INSTALL.txt file in the SCOTCH repo and tried installing it with CMake.

I created a virtual environment for my PyFR installation:

module load python/3.9.6
python3 -m venv pyfr
source pyfr/bin/activate

module load openmpi
module load paraview
module load gcc/10
module load cuda
cd pyfr

After cloning the SCOTCH repo, I ran the following commands:

mkdir build
cd build
cmake -DCMAKE_INSTALL_PREFIX=/path/to/install/ -DCMAKE_BUILD_TYPE=Release ..

make -j 128 

env CTEST_OUTPUT_ON_FAILURE=1 ctest

make install

I installed pyfr using pip:

pip install pyfr

The documentation says to specify the SCOTCH path using PYFR_SCOTCH_LIBRARY_PATH=/path/to/libscotch.so, but there is no .so file in my SCOTCH/lib folder.
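For reference, once a shared library exists, I assume the variable would be set like this before running PyFR (install prefix hypothetical):

export PYFR_SCOTCH_LIBRARY_PATH=/path/to/install/lib/libscotch.so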

In the euler_vortex_2d folder from the PyFR examples, I run the following commands (as described on the PyFR website):

pyfr import euler_vortex_2d.msh euler_vortex_2d.pyfrm
pyfr partition 2 euler_vortex_2d.pyfrm .

The partition command does not create any new files (and I am not sure whether it should). Regardless, I execute the command:

mpiexec -n 2 pyfr run -b cuda -p euler_vortex_2d.pyfrm euler_vortex_2d.ini

I get an error, shown in the attached screenshot.

When I run PyFR without partitioning, it runs correctly and produces .pyfrs files:

pyfr import euler_vortex_2d.msh euler_vortex_2d.pyfrm
pyfr run -b cuda -p euler_vortex_2d.pyfrm euler_vortex_2d.ini

Note that in the case without partitioning, even if I run mpiexec -n 1 pyfr run -b cuda -p euler_vortex_2d.pyfrm euler_vortex_2d.ini, I get a similar error to the one in the attached screenshot.

I am not sure whether this is caused by a problem with Open MPI or with PyFR.

If I have followed what you've done correctly, libscotch.so should be located in /path/to/install/lib.

But given that PyFR didn't throw an error when you ran pyfr partition ..., this is not likely to be the issue.

It seems that there is an issue with your MPI installation. The first thing to check is whether the MPI module you currently have loaded is the same one that was loaded when you ran pip install pyfr. A version mismatch between the current MPI and the version mpi4py was built against can cause issues. (In case you didn't know, you can use module save ... and module restore ... to save and restore module sets between sessions. I normally name the saved collection the same as the venv.)
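For example, assuming a venv (and module collection) named pyfr:

module save pyfr      # snapshot the currently loaded modules
module restore pyfr   # bring the same modules back in a later session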

Thank you for your answer.

Actually, there is no file with the .so extension, so I am not sure which path to specify for PYFR_SCOTCH_LIBRARY_PATH. I suspect this is the main reason the partitioning is not working, but I'm not sure.

I checked the versions of Open MPI and mpi4py before and after installing PyFR. Both are at 3.1.3, so I don't think there is a version mismatch.
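For reference, I believe a quick way to compare the two is something like this (the second command prints the MPI library that mpi4py is actually linked against):

mpiexec --version
python -c "from mpi4py import MPI; print(MPI.Get_library_version())"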

So when you look in the lib or lib64 directory under the prefix you specified, there are no libscotch files? Did you definitely run make install?

Can you try a simple mpi4py test? Something like:

from mpi4py import MPI

if __name__ == '__main__':
    # Print this rank's index and the total number of ranks
    comm = MPI.COMM_WORLD
    print(f'rank {comm.rank} of {comm.size}')

run with mpiexec -n 4 python mpi_test.py
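If MPI and mpi4py are consistent, you should see one line per rank, something like (order may vary):

rank 0 of 4
rank 1 of 4
rank 2 of 4
rank 3 of 4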

There is a libscotch.a file, but no libscotch.so file, in the lib folder.

I tried running your script, but I got an error (see the attached screenshot).

OK, so when you run cmake you need to enable shared libraries and rebuild SCOTCH.
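Something like this should do it, assuming the SCOTCH CMake build honours the standard BUILD_SHARED_LIBS option:

cd build
cmake -DCMAKE_INSTALL_PREFIX=/path/to/install/ -DCMAKE_BUILD_TYPE=Release -DBUILD_SHARED_LIBS=ON ..
make -j 128
make install

After that, libscotch.so should appear under /path/to/install/lib.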

If that simple Python script doesn't work, then it looks like there might be an issue with your MPI installation.

I agree with @WillT; you clearly have an MPI problem. You should consult the HPC admins at your institution and ask which MPI implementation is recommended for your cluster (MPICH, MVAPICH, Intel MPI, Open MPI, etc.), along with any runtime flags that may be needed. Note that you do not need to use Open MPI as shown in the installation guide.

MPI issues aside, I had other problems getting the Scotch partitioner to work a few months back (link to post). In the end I needed to explicitly select the Scotch partitioner with the -p scotch flag; otherwise it would fall back to METIS even though the Scotch library path was defined.
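In other words, something along these lines (flag syntax as I remember it):

pyfr partition -p scotch 2 euler_vortex_2d.pyfrm .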