I agree with @WillT, this is clearly an MPI problem. You should consult the HPC admins at your institution and ask which MPI implementation is recommended for your cluster (MPICH, MVAPICH, Intel MPI, Open MPI, etc.), along with any runtime flags it needs. Note: you are not required to use Open MPI just because the installation guide shows it.
MPI issues aside, I ran into other problems with the Scotch partitioner a few months back (link to post). In the end I had to explicitly select Scotch with the -pscotch flag, otherwise it would silently fall back to Metis even though the Scotch library path was defined.
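Schematically, the invocation looked something like the sketch below. The solver name, core count, and case argument are placeholders for your own setup; the only part that comes from my actual fix is the -pscotch flag:

```
# Hypothetical example -- replace "solver" and "case_input" with your
# actual executable and case, and use your cluster's MPI launcher.
# Without -pscotch, the run fell back to Metis for me, even with the
# Scotch library path set in the environment.
mpirun -np 8 solver -pscotch case_input
```

Worth double-checking the partitioner reported in the log output, since the fallback to Metis happens quietly.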