Error Running PyFR Across Multiple Nodes

PyFR runs completely fine with 4 GPUs on a single node. However, when I try to run with 20 GPUs across 10 nodes, I get errors. I am using mpi-type = cuda-aware and device-id = local-rank in my [backend-cuda] section. Is there something else I need to do when running across multiple nodes? Here is the error output:

GpuFreq=control_disabled
GpuFreq=control_disabled
GpuFreq=control_disabled
GpuFreq=control_disabled
GpuFreq=control_disabled
GpuFreq=control_disabled
GpuFreq=control_disabled
GpuFreq=control_disabled
GpuFreq=control_disabled
GpuFreq=control_disabled
[p0326.ten.osc.edu:mpi_rank_4][rdma_open_hca] User specified HCA mlx5_2 does not have an IP address. Disabling RDMA_CM based multicast.
[p0326.ten.osc.edu:mpi_rank_5][rdma_open_hca] User specified HCA mlx5_2 does not have an IP address. Disabling RDMA_CM based multicast.
[p0324.ten.osc.edu:mpi_rank_1][rdma_open_hca] User specified HCA mlx5_2 does not have an IP address. Disabling RDMA_CM based multicast.
[p0330.ten.osc.edu:mpi_rank_13][rdma_open_hca] User specified HCA mlx5_2 does not have an IP address. Disabling RDMA_CM based multicast.
[p0330.ten.osc.edu:mpi_rank_12][rdma_open_hca] User specified HCA mlx5_2 does not have an IP address. Disabling RDMA_CM based multicast.
[p0327.ten.osc.edu:mpi_rank_7][rdma_open_hca] User specified HCA mlx5_2 does not have an IP address. Disabling RDMA_CM based multicast.
[p0329.ten.osc.edu:mpi_rank_11][rdma_open_hca] User specified HCA mlx5_2 does not have an IP address. Disabling RDMA_CM based multicast.
[p0325.ten.osc.edu:mpi_rank_3][rdma_open_hca] User specified HCA mlx5_2 does not have an IP address. Disabling RDMA_CM based multicast.
[p0335.ten.osc.edu:mpi_rank_17][rdma_open_hca] User specified HCA mlx5_2 does not have an IP address. Disabling RDMA_CM based multicast.
[p0331.ten.osc.edu:mpi_rank_15][rdma_open_hca] User specified HCA mlx5_2 does not have an IP address. Disabling RDMA_CM based multicast.
[p0336.ten.osc.edu:mpi_rank_19][rdma_open_hca] User specified HCA mlx5_2 does not have an IP address. Disabling RDMA_CM based multicast.
[p0329.ten.osc.edu:mpi_rank_10][rdma_open_hca] User specified HCA mlx5_2 does not have an IP address. Disabling RDMA_CM based multicast.
[p0336.ten.osc.edu:mpi_rank_18][rdma_open_hca] User specified HCA mlx5_2 does not have an IP address. Disabling RDMA_CM based multicast.
[p0335.ten.osc.edu:mpi_rank_16][rdma_open_hca] User specified HCA mlx5_2 does not have an IP address. Disabling RDMA_CM based multicast.
[p0325.ten.osc.edu:mpi_rank_2][rdma_open_hca] User specified HCA mlx5_2 does not have an IP address. Disabling RDMA_CM based multicast.
[p0328.ten.osc.edu:mpi_rank_8][rdma_open_hca] User specified HCA mlx5_2 does not have an IP address. Disabling RDMA_CM based multicast.
[p0328.ten.osc.edu:mpi_rank_9][rdma_open_hca] User specified HCA mlx5_2 does not have an IP address. Disabling RDMA_CM based multicast.
[p0327.ten.osc.edu:mpi_rank_6][rdma_open_hca] User specified HCA mlx5_2 does not have an IP address. Disabling RDMA_CM based multicast.
[p0324.ten.osc.edu:mpi_rank_0][rdma_open_hca] User specified HCA mlx5_2 does not have an IP address. Disabling RDMA_CM based multicast.
[p0331.ten.osc.edu:mpi_rank_14][rdma_open_hca] User specified HCA mlx5_2 does not have an IP address. Disabling RDMA_CM based multicast.
WARNING: Error in initializing MVAPICH2 ptmalloc library.Continuing without InfiniBand registration cache support.
[p0324.ten.osc.edu:mpi_rank_0][error_sighandler] Caught error: Segmentation fault (signal 11)
[p0336.ten.osc.edu:mpi_rank_19][error_sighandler] Caught error: Segmentation fault (signal 11)
[p0324.ten.osc.edu:mpi_rank_1][error_sighandler] Caught error: Segmentation fault (signal 11)
[p0336.ten.osc.edu:mpi_rank_18][error_sighandler] Caught error: Segmentation fault (signal 11)
[p0326.ten.osc.edu:mpi_rank_5][error_sighandler] Caught error: Segmentation fault (signal 11)
[p0330.ten.osc.edu:mpi_rank_13][error_sighandler] Caught error: Segmentation fault (signal 11)
[p0327.ten.osc.edu:mpi_rank_7][error_sighandler] Caught error: Segmentation fault (signal 11)
[p0328.ten.osc.edu:mpi_rank_8][error_sighandler] Caught error: Segmentation fault (signal 11)
[p0330.ten.osc.edu:mpi_rank_12][error_sighandler] Caught error: Segmentation fault (signal 11)
[p0327.ten.osc.edu:mpi_rank_6][error_sighandler] Caught error: Segmentation fault (signal 11)
[p0325.ten.osc.edu:mpi_rank_2][error_sighandler] Caught error: Segmentation fault (signal 11)
[p0328.ten.osc.edu:mpi_rank_9][error_sighandler] Caught error: Segmentation fault (signal 11)
[p0331.ten.osc.edu:mpi_rank_14][error_sighandler] Caught error: Segmentation fault (signal 11)
[p0335.ten.osc.edu:mpi_rank_16][error_sighandler] Caught error: Segmentation fault (signal 11)
[p0335.ten.osc.edu:mpi_rank_17][error_sighandler] Caught error: Segmentation fault (signal 11)
[p0331.ten.osc.edu:mpi_rank_15][error_sighandler] Caught error: Segmentation fault (signal 11)
[p0326.ten.osc.edu:mpi_rank_4][error_sighandler] Caught error: Segmentation fault (signal 11)
[p0325.ten.osc.edu:mpi_rank_3][error_sighandler] Caught error: Segmentation fault (signal 11)
[p0329.ten.osc.edu:mpi_rank_11][error_sighandler] Caught error: Segmentation fault (signal 11)
[p0329.ten.osc.edu:mpi_rank_10][error_sighandler] Caught error: Segmentation fault (signal 11)
srun: error: p0328: tasks 8-9: Segmentation fault (core dumped)
srun: error: p0336: task 18: Segmentation fault (core dumped)
srun: error: p0325: task 2: Segmentation fault (core dumped)
srun: error: p0335: task 17: Segmentation fault (core dumped)
srun: error: p0326: task 4: Segmentation fault (core dumped)
srun: error: p0325: task 3: Segmentation fault (core dumped)
srun: error: p0335: task 16: Segmentation fault (core dumped)
srun: error: p0330: tasks 12-13: Segmentation fault (core dumped)
srun: error: p0336: task 19: Segmentation fault (core dumped)
srun: error: p0331: tasks 14-15: Segmentation fault (core dumped)
srun: error: p0326: task 5: Segmentation fault (core dumped)
srun: error: p0327: tasks 6-7: Segmentation fault (core dumped)
srun: error: p0324: tasks 0-1: Segmentation fault (core dumped)
srun: error: p0329: tasks 10-11: Segmentation fault (core dumped)
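
For reference, the relevant part of my configuration file looks roughly like this (a sketch of the two settings mentioned above, not the complete file):

```
[backend-cuda]
; one MPI rank per GPU; each rank selects the device matching its
; node-local rank
device-id = local-rank
; pass device pointers directly to MPI (requires a CUDA-aware MPI build)
mpi-type = cuda-aware
```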

Does it work with mpi-type = standard?
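
That is, staging transfers through host memory rather than passing device pointers to MPI:

```
[backend-cuda]
device-id = local-rank
; fall back to copying through the host instead of CUDA-aware MPI
mpi-type = standard
```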

Regards, Freddie.

No, I get the exact same error.

Please try with either OpenMPI or MPICH (being sure to recompile mpi4py once you change out your MPI library).
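
The rebuild is important because a cached wheel of mpi4py may still be linked against the old MPI library. Something along these lines should work (exact module/compiler setup depends on your cluster; this is a sketch, not a definitive recipe):

```shell
# Drop any cached mpi4py build so pip does not reuse a wheel linked
# against the previous MPI library (MVAPICH2 in this case).
pip cache remove mpi4py

# Rebuild mpi4py from source against whichever MPI compiler wrapper
# (mpicc) is now first on your PATH.
pip install --no-binary=mpi4py --force-reinstall mpi4py

# Confirm which MPI implementation mpi4py is now linked against.
python -c "from mpi4py import MPI; print(MPI.Get_library_version())"
```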

Regards, Freddie.