Failed to run example cases using openmp as backend

Did you try following Freddie’s advice about manually compiling libxsmm with the flags he specified?

Yes. Following the install guide, I built PyFR 1.14.0 from source and compiled every dependent package from source as well. Every package compiled successfully, but errors occurred when I ran the example cases.

However, I am sorry that I did not save the error output; it was something like “undefined symbol”. I am currently using an older version of PyFR (1.12.3). Once my running simulation finishes, I can try to reproduce the error output.

This error is usually a consequence of libxsmm not being compiled with the BLAS=0 flag.
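A quick way to check an existing build (a sketch, assuming `libxsmm.so` sits under `$PYFR_XSMM_LIBRARY_PATH`): a library that was not built with `BLAS=0` typically carries undefined `*gemm` symbols in its dynamic symbol table, which is what surfaces as the “undefined symbol” error when PyFR loads it.

```shell
# Sketch: inspect the dynamic symbol table of the built library.
# An undefined ("U") *gemm symbol means the build still expects an
# external BLAS; a BLAS=0 build should report none.
LIB="${PYFR_XSMM_LIBRARY_PATH:-.}/libxsmm.so"
if [ -f "$LIB" ]; then
    nm -D "$LIB" | grep -E ' U .*gemm' || echo "no undefined BLAS symbols"
else
    echo "libxsmm.so not found at $LIB"
fi
```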

Regards, Freddie.

I am sure that I have added that flag. I did it in this way:

git checkout 14b6cea61376653b2712e3eefa72b13c5e76e421
make install

I may try again this weekend.

I am getting the same error when trying to run PyFR with the OpenMP backend, even though, to the best of my understanding, libxsmm has been compiled as instructed. I would be happy to know whether, and where, I am going wrong.

On my system:

$ cd ~/Programming/libxsmm
$ make -j4 SHARED=1 NOBLAS=1
$ export PYFR_XSMM_LIBRARY_PATH=$HOME/Programming/libxsmm/lib/
$ cd ../PyFR/examples/euler_vortex_2d
$ pyfr run -bopenmp -p euler_vortex_2d.pyfrm euler_vortex_2d.ini
100.0% [====================================================================================>] 100.00/100.00 ela: 00:00:07 rem: 00:00:00

with PyFR being a straight checkout of v1.14 and libxsmm being the current master revision. This was also verified on an Apple M1 system.

Regards, Freddie.

Thank you, now it works.

This way works for me, too. Thank you.

However, I found a strange problem. When using the OpenMP backend, the CPU usage is always nearly full, even if I intend to use only one core:

pyfr run -b openmp -p inc_cylinder_2d.pyfrm inc_cylinder_2d.ini

(There are 24 cores on my PC, but overall CPU usage was more than 90% when I ran the above command.)

@fdw @EitanA Did you see such a problem?

I found an answer here.
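For anyone landing here later: the near-full CPU usage comes from the OpenMP runtime, which by default spawns one thread per logical core. A minimal sketch using the standard `OMP_NUM_THREADS` environment variable (an OpenMP feature, not anything PyFR-specific):

```shell
# Cap the OpenMP thread pool before launching; with 24 logical cores
# the default is 24 threads, hence the >90% overall usage.
export OMP_NUM_THREADS=1
# Then run the solver as before (mesh/ini files assumed to be present):
# pyfr run -b openmp -p inc_cylinder_2d.pyfrm inc_cylinder_2d.ini
```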


Thank you all for your help.