I’m currently running a case where, with p = 1, the solution is stable with only sporadic filtering (e.g. every 1000 steps), but as soon as I increase to p = 4 it can only progress in time if I filter every step. It is the first time I have faced such a condition.
Do you have any recommendations on what to look at? Diffusion is playing a key role in stabilising the run, but I need to avoid such frequent filtering to reduce the computational time.
It is hard to say without knowing more about your case and how it diverges: things such as the mesh spacing, whether anti-aliasing is enabled, the time stepping scheme, etc.
Solution filtering is also known to interact poorly with adaptive time stepping, so you may want to revert to a fixed time step in such cases.
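For intuition on why per-step filtering behaves so differently from sporadic filtering, here is a minimal sketch of the kind of exponential modal filter commonly used in high-order codes. The names, the alpha/order values, and the Legendre basis are assumptions for illustration only, not taken from your solver:

```python
import numpy as np
from numpy.polynomial import legendre

# Hypothetical illustration (not from any specific solver): an exponential
# modal filter sigma(k) = exp(-alpha * (k/p)**s) applied to a 1D nodal
# solution of polynomial order p.
p = 4
alpha, s = 36.0, 16                      # assumed filter strength and order

x, _ = legendre.leggauss(p + 1)          # nodal (Gauss-Legendre) points
V = legendre.legvander(x, p)             # Vandermonde: V[i, k] = P_k(x_i)
sigma = np.exp(-alpha * (np.arange(p + 1) / p) ** s)

def filter_solution(u_nodal):
    """Damp the modal coefficients of a nodal solution and transform back."""
    u_modal = np.linalg.solve(V, u_nodal)    # nodal -> modal
    return V @ (sigma * u_modal)             # damp modes, modal -> nodal

# Filtering compounds: after n applications mode k is scaled by sigma(k)**n.
print(sigma)            # damping per application
print(sigma ** 1000)    # effective damping after 1000 per-step applications
```

The point of the last two lines is that the damping compounds with every application, so a filter that is essentially invisible when applied every 1000 steps can become the dominant source of dissipation when applied every step, which may be masking rather than fixing the underlying instability at p = 4.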