Some artefacts (nodal forces) at interchange nodes with mpi run

mabor
Posts: 2
Joined: 06 Jul 2020, 23:17

Some artefacts (nodal forces) at interchange nodes with mpi run

Post by mabor »

Hello,

I'm interested in the magnetic forces induced by current-carrying wires and coils. I made a 2D test case (MgDyn2D) with two small loops and experimented with MPI parallelisation. For some reason there are artefacts in the nodal force field, and they are clearly located at the interchange nodes between the partitioned surfaces/lines (see picture). The same case run as a single process gives an unremarkable solution.
Is this a known issue, a result of my immature Elmer skills, or a potential parallelisation bug?

Regards,
Marcel

PS: Parallelisation commands

Code:

ElmerGrid 14 2 twowires.msh -2d -out ./ -partition 1 5 1 0
mpirun -np 5 ElmerSolver_mpi 
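
For reference, the nodal forces come from the field-calculation solver; below is a minimal sketch of the kind of solver setup assumed here (not a copy of the attached case.sif; the variable name and linear system settings are just placeholders):

Code:

! Sketch only, assuming a standard MgDyn2D + CalcFields setup;
! the actual settings are in the attached case.sif.
Solver 1
  Equation = MgDyn2D
  Procedure = "MagnetoDynamics2D" "MagnetoDynamics2D"
  Variable = A                           ! magnetic vector potential (scalar in 2D)
  Linear System Solver = Iterative
  Linear System Iterative Method = BiCGStab
  Linear System Preconditioning = ILU0
  Linear System Convergence Tolerance = 1.0e-10
End

Solver 2
  Equation = CalcFields
  Procedure = "MagnetoDynamics" "MagnetoDynamicsCalcFields"
  Potential Variable = String "A"
  Calculate Magnetic Field Strength = Logical True
  Calculate Nodal Forces = Logical True  ! the field showing the artefacts
End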
Attachments
twowires.geo (969 Bytes) - geometry file for Gmsh
case.sif (3.77 KiB)
nodal_nodes_artefact.png (106.15 KiB)
raback
Site Admin
Posts: 4812
Joined: 22 Aug 2009, 11:57
Location: Espoo, Finland

Re: Some artefacts (nodal forces) at interchange nodes with mpi run

Post by raback »

Hi

I must say that I don't remember off the top of my head how the nodal forces are treated in parallel. Basically we could have them distributed such that at the interfaces they sum up to zero (or close to it). The shared nodes get contributions from both sides of the interface. I guess you get conflicting results?
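
Spelled out (this is just my reading, I haven't checked the code): the full nodal force at a node i is the sum of the element contributions from all elements sharing that node, and in parallel each partition only integrates over its own elements, so at an interface node each partition holds only a partial value,

F_i = \sum_p F_i^{(p)} ,

where p runs over the partitions touching node i. In the vacuum region the full F_i is (close to) zero while the individual F_i^{(p)} are not, so if the partial values are written out without being summed over the partitions, spikes appear exactly at the partition interfaces.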

-Peter
mabor
Posts: 2
Joined: 06 Jul 2020, 23:17

Re: Some artefacts (nodal forces) at interchange nodes with mpi run

Post by mabor »

Hi Peter,

besides the missing weighting factor of 2*Pi for cylindrically symmetric cases (see your post: viewtopic.php?f=3&t=6959&start=10#p22098), the resultant magnetic forces on the wire components are slightly different from the single-process result but acceptable (good results are achieved if the distance between the loops is <= 1; for larger distances the mesh resolution must be better, I guess). I would say this is because there is virtually only one partition interface line with small errors within a wire.
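
(Just to spell out the factor I mean: in a cylindrically symmetric model the volume element is 2\pi r \, dr \, dz, so the physical axial force on a loop is F_y = 2\pi \int r \, f_y(r,z) \, dr \, dz; if the summed nodal forces carry only the \int r \, f_y \, dr \, dz part, the resultant is low by exactly 2\pi.)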
The situation is different for the vacuum component: as the picture indicates, the resulting force in the y-direction is about two orders of magnitude away from the single-process result (which is virtually zero). That is, I would say, a completely unphysical result. (In addition, there is a resultant x-force on all components in both the parallel and the single-process case. I really do not understand this, but it is another issue.)

I compared the results for a larger test case with two thick coils. The results in the y-direction are different but comparable. Here the missing weighting factor is blowing up the correct solution, I guess.

But here comes the real problem:
If the nodal forces are coupled to the linear elasticity solver via Displacement Loads, the solver may fail to converge when these sharing artefacts are left in. In the small test cases the solution converges, but it takes many more iterations. In a huge case it does not converge.
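
(The coupling I mean is of the kind sketched below; this is only a sketch of the body force section, assuming the elasticity variable is named "Displacement" and the nodal force field keeps its default name "Nodal Force".)

Code:

! Sketch of the assumed coupling (not copied from the attached files):
! the nodal force field from MagnetoDynamicsCalcFields is applied as a
! nodal load on the elasticity variable via the generic "Load" keyword.
Body Force 1
  Displacement 1 Load = Equals "Nodal Force 1"
  Displacement 2 Load = Equals "Nodal Force 2"
End

With a direct nodal load like this, any spurious spike at a shared node goes straight into the right-hand side of the elasticity solve, which would explain the slower or failing convergence.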

Regards,
Marcel