Contact Surfaces MPI

Numerical methods and mathematical models of Elmer
kevinarden
Posts: 2311
Joined: 25 Jan 2019, 01:28
Antispam: Yes

Contact Surfaces MPI

Post by kevinarden »

This problem runs fine on one CPU, but with mpirun the contact surfaces do not work. Is this expected? Is there something missing?
solid.sif
(2.9 KiB) Downloaded 249 times
Kevin
raback
Site Admin
Posts: 4828
Joined: 22 Aug 2009, 11:57
Antispam: Yes
Location: Espoo, Finland

Re: Contact Surfaces MPI

Post by raback »

Hi

Well, you should force the contact surfaces to end up in the same partition. We have been lazy, and the mortar integration assumes that it finds the counterpart in the same partition. In ElmerGrid this is done with the "-connect bc_ids" flag.
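For example, a minimal ElmerGrid call that partitions an existing ElmerSolver mesh geometrically into 2 x 2 x 1 = 4 pieces while keeping two boundaries in the same partition could look like this (the mesh directory name "mymesh" and the BC ids 1 and 2 are placeholders for your own case):

  ElmerGrid 2 2 mymesh -partition 2 2 1 -connect 1 2   # "2 2" = ElmerSolver mesh format in and out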

For larger cases this is of course a bottleneck. There are additional flags and developments that enable special halo elements so that the contact pairs can be divided among a smaller number of partitions. However, you should be able to use at least a few cores with the first approach alone.

-Peter
kevinarden
Posts: 2311
Joined: 25 Jan 2019, 01:28
Antispam: Yes

Re: Contact Surfaces MPI

Post by kevinarden »

Thanks, good to know.

Kevin
raback
Site Admin
Posts: 4828
Joined: 22 Aug 2009, 11:57
Antispam: Yes
Location: Espoo, Finland

Re: Contact Surfaces MPI

Post by raback »

To be more specific, there are test cases "ContactPatch3D*" where you can see in runtest.cmake that the partitioning is done such that BCs 58 and 59 (the contact pair) are married with "-connect 58 59".
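The exact invocation in runtest.cmake may differ, but the pattern is roughly the following (the partition count here is just for illustration, and the -metis flag assumes ElmerGrid was built with Metis support):

  ElmerGrid 2 2 mesh -metis 2 -connect 58 59   # keep the contact pair BCs 58 and 59 in one partition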

-Peter