This problem runs fine on 1 cpu, but with mpirun the contact surfaces do not work. Is this expected? Is there something missing?
Kevin
Re: Contact Surfaces MPI
Hi
Well, you should arrange for the contact surfaces to end up in the same partition. We have been lazy, and the mortar integration assumes that it finds the counterpart surface in the same partition. In ElmerGrid this is done with the "-connect bc_ids" flag.
For larger cases this is of course a bottleneck. There are additional flags and developments that enable special halo elements, so that the contact pairs can be divided over a smaller number of partitions. However, you should be able to use at least a few cores with the first approach alone.
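As a rough sketch of where the flag goes, an ElmerGrid partitioning call might look like the following; the mesh name, partition count, and partitioning method are placeholders here, and only the "-connect" flag with the two contact BC ids is the essential part:

```
# Hypothetical example: partition mesh "beam" into 4 parts while forcing
# the contact pair (placeholder BC ids 1 and 2) into the same partition.
ElmerGrid 2 2 beam -metis 4 -connect 1 2
```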
-Peter
Re: Contact Surfaces MPI
Thanks, good to know.
Kevin
Re: Contact Surfaces MPI
To be more specific, there are test cases "ContactPatch3D*" where you can see in runtest.cmake that the partitioning is done such that BCs 58 and 59 (the contact pair) are married with "-connect 58 59".
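The corresponding command-line call would be along these lines; the mesh name and partition count below are placeholders rather than what the actual test uses, the point is simply how "-connect 58 59" attaches to the partitioning:

```
# Hypothetical sketch mirroring the ContactPatch3D* tests: keep the
# contact pair (BCs 58 and 59) in the same partition while partitioning.
ElmerGrid 2 2 contactmesh -metis 2 -connect 58 59
```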
-Peter