Hi everybody,
My question is related to the data interpolation between different meshes for parallel simulations. The problem deals with a 3D thermal flow around a rectangular cylinder built in a rectangular channel.
I have obtained a solution to the problem after more than two weeks of simulation, running on a cluster with eight processors and more than 9 GB of RAM. Now I want to solve the same problem with a finer mesh. To do that, I searched the manuals and the Elmer discussion forum and found two different approaches:
· The topic "Interpolation onto reference grid"
· The ResultToResult program
I have tried both ways and they work fine for sequential simulations, but they crash when I use them with parallel simulations.
So, am I doing something wrong, or is interpolation between different meshes not possible in Elmer?
Thanks in advance,
Marcos
Interpolation onto reference grid for parallel cases
Re: Interpolation onto reference grid for parallel cases
Unfortunately this does not, to my knowledge, work in parallel.
Re: Interpolation onto reference grid for parallel cases
Hi,
Searching the repository, I've found some revisions of the ResultToResult standalone program, implemented after this post, which mention the addition of parallel capabilities, but AFAIK there's no documentation on either the serial or the parallel functionality. Does this program actually work with partitioned meshes? And, if so, can someone please give me some guidelines on how to use it?
Thanks in advance,
Cesar