Do Output Intervals affect solver time?

If I run the same simulation with Output Interval = 1, and then run it again with Output Interval = 100, will the Solver Time change by a significant amount? I could just run the simulation both ways, but I find it useful to ask.
Re: Do Output Intervals affect solver time?
The total computer time will change, because it takes time to do input/output. On most systems the I/O time is significantly slower than the compute time, because drive speed has not kept up with CPU speed. The compute time itself should not change. I usually monitor this as compute time versus wall clock time.
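The compute-time-versus-wall-clock comparison can be illustrated with a small sketch (not tied to any particular solver; the loop and file writes below are hypothetical stand-ins for solver work and result output). `time.process_time()` counts only CPU time, while `time.perf_counter()` counts elapsed wall-clock time, so the gap between the two roughly reflects time spent waiting on I/O:

```python
import os
import tempfile
import time

def run(output_interval, steps=200_000):
    """Mimic a solver loop that writes output every `output_interval` steps,
    returning (wall_clock_seconds, cpu_seconds)."""
    fd, path = tempfile.mkstemp()
    wall_start = time.perf_counter()   # wall clock
    cpu_start = time.process_time()    # CPU (compute) time only
    acc = 0.0
    try:
        with os.fdopen(fd, "w") as f:
            for step in range(steps):
                acc += step * 0.5                  # stand-in for compute work
                if step % output_interval == 0:
                    f.write(f"{step} {acc}\n")     # stand-in for result output
                    f.flush()
    finally:
        os.remove(path)
    wall = time.perf_counter() - wall_start
    cpu = time.process_time() - cpu_start
    return wall, cpu

if __name__ == "__main__":
    for interval in (1, 100):
        wall, cpu = run(interval)
        print(f"interval={interval}: wall={wall:.3f}s cpu={cpu:.3f}s")
```

With a small interval the wall-clock time grows while the CPU time stays roughly the same, which is the pattern described above.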