
Standard Python solvers for the Fluid Dynamics Application #871

Closed
4 tasks done
jcotela opened this issue Oct 20, 2017 · 20 comments · Fixed by #4929
Comments

@jcotela (Member) commented Oct 20, 2017

We need to do a bit of clean-up of the FluidDynamicsApplication Python solvers: right now we have a lot of them, with abundant duplication. I think this would also be a good opportunity to settle on a common solver class and a uniform way of dealing with processes or, at the very least, to have a set of "clean and nice" solvers that work from GiD.

Tasks:

  • Identify the minimal set of Python solvers needed for the different combinations: monolithic/embedded/fractional step, OpenMP/MPI...

  • Ensure that all "clean" solvers derive from the base fluid_solver class, modifying it if needed.

  • Ensure that this minimal set of solvers works properly from the GiD interface.

  • Remove as many of the "old" solvers as possible.
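[Editor's note] The second task above could be sketched roughly as follows. This is a minimal illustration, not the actual Kratos API: the class and method names (`FluidSolver`, `AddVariables`, `SolveSolutionStep`) are placeholders for whatever the real base class defines.

```python
# Illustrative sketch only: a common base class so that each solver variant
# overrides just the steps that differ. Names are placeholders, not the real
# Kratos interface.

class FluidSolver:
    """Shared workflow for all fluid solvers."""

    def __init__(self, settings):
        self.settings = settings

    def AddVariables(self):
        raise NotImplementedError("Derived solvers must declare their variables.")

    def SolveSolutionStep(self):
        raise NotImplementedError("Derived solvers must implement the solve step.")


class MonolithicSolver(FluidSolver):
    """Example derived solver: only variant-specific pieces are overridden."""

    def AddVariables(self):
        return ["VELOCITY", "PRESSURE"]  # illustrative variable names

    def SolveSolutionStep(self):
        return "monolithic step solved"
```

With this shape, the "minimal set" in the first task becomes one derived class per formulation, and shared logic lives in one place.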

@pooyan-dadvand (Member)

👍 for adding this for the next release!

@jcotela (Member, Author) commented Oct 26, 2017

Hi guys, I'm realizing that this is a very ambitious change for a week (the planned date for the fork), so I'm dropping it from the next Release. I'd postpone it to the eventual release 5.3. :(

@jcotela jcotela removed this from the Release 5.2 milestone Oct 26, 2017
@jcotela jcotela added Cleanup and removed Release labels Oct 26, 2017
@philbucher (Member)

may I add #1090 to the list? :)

@philbucher (Member)

And maybe what I did in StructuralMechanics: having the possibility to define variables in ProjectParams that are to be added to the ModelPart; see #1113 (see auxiliary_variables_list).

@jcotela (Member, Author) commented Dec 11, 2017

Hi @philbucher, I'll look into it. However, I'd prefer a solution along the lines of #869, at least for utilities or processes that will always need the same variables.

@philbucher (Member)

Hi @jcotela, it is just a suggestion; it might not be relevant for the fluids.

Just as an example of when I need it: some elements in the StructuralMechanicsApp can compute results for variables that are not added by the solver by default. Those I can now add through ProjectParams to ask for results in the output process.
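[Editor's note] The auxiliary_variables_list idea can be sketched in plain Python. This is a mock, not the real Kratos Parameters/ModelPart API; only the key name "auxiliary_variables_list" is taken from #1113.

```python
# Mock sketch of the auxiliary_variables_list mechanism: variables named in
# the project parameters are added on top of the solver's defaults. Plain
# Python stand-ins are used instead of the real Kratos Parameters/ModelPart.
import json

def add_auxiliary_variables(model_part_variables, project_parameters_json):
    params = json.loads(project_parameters_json)
    for name in params.get("auxiliary_variables_list", []):
        if name not in model_part_variables:  # avoid duplicates
            model_part_variables.append(name)
    return model_part_variables

solver_defaults = ["VELOCITY", "PRESSURE"]  # added by the solver by default
params = '{"auxiliary_variables_list": ["REACTION", "NODAL_AREA"]}'
all_variables = add_auxiliary_variables(solver_defaults, params)
# all_variables now also contains REACTION and NODAL_AREA
```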

@jcotela (Member, Author) commented Dec 11, 2017

@philbucher I agree that this is needed, also for the fluids. And your solution is the only one that works as the code is now. Still, I think it is the job of the output process to read those variables and take care that they have been added, as this would make the code more modular.
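[Editor's note] The division of responsibility suggested here could be sketched as follows; the function name is illustrative, not the actual output-process code.

```python
# Illustrative sketch: the output process itself ensures that every variable
# it is asked to write has been added to the model part, instead of relying
# on the user to extend the solver's variable list.

def ensure_output_variables(requested, model_part_variables):
    """Add any requested output variable the model part is still missing."""
    for name in requested:
        if name not in model_part_variables:
            model_part_variables.append(name)
    return model_part_variables
```

Keeping this check inside the output process means solvers stay unaware of output requests, which is the modularity argument made above.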

@philbucher (Member)

Now I understand, the output process is treated in the same way! Somehow I hadn't seen this before.

But yes, it makes absolute sense (I will revert my changes then once the adding is working).

@jcotela (Member, Author) commented Jun 5, 2018

Just for future reference, this is the proposed list of solvers:

  • Base python fluid solver class FluidSolver. This is being implemented in Fluid/feature python solver derivation #2262.
  • Monolithic incompressible fluid solver, currently called NavierStokesSolverVMSMonolithic. It already supports other stabilizations, so the name should be changed. Ideally, it should also support different time schemes (BDF2/Bossak/"internal" element-based time integration)
  • Monolithic compressible flow solver, currently called NavierStokesCompressibleSolver.
  • Embedded incompressible fluid solver. Now we have NavierStokesEmbeddedMonolithicSolver, NavierStokesEmbeddedMonolithicAusasSolver and NavierStokesEmbeddedMonolithicFMALESolver. Ideally, we'd merge the different formulations with a trick similar to what we do to choose the stabilized formulation in the body-fitted monolithic solver, but we'll see how it can be best done.
  • Fractional step incompressible flow solver, currently NavierStokesSolverFractionalStep.
  • Trilinos version of NavierStokesSolverVMSMonolithic (currently TrilinosNavierStokesSolverMonolithic)
  • Trilinos version of the embedded solver(s). Currently we have TrilinosNavierStokesEmbeddedMonolithicSolver and TrilinosNavierStokesEmbeddedMonolithicAusasSolver.
  • An up-to-date version of the two-fluid solver.
  • Trilinos version of the two-fluid solver?
  • Trilinos version of the compressible solver?
  • Stokes solver. This one could potentially fit as a different "formulation" for the monolithic incompressible solver.
  • Trilinos Stokes solver. This one could fit as a variant formulation of the Trilinos monolithic solver.
  • Adjoint sensitivity solver: porting adjoint_vmsmonolithic_solver to new framework.
  • Trilinos adjoint solver. MPI version of the adjoint sensitivity solver (currently not available).

All current solvers not on the list will be removed in the final stage of the clean-up. Most of them are outdated or duplicates, so no features should be lost.

@jcotela (Member, Author) commented Jun 19, 2018

Note to self/to-do: default settings for different solvers should have "wrong" values when they must be set from outside (domain_size is an obvious example, which is currently initialized to 2 in most solvers).
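[Editor's note] A sketch of this invalid-default idea; the setting names are illustrative, not the actual solver settings schema.

```python
# Sketch: mandatory settings default to values that cannot pass validation,
# so a forgotten setting fails loudly instead of silently running a 2D case.
# Setting names are illustrative.

DEFAULTS = {"domain_size": -1, "model_part_name": ""}

def validate_settings(user_settings):
    settings = {**DEFAULTS, **user_settings}
    if settings["domain_size"] not in (2, 3):
        raise ValueError("'domain_size' must be explicitly set to 2 or 3.")
    if not settings["model_part_name"]:
        raise ValueError("'model_part_name' must be explicitly set.")
    return settings
```

The point is that -1 can never validate, whereas a default of 2 silently produces a plausible but possibly wrong configuration.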

@philbucher philbucher added this to the Release 6.1 milestone Dec 27, 2018
@philbucher (Member)

I think it would be nice to have this for the release

@jcotela jcotela removed this from the Release 7.0 milestone Feb 1, 2019
@philbucher (Member)

Just to add that it might be worth changing the default MPI solver from Trilinos-MultiLevel to AMGCL (FYI @ddemidov), since @AndreasWinterstein found in his recent scaling studies that it is much faster.

Also, then the same solver would be used in serial and MPI.

@RiccardoRossi (Member) commented Feb 5, 2019 via email

@pooyan-dadvand (Member)

kudos @ddemidov!

@philbucher (Member)

adding #4390 to the list

@philbucher (Member)

adding #4736 to the list (once #4736 is merged)

@philbucher (Member)

is this really closed?
I think there are some things left to do

@jcotela (Member, Author) commented Jul 15, 2019

@philbucher kind of... @rubenzorrilla and I agreed not to rename the existing solvers for now. I believe the only solver on that list that is really missing is the Trilinos compressible solver, but there is no movement on that front at the moment.

@philbucher (Member)

I meant that #4390 and #4121 are not implemented yet.

@jcotela (Member, Author) commented Jul 17, 2019

@philbucher I added them to the FluidDynamicsApplication project, I think it will be easier to keep track of them in this way.
