
specialize calc_boundary_flux! for nonconservative terms for DGMulti #1431

Merged: 42 commits into trixi-framework:main on May 14, 2023

Conversation

@jlchan (Contributor) commented May 8, 2023

This PR enables the following boundary conditions for IdealGlmMhdEquations2D with DGMulti: BoundaryConditionDirichlet, BoundaryConditionDoNothing, and a custom slip-wall boundary condition defined in the elixir examples/dgmulti_2d/elixir_mhd_reflective_BCs.jl.

IMO there are still a few design issues (see my initial review below).
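The custom slip-wall condition works by building a mirror state whose momentum is reflected across the wall and passing that state to the surface flux. A minimal, self-contained sketch of just the reflection step (the helper name `mirror_momentum` is hypothetical; the elixir itself assembles the full 9-variable MHD state):

```julia
# Hypothetical sketch of the momentum reflection behind a slip-wall
# mirror state; not the elixir's exact code.
using LinearAlgebra: dot, norm

# Reflect a momentum vector across the wall: v ↦ v - 2 (v·n) n, where n is
# the unit outward normal. The normal component flips sign while the
# tangential component is preserved.
function mirror_momentum(rho_v, normal_direction)
    n = normal_direction / norm(normal_direction)
    return rho_v .- 2 * dot(rho_v, n) .* n
end
```

Density, energy, magnetic field, and psi stay unchanged in such a mirror state; only the momentum components are reflected.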

@ranocha (Member) commented May 8, 2023

Feel free to request a review from me when you're ready 🙂 @andrewwinters5000 should also have a look, I think, since he is our nonconservative specialist.

@jlchan marked this pull request as draft on May 8, 2023
@jlchan (Contributor, Author) commented May 8, 2023

Thanks! Still running into some bugs at the moment.

@codecov bot commented May 8, 2023

Codecov Report

Merging #1431 (c9c272e) into main (6df64e9) will increase coverage by 69.05%.
The diff coverage is 83.08%.

❗ Current head c9c272e differs from pull request most recent head 091c5b5. Consider uploading reports for the commit 091c5b5 to get more accurate results

@@             Coverage Diff             @@
##             main    #1431       +/-   ##
===========================================
+ Coverage   25.73%   94.77%   +69.05%     
===========================================
  Files         357      358        +1     
  Lines       29408    29643      +235     
===========================================
+ Hits         7566    28094    +20528     
+ Misses      21842     1549    -20293     
Flag Coverage Δ
unittests 94.77% <83.08%> (+69.05%) ⬆️

Flags with carried forward coverage won't be shown. Click here to find out more.

Impacted Files Coverage Δ
src/equations/equations.jl 90.20% <0.00%> (+5.09%) ⬆️
src/solvers/dgmulti/flux_differencing_gauss_sbp.jl 0.00% <0.00%> (ø)
src/solvers/dgsem_tree/dg_2d.jl 96.49% <ø> (+26.45%) ⬆️
src/basic_types.jl 80.00% <33.33%> (-20.00%) ⬇️
examples/dgmulti_2d/elixir_mhd_reflective_BCs.jl 100.00% <100.00%> (ø)
src/callbacks_step/glm_speed_dg.jl 100.00% <100.00%> (+100.00%) ⬆️
src/solvers/dgmulti/dg.jl 94.20% <100.00%> (+29.78%) ⬆️
src/solvers/dgmulti/flux_differencing.jl 87.41% <100.00%> (+53.24%) ⬆️

... and 305 files with indirect coverage changes

@jlchan changed the title from "add nonconservative boundary conditions to DGMulti" to "add nonconservative MHD boundary conditions to DGMulti" on May 8, 2023
Review comments on src/basic_types.jl and src/equations/ideal_glm_mhd_2d.jl (resolved)
@jlchan marked this pull request as ready for review on May 8, 2023
@sloede (Member) commented May 8, 2023

How can I test this? Right now I get

ERROR: LoadError: UndefVarError: dot not defined
Stacktrace:
     [1] boundary_condition_velocity_slip_wall(u_inner::SVector{9, Float64}, normal_direction::SVector{2, Float64}, x::SVector{2, Float64}, t::Float64, surface_flux_function::FluxLaxFriedrichs{typeof(max_abs_speed_naive)}, equations::IdealGlmMhdEquations2D{Float64})
       @ Main /mnt/ssd/home/mschlott/hackathon/jc-Trixi.jl/examples/dgmulti_2d/elixir_mhd_reflective_BCs.jl:60
     [2] calc_single_boundary_flux!(cache::NamedTuple{(:md, :Qrst_skew, :dxidxhatj, :invJ, :lift_scalings, :inv_wq, :u_values, :u_face_values, :flux_face_values, :local_values_threaded, :fluxdiff_local_threaded), Tuple{StartUpDG.MeshData{2, StartUpDG.VertexMappedMesh{Quad, Tuple{Vector{Float64}, Vector{Float64}}, …}, …}, Tuple{LinearAlgebra.Adjoint{Float64, SparseArrays.SparseMatrixCSC{Float64, Int64}}, LinearAlgebra.Adjoint{Float64, SparseArrays.SparseMatrixCSC{Float64, Int64}}}, …}}, t::Float64, boundary_condition::typeof(boundary_condition_velocity_slip_wall), boundary_key::Symbol, have_nonconservative_terms::Static.True, mesh::DGMultiMesh{2, Trixi.Affine, …}, equations::IdealGlmMhdEquations2D{Float64}, dg::DGMulti{2, Quad, SBP{StartUpDG.TensorProductLobatto}, SurfaceIntegralWeakForm{Tuple{FluxLaxFriedrichs{typeof(max_abs_speed_naive)}, typeof(flux_nonconservative_powell)}}, VolumeIntegralFluxDifferencing{Tuple{typeof(flux_hindenlang_gassner), typeof(flux_nonconservative_powell)}}, Nothing, StartUpDG.RefElemData{2, Quad, …}})
       @ Trixi /mnt/ssd/home/mschlott/hackathon/jc-Trixi.jl/src/solvers/dgmulti/dg.jl:526
     [3] calc_boundary_flux!
       @ /mnt/ssd/home/mschlott/hackathon/jc-Trixi.jl/src/solvers/dgmulti/dg.jl:438 [inlined]
     [4] macro expansion
       @ /mnt/ssd/home/mschlott/hackathon/jc-Trixi.jl/src/auxiliary/auxiliary.jl:272 [inlined]
     [5] rhs!(du::Matrix{SVector{9, Float64}}, u::Matrix{SVector{9, Float64}}, t::Float64, mesh::DGMultiMesh{2, Trixi.Affine, …}, equations::IdealGlmMhdEquations2D{Float64}, initial_condition::Function, boundary_conditions::NamedTuple{(:x_neg, :x_pos, :y_neg, :y_pos), Tuple{typeof(boundary_condition_velocity_slip_wall), typeof(boundary_condition_velocity_slip_wall), …}}, source_terms::Nothing, dg::DGMulti{2, Quad, SBP{StartUpDG.TensorProductLobatto}, SurfaceIntegralWeakForm{Tuple{FluxLaxFriedrichs{typeof(max_abs_speed_naive)}, typeof(flux_nonconservative_powell)}}, VolumeIntegralFluxDifferencing{Tuple{typeof(flux_hindenlang_gassner), typeof(flux_nonconservative_powell)}}, Nothing, StartUpDG.RefElemData{2, Quad, …}}, cache::NamedTuple{(:md, :Qrst_skew, :dxidxhatj, :invJ, :lift_scalings, :inv_wq, :u_values, :u_face_values, :flux_face_values, :local_values_threaded, :fluxdiff_local_threaded), Tuple{StartUpDG.MeshData{2, StartUpDG.VertexMappedMesh{Quad, Tuple{Vector{Float64}, Vector{Float64}}, …}, …}, Tuple{LinearAlgebra.Adjoint{Float64, SparseArrays.SparseMatrixCSC{Float64, Int64}}, LinearAlgebra.Adjoint{Float64, SparseArrays.SparseMatrixCSC{Float64, Int64}}}, …}})
       @ Trixi /mnt/ssd/home/mschlott/hackathon/jc-Trixi.jl/src/solvers/dgmulti/flux_differencing.jl:633
     [6] macro expansion
       @ /mnt/ssd/home/mschlott/hackathon/jc-Trixi.jl/src/auxiliary/auxiliary.jl:272 [inlined]
     [7] rhs!(du_ode::Matrix{SVector{9, Float64}}, u_ode::Matrix{SVector{9, Float64}}, semi::SemidiscretizationHyperbolic{DGMultiMesh{2, Trixi.Affine, …}, IdealGlmMhdEquations2D{Float64}, …}, t::Float64)
       @ Trixi /mnt/ssd/home/mschlott/hackathon/jc-Trixi.jl/src/semidiscretization/semidiscretization_hyperbolic.jl:297
  [8-20] ⋮ internal
       @ FunctionWrappers, OrdinaryDiffEq, DiffEqBase, Unknown
    [21] solve(prob::ODEProblem{Matrix{SVector{9, Float64}}, Tuple{Float64, Float64}, …}, args::CarpenterKennedy2N54{typeof(OrdinaryDiffEq.trivial_limiter!), typeof(OrdinaryDiffEq.trivial_limiter!), …}; sensealg::Nothing, u0::Nothing, p::Nothing, wrap::Val{true}, kwargs::Base.Pairs{Symbol, Any, …})
       @ DiffEqBase ~/.julia/packages/DiffEqBase/ZXMKG/src/solve.jl:881
    [22] top-level scope
       @ /mnt/ssd/home/mschlott/hackathon/jc-Trixi.jl/examples/dgmulti_2d/elixir_mhd_reflective_BCs.jl:98
    [23] include
       @ ./Base.jl:420 [inlined]
    [24] trixi_include(mod::Module, elixir::String; kwargs::Base.Pairs{Symbol, Union{}, …})
       @ Trixi /mnt/ssd/home/mschlott/hackathon/jc-Trixi.jl/src/auxiliary/special_elixirs.jl:43
    [25] trixi_include
       @ /mnt/ssd/home/mschlott/hackathon/jc-Trixi.jl/src/auxiliary/special_elixirs.jl:38 [inlined]
    [26] #trixi_include#1297
       @ /mnt/ssd/home/mschlott/hackathon/jc-Trixi.jl/src/auxiliary/special_elixirs.jl:46 [inlined]
    [27] trixi_include(elixir::String)
       @ Trixi /mnt/ssd/home/mschlott/hackathon/jc-Trixi.jl/src/auxiliary/special_elixirs.jl:46
Use `err` to retrieve the full stack trace.
in expression starting at /mnt/ssd/home/mschlott/hackathon/jc-Trixi.jl/examples/dgmulti_2d/elixir_mhd_reflective_BCs.jl:98

when trying to run the examples/dgmulti_2d/elixir_mhd_reflective_BCs.jl elixir.

@jlchan (Contributor, Author) commented May 8, 2023

@sloede sorry, I forgot to import some routines from LinearAlgebra.jl. They are loaded by default in my startup.jl file, so the error didn't show up locally.
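The fix amounts to importing the needed names explicitly in the elixir instead of relying on a local startup.jl; a sketch (the exact import list in the PR may differ):

```julia
# Explicitly import the LinearAlgebra routines the boundary condition uses,
# so the elixir also runs on machines without a customized startup.jl.
using LinearAlgebra: dot, norm
```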

@sloede (Member) left a comment

With these two changes, I get the allocations in

@time Trixi.calc_single_boundary_flux!(semi.cache, t, first(boundary_conditions), first(keys(boundary_conditions)), Trixi.have_nonconservative_terms(equations), mesh, equations, semi.solver)

from

0.000205 seconds (1.40 k allocations: 70.562 KiB)

down to

0.000038 seconds (6 allocations: 2.672 KiB)

Thus there's certainly some funny business still going on, but less than there was before...
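One way to track such allocation regressions is to call the function once first, so `@allocated` measures the call itself rather than compilation; a generic sketch with a placeholder kernel standing in for `calc_single_boundary_flux!`:

```julia
# Placeholder kernel standing in for calc_single_boundary_flux!.
hot_kernel(x) = sum(abs2, x)

function measure_allocations()
    x = rand(100)
    hot_kernel(x)                     # warm-up call triggers compilation
    return @allocated hot_kernel(x)   # bytes allocated by the call itself
end
```

A fully type-stable hot path should report 0 bytes here; the remaining 6 allocations above hint at a small leftover type instability.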

Review comments on src/solvers/dgmulti/dg.jl (resolved)
@jlchan requested a review from @ranocha on May 10, 2023
@jlchan (Contributor, Author) commented May 10, 2023

Filed an issue for implementing BCs on non-conservative terms: #1445

@jlchan closed and reopened this on May 10, 2023
@ranocha (Member) left a comment

Thanks! Should be fine, I think. It would be great if @andrewwinters5000 could have a look, too

@ranocha previously approved these changes May 10, 2023

@jlchan (Contributor, Author) commented May 11, 2023

@ranocha @sloede is the failure of MPI on Ubuntu spurious? The stacktrace is

WARNING: replacing module TrixiTestModule.
[ Info: Testset elixir_advection_amr.jl finished in 87.166008962 seconds.
WARNING: replacing module TrixiTestModule.
════════════════════════════════════════════════════════════════════════════════════════════════════
/home/runner/work/Trixi.jl/Trixi.jl/examples/p4est_3d_dgsem/elixir_advection_amr_unstructured_curved.jl

signal (15): Terminated
in expression starting at /home/runner/work/Trixi.jl/Trixi.jl/test/runtests.jl:9

signal (15): Terminated
in expression starting at /home/runner/work/_actions/julia-actions/julia-runtest/v1/test_harness.jl:7
epoll_wait at /lib/x86_64-linux-gnu/libc.so.6 (unknown line)
epoll_wait at /lib/x86_64-linux-gnu/libc.so.6 (unknown line)

===================================================================================
=   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
=   PID 3368 RUNNING AT fv-az446-880
=   EXIT CODE: 9
=   CLEANING UP REMAINING PROCESSES
=   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
YOUR APPLICATION TERMINATED WITH THE EXIT STRING: Killed (signal 9)
This typically refers to a problem with your application.
Please see the FAQ page for debugging suggestions
Error: The operation was canceled.

@sloede (Member) commented May 11, 2023

> @ranocha @sloede is the failure of MPI on Ubuntu spurious? […stacktrace quoted above]

No, it was already noted by @ranocha here: #1316 (comment)

This seems to be either a CI bug or an upstream bug.

@jlchan (Contributor, Author) commented May 11, 2023

Thanks! Is it OK to merge this then?

@sloede closed and reopened this on May 11, 2023
@sloede (Member) commented May 11, 2023

I'd like to see coverage... let's give it one more try (maybe the issues on https://www.githubstatus.com/ might be related?)

@jlchan changed the title from "add nonconservative MHD boundary conditions to DGMulti" to "specialize calc_boundary_flux! for nonconservative terms for DGMulti" on May 11, 2023
@ranocha (Member) left a comment

@andrewwinters5000 Do you have some time for a review?

Review comments on src/basic_types.jl and src/equations/equations.jl (resolved)
@andrewwinters5000 (Member) left a comment

Overall everything looks fine. I just left a few comments, specifically regarding the (perhaps unused) BoundaryConditionDoNothing and BoundaryConditionDirichlet.

Review comments on examples/dgmulti_2d/elixir_mhd_reflective_BCs.jl, src/basic_types.jl, src/equations/equations.jl, and src/solvers/dgmulti/dg.jl
@andrewwinters5000 (Member) left a comment

Everything looks good from my end.

@jlchan (Contributor, Author) commented May 13, 2023

@ranocha @sloede the checks have passed. Would this be OK to merge?

@sloede (Member) commented May 14, 2023

I'm OK with merging this content-wise, but is our coverage system broken? I haven't looked at it for a while, but now I see that we are <95% (we were at 97%, weren't we?). And when looking at e.g. coveralls, everything seems to be untested in one particular file; that can't be right, can it?
https://coveralls.io/builds/59916995/source?filename=src%2Fsolvers%2Fdgmulti%2Fflux_differencing_gauss_sbp.jl#L64

@ranocha (Member) commented May 14, 2023

CI has been broken for some reason, and I'm not sure whether we can trust it here. I propose merging this now and checking the coverage on main.

@sloede merged commit 8d8619c into trixi-framework:main on May 14, 2023
4 participants