Add default printing for show(io, ::ModelLike) #2505

Merged: odow merged 20 commits from od/summary into master on May 27, 2024

Conversation

odow (Member) commented May 24, 2024

Closes #2504

odow (Member Author) commented May 24, 2024

I think I like this:

julia> include("test/Bridges/sdpa_models.jl")

julia> model = MOI.Utilities.CachingOptimizer(
           MOI.Utilities.Model{Float64}(),
           MOI.Bridges.full_bridge_optimizer(StandardSDPAModel{Float64}(), Float64),
       )
A MOI.Utilities.CachingOptimizer:
├ state
│ └ EMPTY_OPTIMIZER
├ mode
│ └ AUTOMATIC
├ model_cache
│ An empty MOIU.Model{Float64}
└ optimizer
  A MOIB.LazyBridgeOptimizer{MOIU.GenericModel{Float64, MOIU.ObjectiveContainer{Float64}, MOIU.VariablesContainer{Float64}, StandardSDPAModelFunctionConstraints{Float64}}}
  ├ Variable bridges
  │ └ none
  ├ Constraint bridges
  │ └ none
  ├ Objective bridges
  │ └ none
  └ model
    An empty MOIU.GenericModel{Float64, MOIU.ObjectiveContainer{Float64}, MOIU.VariablesContainer{Float64}, StandardSDPAModelFunctionConstraints{Float64}}

julia> x = MOI.add_variable(model)
MOI.VariableIndex(1)

julia> MOI.add_constraint(model, 2.0 * x, MOI.LessThan(3.0))
MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.LessThan{Float64}}(1)

julia> MOI.Utilities.attach_optimizer(model)

julia> model
A MOI.Utilities.CachingOptimizer:
├ state
│ └ ATTACHED_OPTIMIZER
├ mode
│ └ AUTOMATIC
├ model_cache
│ A MOIU.Model{Float64}
│ ├ ObjectiveSense
│ │ └ FEASIBILITY_SENSE
│ ├ ObjectiveFunctionType
│ │ └ MOI.ScalarAffineFunction{Float64}
│ ├ NumberOfVariables
│ │ └ 1
│ └ NumberOfConstraints
│   └ MOI.ScalarAffineFunction{Float64} in MOI.LessThan{Float64}: 1
└ optimizer
  A MOIB.LazyBridgeOptimizer{MOIU.GenericModel{Float64, MOIU.ObjectiveContainer{Float64}, MOIU.VariablesContainer{Float64}, StandardSDPAModelFunctionConstraints{Float64}}}
  ├ Variable bridges
  │ ├ MOIB.Variable.FreeBridge{Float64}
  │ └ MOIB.Variable.VectorizeBridge{Float64, MOI.Nonnegatives}
  ├ Constraint bridges
  │ ├ MOIB.Constraint.LessToGreaterBridge{Float64, MOI.ScalarAffineFunction{Float64}, MOI.ScalarAffineFunction{Float64}}
  │ └ MOIB.Constraint.ScalarSlackBridge{Float64, MOI.ScalarAffineFunction{Float64}, MOI.GreaterThan{Float64}}
  ├ Objective bridges
  │ └ none
  └ model
    A MOIU.GenericModel{Float64, MOIU.ObjectiveContainer{Float64}, MOIU.VariablesContainer{Float64}, StandardSDPAModelFunctionConstraints{Float64}}
    ├ ObjectiveSense
    │ └ FEASIBILITY_SENSE
    ├ ObjectiveFunctionType
    │ └ MOI.ScalarAffineFunction{Float64}
    ├ NumberOfVariables
    │ └ 3
    └ NumberOfConstraints
      ├ MOI.ScalarAffineFunction{Float64} in MOI.EqualTo{Float64}: 1
      └ MOI.VectorOfVariables in MOI.Nonnegatives: 2

odow changed the title from "WIP: add default summary for show(io, ::ModelLike)" to "Add default printing for show(io, ::ModelLike)" on May 24, 2024
odow (Member Author) commented May 24, 2024

So this is actually super useful for debugging.

Cribbing from a recent post on Discourse:

julia> using JuMP

julia> import HiGHS

julia> import MultiObjectiveAlgorithms as MOA

julia> model = Model(() -> MOA.Optimizer(HiGHS.Optimizer));

julia> set_silent(model)

julia> set_attribute(model, MOA.Algorithm(), MOA.Hierarchical())

julia> lb = ones(10); ub = fill(10, 10);

julia> @variables(model, begin
           lb[i] <= z[i in 1:length(lb)] <= ub[i], Int
           d >= 0
       end);

julia> @constraints(model, begin
           z - lb .- d >= 0
           z - ub .+ d <= 0
           sum(z) <= 500
       end);

julia> @objective(model, Max, [sum(z), d]);

julia> optimize!(model)

julia> backend(model)
A MOI.Utilities.CachingOptimizer:
├ state
│ └ ATTACHED_OPTIMIZER
├ mode
│ └ AUTOMATIC
├ model_cache
│ A MOIU.UniversalFallback{MOIU.Model{Float64}}
│ ├ ObjectiveSense
│ │ └ MAX_SENSE
│ ├ ObjectiveFunctionType
│ │ └ MOI.VectorAffineFunction{Float64}
│ ├ NumberOfVariables
│ │ └ 11
│ └ NumberOfConstraints
│   ├ MOI.ScalarAffineFunction{Float64} in MOI.LessThan{Float64}: 1
│   ├ MOI.VectorAffineFunction{Float64} in MOI.Nonnegatives: 1
│   ├ MOI.VectorAffineFunction{Float64} in MOI.Nonpositives: 1
│   ├ MOI.VariableIndex in MOI.GreaterThan{Float64}: 11
│   ├ MOI.VariableIndex in MOI.LessThan{Float64}: 10
│   └ MOI.VariableIndex in MOI.Integer: 10
└ optimizer
  A MOIB.LazyBridgeOptimizer{MultiObjectiveAlgorithms.Optimizer}
  ├ Variable bridges
  │ └ none
  ├ Constraint bridges
  │ ├ MOIB.Constraint.ScalarizeBridge{Float64, MOI.ScalarAffineFunction{Float64}, MOI.GreaterThan{Float64}}
  │ └ MOIB.Constraint.ScalarizeBridge{Float64, MOI.ScalarAffineFunction{Float64}, MOI.LessThan{Float64}}
  ├ Objective bridges
  │ └ none
  └ model
    A MultiObjectiveAlgorithms.Optimizer
    ├ ObjectiveSense
    │ └ MAX_SENSE
    ├ ObjectiveFunctionType
    │ └ MOI.VectorAffineFunction{Float64}
    ├ NumberOfVariables
    │ └ 11
    └ NumberOfConstraints
      ├ MOI.ScalarAffineFunction{Float64} in MOI.GreaterThan{Float64}: 10
      ├ MOI.ScalarAffineFunction{Float64} in MOI.LessThan{Float64}: 11
      ├ MOI.VariableIndex in MOI.GreaterThan{Float64}: 11
      ├ MOI.VariableIndex in MOI.LessThan{Float64}: 10
      └ MOI.VariableIndex in MOI.Integer: 10

ericphanson (Contributor) commented May 24, 2024

This looks nice! I wonder if it could be a bit more compact in cases where there's only one child? I've found that too much scrolling in the REPL can make a printout harder to understand. e.g.

A MOI.Utilities.CachingOptimizer:
├ state: ATTACHED_OPTIMIZER
├ mode: AUTOMATIC
├ model_cache: MOIU.UniversalFallback{MOIU.Model{Float64}}
│ ├ ObjectiveSense: MAX_SENSE
│ ├ ObjectiveFunctionType: MOI.VectorAffineFunction{Float64}
│ ├ NumberOfVariables: 11
│ └ NumberOfConstraints: 34
│   ├ MOI.ScalarAffineFunction{Float64} in MOI.LessThan{Float64}: 1
│   ├ MOI.VectorAffineFunction{Float64} in MOI.Nonnegatives: 1
│   ├ MOI.VectorAffineFunction{Float64} in MOI.Nonpositives: 1
│   ├ MOI.VariableIndex in MOI.GreaterThan{Float64}: 11
│   ├ MOI.VariableIndex in MOI.LessThan{Float64}: 10
│   └ MOI.VariableIndex in MOI.Integer: 10
└ optimizer: MOIB.LazyBridgeOptimizer{MultiObjectiveAlgorithms.Optimizer}
  ├ Variable bridges: none
  ├ Constraint bridges
  │ ├ MOIB.Constraint.ScalarizeBridge{Float64, MOI.ScalarAffineFunction{Float64}, MOI.GreaterThan{Float64}}
  │ └ MOIB.Constraint.ScalarizeBridge{Float64, MOI.ScalarAffineFunction{Float64}, MOI.LessThan{Float64}}
  ├ Objective bridges: none
  └ model: MultiObjectiveAlgorithms.Optimizer
    ├ ObjectiveSense: MAX_SENSE
    ├ ObjectiveFunctionType: MOI.VectorAffineFunction{Float64}
    ├ NumberOfVariables: 11
    └ NumberOfConstraints: 52
      ├ MOI.ScalarAffineFunction{Float64} in MOI.GreaterThan{Float64}: 10
      ├ MOI.ScalarAffineFunction{Float64} in MOI.LessThan{Float64}: 11
      ├ MOI.VariableIndex in MOI.GreaterThan{Float64}: 11
      ├ MOI.VariableIndex in MOI.LessThan{Float64}: 10
      └ MOI.VariableIndex in MOI.Integer: 10

This is kinda already done for constraints, like MOI.ScalarAffineFunction{Float64} in MOI.GreaterThan{Float64}: 10, so I think it could also work for NumberOfVariables, ObjectiveSense, ObjectiveFunctionType, state, and mode.

I think it would also make sense for the nones in things like Variable bridges and Constraint bridges.

I also tweaked it so that, instead of putting A MOIU.UniversalFallback{MOIU.Model{Float64}} on the following line, the A is dropped and the type goes after a colon on the same line. To me the "A" was a bit confusing (I know it's the indefinite article, but the rest isn't really sentences, so it's not clear whether it means some kind of "type A" vs "type B"). IMO it also reads a little better on the same line. I did the same for optimizer: MOIB.LazyBridgeOptimizer{MultiObjectiveAlgorithms.Optimizer} and model: MultiObjectiveAlgorithms.Optimizer.

Lastly, I think it would also be nice to have NumberOfConstraints print a total, as I added above.
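
To make the suggestion concrete, here is a minimal, illustrative sketch of the single-child compaction idea (the function and argument names are hypothetical, not the PR's actual code): print a field inline as name: value when it has one value, and as an indented subtree only when it has several children.

# Illustrative sketch only, not MOI's implementation.
function print_field(io::IO, prefix::String, name::String, children::Vector{String})
    if length(children) == 1
        # Single child: collapse onto one line as `name: value`.
        println(io, prefix, "├ ", name, ": ", children[1])
    else
        # Multiple children: print a count, then an indented subtree.
        println(io, prefix, "├ ", name, ": ", length(children))
        for (i, child) in enumerate(children)
            branch = i == length(children) ? "└ " : "├ "
            println(io, prefix, "│ ", branch, child)
        end
    end
    return
end

print_field(stdout, "", "state", ["ATTACHED_OPTIMIZER"])
print_field(stdout, "", "NumberOfConstraints", [
    "MOI.VariableIndex in MOI.LessThan{Float64}: 10",
    "MOI.VariableIndex in MOI.Integer: 10",
])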

odow (Member Author) commented May 25, 2024

@ericphanson done

odow (Member Author) commented May 25, 2024

It's now:

julia> using JuMP

julia> import HiGHS

julia> import MultiObjectiveAlgorithms as MOA

julia> model = Model(() -> MOA.Optimizer(HiGHS.Optimizer));

julia> set_silent(model)

julia> set_attribute(model, MOA.Algorithm(), MOA.Hierarchical())

julia> lb = ones(10); ub = fill(10, 10);

julia> @variables(model, begin
           lb[i] <= z[i in 1:length(lb)] <= ub[i], Int
           d >= 0
       end);

julia> @constraints(model, begin
           z - lb .- d >= 0
           z - ub .+ d <= 0
           sum(z) <= 500
       end);

julia> @objective(model, Max, [sum(z), d]);

julia> optimize!(model)

julia> backend(model)
MOIU.CachingOptimizer
├ state: ATTACHED_OPTIMIZER
├ mode: AUTOMATIC
├ model_cache: MOIU.UniversalFallback{MOIU.Model{Float64}}
│ ├ ObjectiveSense: MAX_SENSE
│ ├ ObjectiveFunctionType: MOI.VectorAffineFunction{Float64}
│ ├ NumberOfVariables: 11
│ └ NumberOfConstraints: 34
│   ├ MOI.ScalarAffineFunction{Float64} in MOI.LessThan{Float64}: 1
│   ├ MOI.VectorAffineFunction{Float64} in MOI.Nonnegatives: 1
│   ├ MOI.VectorAffineFunction{Float64} in MOI.Nonpositives: 1
│   ├ MOI.VariableIndex in MOI.GreaterThan{Float64}: 11
│   ├ MOI.VariableIndex in MOI.LessThan{Float64}: 10
│   └ MOI.VariableIndex in MOI.Integer: 10
└ optimizer:   MOIB.LazyBridgeOptimizer{MultiObjectiveAlgorithms.Optimizer}
  ├ Variable bridges: none
  ├ Constraint bridges:
  │ ├ MOIB.Constraint.ScalarizeBridge{Float64, MOI.ScalarAffineFunction{Float64}, MOI.GreaterThan{Float64}}
  │ └ MOIB.Constraint.ScalarizeBridge{Float64, MOI.ScalarAffineFunction{Float64}, MOI.LessThan{Float64}}
  ├ Objective bridges: none
  └ model: MultiObjectiveAlgorithms.Optimizer
    ├ ObjectiveSense: MAX_SENSE
    ├ ObjectiveFunctionType: MOI.VectorAffineFunction{Float64}
    ├ NumberOfVariables: 11
    └ NumberOfConstraints: 52
      ├ MOI.ScalarAffineFunction{Float64} in MOI.GreaterThan{Float64}: 10
      ├ MOI.ScalarAffineFunction{Float64} in MOI.LessThan{Float64}: 11
      ├ MOI.VariableIndex in MOI.GreaterThan{Float64}: 11
      ├ MOI.VariableIndex in MOI.LessThan{Float64}: 10
      └ MOI.VariableIndex in MOI.Integer: 10

odow (Member Author) commented May 25, 2024

Here's a problem:

julia> using JuMP, SCS

julia> model = Model(SCS.Optimizer);

julia> set_silent(model)

julia> @variable(model, x[1:2, 1:2] >= 0, PSD);

julia> @constraint(model, sum(x) == 1);

julia> optimize!(model)

julia> backend(model)
MOIU.CachingOptimizer
├ state: ATTACHED_OPTIMIZER
├ mode: AUTOMATIC
├ model_cache: MOIU.UniversalFallback{MOIU.Model{Float64}}
│ ├ ObjectiveSense: FEASIBILITY_SENSE
│ ├ ObjectiveFunctionType: MOI.ScalarAffineFunction{Float64}
│ ├ NumberOfVariables: 3
│ └ NumberOfConstraints: 5
│   ├ MOI.ScalarAffineFunction{Float64} in MOI.EqualTo{Float64}: 1
│   ├ MOI.VectorOfVariables in MOI.PositiveSemidefiniteConeTriangle: 1
│   └ MOI.VariableIndex in MOI.GreaterThan{Float64}: 3
└ optimizer:   MOIB.LazyBridgeOptimizer{MOIU.CachingOptimizer{SCS.Optimizer, MOIU.UniversalFallback{MOIU.Model{Float64}}}}
  ├ Variable bridges: none
  ├ Constraint bridges:
  │ ├ MOIB.Constraint.SetDotScalingBridge{Float64, MOI.PositiveSemidefiniteConeTriangle, MOI.VectorAffineFunction{Float64}, MOI.VectorOfVariables}
  │ ├ MOIB.Constraint.VectorizeBridge{Float64, MOI.VectorAffineFunction{Float64}, MOI.Nonnegatives, MOI.VariableIndex}
  │ ├ MOIB.Constraint.VectorizeBridge{Float64, MOI.VectorAffineFunction{Float64}, MOI.Zeros, MOI.ScalarAffineFunction{Float64}}
  │ └ SCS.ScaledPSDConeBridge{Float64, MOI.VectorAffineFunction{Float64}}
  ├ Objective bridges: none
  └ model:     MOIU.CachingOptimizer
    ├ state: EMPTY_OPTIMIZER
    ├ mode: AUTOMATIC
    ├ model_cache: MOIU.UniversalFallback{MOIU.Model{Float64}}
    │ ├ ObjectiveSense: FEASIBILITY_SENSE
    │ ├ ObjectiveFunctionType: MOI.ScalarAffineFunction{Float64}
    │ ├ NumberOfVariables: 3
    │ └ NumberOfConstraints: 5
    │   ├ MOI.VectorAffineFunction{Float64} in MOI.Zeros: 1
    │   ├ MOI.VectorAffineFunction{Float64} in MOI.Nonnegatives: 3
    │   └ MOI.VectorAffineFunction{Float64} in SCS.ScaledPSDCone: 1
    └ optimizer: SCS.Optimizer
      ├ ObjectiveSense: FEASIBILITY_SENSE
      ├ ObjectiveFunctionType: ?
      ├ NumberOfVariables: ?
      └ NumberOfConstraints: 0

SCS doesn't support getting various attributes, so there is some weirdness. We could fix that though.
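
One possible fix, sketched below with a hypothetical try_get helper (illustrative only; not necessarily what this PR ends up doing), is to wrap each attribute query in a try/catch and print a placeholder when the query throws:

import MathOptInterface as MOI

# Query an attribute, but fall back to a placeholder if the solver throws,
# e.g. because it does not implement MOI.get for that attribute.
function try_get(model::MOI.ModelLike, attr::MOI.AbstractModelAttribute)
    try
        return MOI.get(model, attr)
    catch
        return "unknown"
    end
end

# Usage: try_get(model, MOI.NumberOfVariables())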

blegat (Member) commented May 25, 2024

This is very nice, I like it! It makes the MOI layers much less obscure.

odow (Member Author) commented May 25, 2024

Now it's:

julia> backend(model)
MOIU.CachingOptimizer
├ state: ATTACHED_OPTIMIZER
├ mode: AUTOMATIC
├ model_cache: MOIU.UniversalFallback{MOIU.Model{Float64}}
│ ├ ObjectiveSense: FEASIBILITY_SENSE
│ ├ ObjectiveFunctionType: MOI.ScalarAffineFunction{Float64}
│ ├ NumberOfVariables: 3
│ └ NumberOfConstraints: 5
│   ├ MOI.ScalarAffineFunction{Float64} in MOI.EqualTo{Float64}: 1
│   ├ MOI.VectorOfVariables in MOI.PositiveSemidefiniteConeTriangle: 1
│   └ MOI.VariableIndex in MOI.GreaterThan{Float64}: 3
└ optimizer: MOIB.LazyBridgeOptimizer{MOIU.CachingOptimizer{SCS.Optimizer, MOIU.UniversalFallback{MOIU.Model{Float64}}}}
  ├ Variable bridges: none
  ├ Constraint bridges:
  │ ├ MOIB.Constraint.SetDotScalingBridge{Float64, MOI.PositiveSemidefiniteConeTriangle, MOI.VectorAffineFunction{Float64}, MOI.VectorOfVariables}
  │ ├ MOIB.Constraint.VectorizeBridge{Float64, MOI.VectorAffineFunction{Float64}, MOI.Nonnegatives, MOI.VariableIndex}
  │ ├ MOIB.Constraint.VectorizeBridge{Float64, MOI.VectorAffineFunction{Float64}, MOI.Zeros, MOI.ScalarAffineFunction{Float64}}
  │ └ SCS.ScaledPSDConeBridge{Float64, MOI.VectorAffineFunction{Float64}}
  ├ Objective bridges: none
  └ model: MOIU.CachingOptimizer
    ├ state: EMPTY_OPTIMIZER
    ├ mode: AUTOMATIC
    ├ model_cache: MOIU.UniversalFallback{MOIU.Model{Float64}}
    │ ├ ObjectiveSense: FEASIBILITY_SENSE
    │ ├ ObjectiveFunctionType: MOI.ScalarAffineFunction{Float64}
    │ ├ NumberOfVariables: 3
    │ └ NumberOfConstraints: 5
    │   ├ MOI.VectorAffineFunction{Float64} in MOI.Zeros: 1
    │   ├ MOI.VectorAffineFunction{Float64} in MOI.Nonnegatives: 3
    │   └ MOI.VectorAffineFunction{Float64} in SCS.ScaledPSDCone: 1
    └ optimizer: SCS.Optimizer
      ├ ObjectiveSense: unknown
      ├ ObjectiveFunctionType: unknown
      ├ NumberOfVariables: unknown
      └ NumberOfConstraints: unknown

ericphanson (Contributor) commented:
looks great!

odow requested a review from blegat on May 25, 2024 21:45
Review thread on src/MathOptInterface.jl (outdated; resolved)
odow (Member Author) commented May 26, 2024

Thoughts @blegat?

blegat (Member) commented May 27, 2024

Maybe we should add a test that prints an empty model and that every solver runs. It should be safe with the _try_catch, but you never know, a solver could also segfault ^^


function test_model_show(model::MOI.ModelLike, ::Config{T}) where {T}
    # We don't enforce any particular output.
    @test sprint(show, model) isa String
    return
end
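
For context, a solver's test suite would typically pick this test up automatically through MOI.Test.runtests. A rough sketch of how that could look (the HiGHS wiring is just an example borrowed from earlier in this thread, not part of the PR):

import MathOptInterface as MOI
import HiGHS

model = MOI.Bridges.full_bridge_optimizer(
    MOI.Utilities.CachingOptimizer(
        MOI.Utilities.UniversalFallback(MOI.Utilities.Model{Float64}()),
        HiGHS.Optimizer(),
    ),
    Float64,
)
# Restrict the run to the new test; `include` matches against test names.
MOI.Test.runtests(model, MOI.Test.Config(); include = ["test_model_show"])
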
odow (Member Author) commented:

@blegat see this test

odow merged commit 902064f into master on May 27, 2024
16 checks passed
odow deleted the od/summary branch on May 27, 2024 08:56
odow (Member Author) commented May 27, 2024

I think this is a really nice quality-of-life improvement. I don't know why we didn't do this earlier.

Successfully merging this pull request may close: Add function to summarize model "size"? (#2504)
3 participants