Clean up maximization of minimum tutorial #328

Merged 3 commits on Dec 6, 2023
@@ -1,35 +1,54 @@
# # Extracting minimizers
# # Maximizing a minimum

#md # [![](https://mybinder.org/badge_logo.svg)](@__BINDER_ROOT_URL__/generated/Polynomial Optimization/extracting_minimizers.ipynb)
#md # [![](https://img.shields.io/badge/show-nbviewer-579ACA.svg)](@__NBVIEWER_ROOT_URL__/generated/Polynomial Optimization/extracting_minimizers.ipynb)
# **Adapted from**: Example 6.23 of [L09]
#md # [![](https://mybinder.org/badge_logo.svg)](@__BINDER_ROOT_URL__/generated/Polynomial Optimization/min_univariate.ipynb)
#md # [![](https://img.shields.io/badge/show-nbviewer-579ACA.svg)](@__NBVIEWER_ROOT_URL__/generated/Polynomial Optimization/min_univariate.ipynb)
# **Adapted from**: Section 4.10 of [F99], Example 6.23 of [L09] and Table 5.1 of [Las09]
#
# [F99] Floudas, Christodoulos A. et al.
# *Handbook of Test Problems in Local and Global Optimization.*
# Nonconvex Optimization and Its Applications (NOIA, volume 33). Kluwer Academic Publishers, **1999**.
#
# [L09] Laurent, Monique.
# *Sums of squares, moment matrices and optimization over polynomials.*
# Emerging applications of algebraic geometry (2009): 157-270.
#
# [Las09] Lasserre, J. B.
# *Moments, positive polynomials and their applications.*
# World Scientific, **2009**.

# ## Introduction

# Consider the polynomial optimization problem [L09, Example 6.23] of
# minimizing the linear function $-x_1 - x_2$
# Consider the polynomial optimization problem [F99, Section 4.10]
# of minimizing the linear function $-x_1 - x_2$
# over the basic semialgebraic set defined by the inequalities
# $x_2 \le 2x_1^4 - 8x_1^3 + 8x_1^2 + 2$,
# $x_2 \le 4x_1^4 - 32x_1^3 + 88x_1^2 - 96x_1 + 36$ and the box constraints
# $0 \le x_1 \le 3$ and $0 \le x_2 \le 4$.
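#
# Written out in full, the problem reads
# ```math
# \begin{aligned}
# \min_{x_1, x_2} \quad & -x_1 - x_2 \\
# \text{s.t.} \quad & x_2 \le 2x_1^4 - 8x_1^3 + 8x_1^2 + 2, \\
# & x_2 \le 4x_1^4 - 32x_1^3 + 88x_1^2 - 96x_1 + 36, \\
# & 0 \le x_1 \le 3, \quad 0 \le x_2 \le 4.
# \end{aligned}
# ```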

using Test #src
using DynamicPolynomials
@polyvar x[1:2]
p = -sum(x)
using SumOfSquares
K = @set x[1] >= 0 && x[1] <= 3 && x[2] >= 0 && x[2] <= 4 && x[2] <= 2x[1]^4 - 8x[1]^3 + 8x[1]^2 + 2 && x[2] <= 4x[1]^4 - 32x[1]^3 + 88x[1]^2 - 96x[1] + 36
f1 = 2x[1]^4 - 8x[1]^3 + 8x[1]^2 + 2
f2 = 4x[1]^4 - 32x[1]^3 + 88x[1]^2 - 96x[1] + 36
K = @set x[1] >= 0 && x[1] <= 3 && x[2] >= 0 && x[2] <= 4 && x[2] <= f1 && x[2] <= f2

# As we can observe below, the bounds on `x[2]` could be dropped: since the
# objective decreases as `x[2]` increases, the optimal `x[2]` is the largest
# feasible value `min(f1, f2)`, so the optimization problem is equivalent to
# the maximization of `x[1] + min(f1, f2)` for `x[1]` between `0` and `3`,
# that is, to maximizing a minimum.

xs = range(0, stop = 3, length = 100)
using Plots
plot(xs, f1.(xs), label = "f1")
plot!(xs, f2.(xs), label = "f2")
plot!(xs, 4 * ones(length(xs)), label = nothing) # the upper bound x[2] <= 4
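
# As a quick numerical sanity check of this reduction (not part of the original
# tutorial), we can brute-force the reduced one-dimensional problem on a fine
# grid; the maximizer should land close to the solution extracted with
# Sum-of-Squares below.

## For a fixed `x[1]`, the largest feasible `x[2]` is `min(f1, f2, 4)` at that
## point (both `f1` and `f2` are nonnegative on `[0, 3]`, so `x[2] >= 0` never
## binds), hence we maximize `x[1] + min(f1, f2, 4)` over a grid of `x[1]`.
x1grid = range(0, stop = 3, length = 30_001)
brute_obj(x1) = x1 + min(f1(x1), f2(x1), 4.0)
brute_best, brute_idx = findmax(brute_obj.(x1grid))
## grid maximizer, the corresponding `x[2]`, and the objective value of `p`
x1grid[brute_idx], min(f1(x1grid[brute_idx]), f2(x1grid[brute_idx]), 4.0), -brute_best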

# We will now see how to find the optimal solution using Sum of Squares Programming.
# We first need to pick an SDP solver; see [here](https://jump.dev/JuMP.jl/v1.12/installation/#Supported-solvers) for a list of the available choices.

import CSDP
solver = optimizer_with_attributes(CSDP.Optimizer, MOI.Silent() => true)
import Clarabel
solver = Clarabel.Optimizer
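
# If the solver output is too verbose, the solver can also be wrapped with
# `optimizer_with_attributes` to pass the `MOI.Silent` attribute (assuming the
# chosen solver supports it), for example:
#
# ```julia
# solver = optimizer_with_attributes(Clarabel.Optimizer, MOI.Silent() => true)
# ```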

# A Sum-of-Squares certificate that $p \ge \alpha$ over the domain `K` ensures that $\alpha$ is a lower bound to the polynomial optimization problem.
# The following function searches for the largest such lower bound using the `d`th level of the hierarchy.
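
# The definition of this function lies in the unchanged lines not shown in this
# diff. As a rough sketch of such a bound-search function with SumOfSquares
# (the actual, elided definition may differ), it could look like:
#
# ```julia
# function solve(d)
#     model = SOSModel(solver)
#     @variable(model, α)
#     @objective(model, Max, α)
#     # certify that p - α is nonnegative on K with multipliers of degree at most d
#     @constraint(model, c, p >= α, domain = K, maxdegree = d)
#     optimize!(model)
#     println(solution_summary(model))
#     return model
# end
# ```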
@@ -74,3 +93,7 @@ model7 = solve(7)
x_opt = η.atoms[1].center # the center of the extracted atom is the candidate optimal solution
@test x_opt ≈ [2.3295, 3.1785] rtol=1e-4 #src
p(x_opt)
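
# Since the optimal solution is approximately $(2.3295, 3.1785)$, the optimal
# objective value is about $-(2.3295 + 3.1785) \approx -5.508$.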

# We can visualize the solution as follows:

scatter!([x_opt[1]], [x_opt[2]], markershape = :star, label = nothing)