Nonlinear subexpressions #3738
```julia
@variable(model, y)
@constraint(model, y == f(x))
```
What is stopping …
Yes, it's a breaking change. We discussed this on the monthly developer call. @mlubin is in favor of looking into a …
Some related discussion from jump-dev/MathOptInterface.jl#2488. Some possible options:

I don't have a strong opinion yet. But our current approach requires us to smack the modeler on the hand and tell them they're holding it wrong and that they should do (2): #3729 (comment)
I started looking at this again. I do wonder if the JuMP-level

```julia
@expression(model, y[a in 1:3], sin(x[a]), Subexpression())
# lowers to
@variable(model, y[a in 1:3])
@constraint(model, [a in 1:3], y[a] == sin(x[a]))
```

isn't the right way to go about this. We could add an additional bridge from … This is super explicit. And it works for linear/conic/nonlinear.
Now you have to teach users about the `Subexpression()` option. This seems like about the same amount of work as just using the variable and constraint. Personally, I prefer option 1 above.
I'm actually dealing with this now in the MPSGE package. I use long (>1000 terms) nonlinear expressions multiple times in the same model. I am solving this exactly as Oscar describes, adding a "fake" variable and constraining the equation to equal the variable. In my particular package this is all hidden from the user.

I'd be willing to test if there is a performance increase. Would this work for anonymous variables?
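The "fake" variable pattern described above can be sketched as follows. This is a minimal illustration, not MPSGE's actual code, and `f` is a made-up stand-in for a long nonlinear expression:

```julia
using JuMP

model = Model()
@variable(model, x[1:3])

# Illustrative stand-in for a long (>1000 term) nonlinear expression:
f(x) = sum(sin(x[i]) * exp(x[i]) for i in 1:3)

# The "fake" variable trick: introduce y, constrain it to equal the
# expression, then use y wherever the expression is needed.
@variable(model, y)
@constraint(model, y == f(x))

# Reuse y in several constraints instead of repeating the long expression:
@constraint(model, y + x[1] <= 1)
@constraint(model, y - x[2] >= 0)
```

The same trick works with an anonymous variable, e.g. `y = @variable(model)`, since nothing in the pattern depends on the variable having a name.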
On Tue, Nov 12, 2024 at 10:46 AM Robert Parker wrote:

> Now you have to teach users about the Subexpression() option. This seems like about the same amount of work as just using the variable and constraint. Personally, I prefer option 1 above.
My current approach won't be a performance increase. It's just syntactic sugar to create the fake variable and constraint. Perhaps we just need documentation for this.
I added some docs here: #3879
Background

The legacy nonlinear interface had `@NLexpression`, and the `MOI.Nonlinear` module has support for representing expressions and dealing with them efficiently. When we moved to `ScalarNonlinearFunction`, we did not create an analogous `ScalarNonlinearExpression` object. Some users used `@NLexpression` for convenience, or they made every possible expression a `@NLexpression`.
The question is how to achieve this in JuMP.
I opened an issue in MOI: jump-dev/MathOptInterface.jl#2488
Short of rewriting much of the `MOI.Nonlinear` module to use a single DAG of expressions (instead of the current tree), we could pass the expressions through to MOI, and then attempt to detect common subexpressions. This would rely on a heuristic for when it was beneficial to do so.

A simpler approach would be to add a "nonlinear expression" set to MOI (and JuMP), just like we've done for `Parameter`.
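For comparison, this is how the existing `Parameter` set is used in JuMP today; the proposed "nonlinear expression" set would follow the same variable-in-set pattern:

```julia
using JuMP

model = Model()
# Parameter(2.0) creates p as a variable fixed to the value 2.0; the value
# can be updated in place without rebuilding the model.
@variable(model, p in Parameter(2.0))
@variable(model, x)
@constraint(model, x <= p)
set_parameter_value(p, 3.0)  # update the parameter in place
```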
A crude API would be:
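As a hedged sketch, such a crude set-based API might look something like the following, where `Subexpression` is a hypothetical, unimplemented set (the name echoes later comments in this thread):

```julia
using JuMP

model = Model()
@variable(model, x)
# Hypothetical: y is a constrained variable whose defining set carries the
# nonlinear expression. None of this is implemented; names are illustrative.
@variable(model, y in Subexpression(sin(x)))
```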
with the fallback to a bridge
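Conceptually, the fallback bridge would lower the hypothetical set into the explicit variable-plus-constraint form that already works today. A sketch, not an actual bridge implementation:

```julia
using JuMP

model = Model()
@variable(model, x)
# What the bridge would produce for a hypothetical `y in Subexpression(sin(x))`:
# an explicit variable and a defining equality constraint.
@variable(model, y)
@constraint(model, y == sin(x))
```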
and maybe one for MCP:
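For an MCP formulation, one way to encode the defining equation with the existing `MOI.Complements` set is to pair the residual with a free variable. The pairing shown is only a sketch, and whether a solver accepts this nonlinear form is solver-dependent:

```julia
using JuMP

model = Model()
@variable(model, x)
@variable(model, y)  # free, so its paired residual must hold with equality
# Pairing the residual y - sin(x) with the free variable y enforces
# y == sin(x) under MCP semantics. MOI.Complements is the existing MCP set.
@constraint(model, [y - sin(x), y] in MOI.Complements(2))
```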
We could come up with nicer syntax, for example:
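One candidate for nicer syntax, echoing the snippet quoted earlier in this thread (the trailing `Subexpression()` argument is hypothetical and not implemented):

```julia
# Hypothetical fourth argument asking JuMP to materialize the expression as a
# solver-visible subexpression rather than inlining it everywhere it is used:
@expression(model, y[a in 1:3], sin(x[a]), Subexpression())
```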