Auto construction of a network level reasoner #253
Conversation
Merge from master
Merge from master
Merge from master
Merge from master
Symbolic bound tightening in NLR (NeuralNetworkVerification#249)
Symbolic bound tightening over absolute values (NeuralNetworkVerification#251)
Merge from master
if ( eq._type != Equation::EQ )
    continue;

Set<unsigned> eqVars = eq.getParticipatingVariables();
Storing this set across iterations might be more efficient than repeatedly attempting to reduce it to one, even though this will probably be a rare case.
This would also address my next comment, because it would be sufficient to eliminate just the processed variables; the effects would accumulate across layer iterations.
I didn't understand this comment. What does it mean about the topology if an equation's variables appear across multiple layers?
Maybe I made an assumption that's incorrect: I assumed that each equation is a weighted sum of inputs = backward-facing variable. If some variables skip layers, there is no guarantee that all members of the weighted sum belong to the same layer, so they may be discharged at different times. What I was suggesting was a mapping {vars} -> backward-facing variable: through the iterations we eliminate variables from the set, and when the variable set is empty, we know the position of the activation function and can remove the equation from consideration.
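A hypothetical sketch of the bookkeeping this comment suggests (the names and types here are illustrative, not Marabou's actual API): each pending equation keeps the set of source variables not yet assigned to a layer, plus the variable its weighted sum defines. As layers are processed, their variables are erased from every pending set; an equation whose set empties is fully placed.

```cpp
#include <map>
#include <set>

struct PendingEquation
{
    std::set<unsigned> remainingVars; // weighted-sum members not yet placed in a layer
    unsigned backwardFacingVar;       // the variable the weighted sum defines
};

// Called once per discovered layer: discharge the layer's variables from
// every pending equation.
void processLayer( std::map<unsigned, PendingEquation> &pending,
                   const std::set<unsigned> &layerVars )
{
    for ( auto it = pending.begin(); it != pending.end(); )
    {
        for ( unsigned var : layerVars )
            it->second.remainingVars.erase( var );

        if ( it->second.remainingVars.empty() )
            // All sources discharged: the activation's position is now
            // known, so the equation drops out of consideration.
            it = pending.erase( it );
        else
            ++it;
    }
}
```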
The algorithm LGTM. I've added some comments that cover a few corner cases and should improve efficiency a bit. See what you think is worth addressing, and I can take another look.
Hi Aleks,
Merge from master
Merge from master
LGTM
src/engine/NetworkLevelReasoner.cpp (Outdated)
if ( bias > 0 )
    printf( " + %.2lf", bias );
else
    printf( " - %.2lf", -bias );
This can also be compacted by moving the sign into the formatter.
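One possible reading of this suggestion, as a sketch rather than the reviewer's exact intent: printf's '+' conversion flag makes the formatter emit the sign itself, which collapses the branch, at the cost of losing the space between the sign and the digits.

```cpp
// Sketch: the '+' flag tells printf to always emit the sign, so the
// if/else on the sign of bias disappears. Output is "+3.14"/"-3.14"
// rather than the original "+ 3.14"/"- 3.14".
printf( " %+.2lf", bias );
```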
…ion#253)
* types for PL constraints
* first attempt at automatically constructing the NLR
* first unit test, couple of bug fixes
* use PiecewiseLinearFunctionType
* cleanup
* some cleanup, and store more information about discovered neurons
* wip
* complete the construction of the NLR
* bug fixes
* bug fix
* dumping functionality
* bug fix
* test
* changes per Aleks' comments
* minor
* python bindings
* minor

Co-authored-by: Guy Katz <guykatz@cs.huji.ac.il>
When a user has not provided a network level reasoner as part of the input query, Marabou will attempt to automatically figure out the topology of the network and construct one itself. This is especially needed for symbolic bound tightening.
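As a rough illustration of the topology-discovery idea discussed in this thread (hypothetical code, not Marabou's actual implementation): starting from the input variables, repeatedly collect the weighted-sum equations all of whose sources are already placed; their backward-facing variables form the next layer.

```cpp
#include <map>
#include <set>
#include <vector>

using Layer = std::set<unsigned>;

// sourcesOf maps each backward-facing variable to the variables appearing
// in its weighted sum; inputs is the set of input variables.
std::vector<Layer> discoverLayers(
    const std::map<unsigned, std::set<unsigned>> &sourcesOf,
    const Layer &inputs )
{
    std::vector<Layer> layers = { inputs };
    std::set<unsigned> placed = inputs;
    std::map<unsigned, std::set<unsigned>> pending = sourcesOf;

    while ( !pending.empty() )
    {
        Layer next;
        for ( auto it = pending.begin(); it != pending.end(); )
        {
            bool ready = true;
            for ( unsigned src : it->second )
                if ( !placed.count( src ) )
                {
                    ready = false;
                    break;
                }

            if ( ready )
            {
                // All sources are in earlier layers: this variable
                // belongs to the next layer.
                next.insert( it->first );
                it = pending.erase( it );
            }
            else
                ++it;
        }

        if ( next.empty() )
            break; // remaining equations skip levels or are cyclic

        placed.insert( next.begin(), next.end() );
        layers.push_back( next );
    }

    return layers;
}
```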