Fix GradientDescent Opt to not require FiniteDiff type #2177
Conversation
@wangcj05 It looks like SimulatedAnnealing was removed in the patch. I'm not familiar with SA; is this just a type of GradientDescent, or should I add that type back in?
Sorry, Dylan. I added a patch that you can apply. SimulatedAnnealing is supposed to be in the validation dictionary as well.
@dylanjm Also, doc/workshop/optimizer/Inputs/1_grad_desc.xml should be updated. Sorry I missed this.
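For context, the "validation dictionary" discussed above maps accepted entity names to what the input parser will allow. A minimal sketch of the idea (the names `VALID_OPTIMIZERS` and `validateOptimizer` are hypothetical illustrations, not RAVEN's actual internals):

```python
# Hypothetical sketch of an input-validation dictionary.
# Restoring SimulatedAnnealing means re-adding it to the accepted set,
# alongside the gradient-based optimizer.
VALID_OPTIMIZERS = {
    'GradientDescent': 'gradient-based local search',
    'SimulatedAnnealing': 'stochastic global search',  # must stay in the dict
}

def validateOptimizer(tag):
    """Reject input nodes whose tag is not a known optimizer."""
    if tag not in VALID_OPTIMIZERS:
        raise ValueError(f'Unknown optimizer type: {tag!r}')
    return VALID_OPTIMIZERS[tag]
```

With SimulatedAnnealing in the dictionary, inputs using it validate cleanly; removing the entry would make previously valid inputs fail to parse, which is the regression flagged above.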
Code changes have been reviewed and are in accordance with standards.
Only waiting on tests to pass before merging. If tests pass without further changes, feel free to merge.
Pull Request Description
What issue does this change request address? (Use "#" before the issue to link it, e.g., #42.)
#2175
What are the significant changes in functionality due to this change request?
This PR removes the requirement for the GradientDescent optimizer to be of type="FiniteDifference".
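The shape of the fix can be sketched as follows. This is a hypothetical before/after, not RAVEN's actual implementation; the function names and the set of gradient approximations shown are illustrative assumptions:

```python
# Hypothetical before/after of the relaxed type check.
KNOWN_GRADIENT_TYPES = {'FiniteDifference', 'CentralDifference', 'SPSA'}

def checkGradientTypeOld(nodeType):
    # Old behavior: only one hard-coded value was accepted, so any
    # other (or absent) type attribute was rejected.
    if nodeType != 'FiniteDifference':
        raise ValueError('GradientDescent requires type="FiniteDifference"')

def checkGradientTypeNew(nodeType):
    # New behavior: any known gradient approximation is accepted,
    # and the attribute may be omitted entirely (None).
    if nodeType is not None and nodeType not in KNOWN_GRADIENT_TYPES:
        raise ValueError(f'Unknown gradient type: {nodeType!r}')
```

The design point is that the validator checks membership in a registry rather than equality with a single literal, so existing inputs keep working while the hard requirement disappears.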
For Change Control Board: Change Request Review
The following review must be completed by an authorized member of the Change Control Board.
… set <internalParallel> to True.
If tests used as documentation examples (in raven/tests/framework/user_guide and raven/docs/workshop) have been changed, the associated documentation must be reviewed and assured that the text matches the example.