Note: the demo code is based on the CVXOPT and CVXPY packages.
My notes on optimization:
Line Search Methods:
Steepest Descent Method
Newton Method
Quasi-Newton Method
Conjugate Gradient Method
Matrix Utilities
Large-Scale Unconstrained Optimization:
Inexact Newton Method
StepLength:
{ Backtracking Line Search } Algorithm: BacktrackingLineSearch.py (a minimal sketch follows this list)
{ Interpolation: Quadratic; Cubic } Algorithm: Interpolation.py
{ Zoom } Algorithm: Zoom.py
{ Wolfe Line Search (low-dimensional) } Algorithm: WolfeLineSearch.py
{ Wolfe Line Search (high-dimensional) } Algorithm: WolfeCondition.py
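A minimal sketch of backtracking line search under the Armijo sufficient-decrease condition; the function name and parameters here are illustrative, not the actual interface of BacktrackingLineSearch.py:

    import numpy as np

    def backtracking_line_search(f, grad, x, p, alpha=1.0, rho=0.5, c=1e-4):
        # Shrink alpha until the Armijo condition holds:
        # f(x + alpha*p) <= f(x) + c*alpha*grad(x)^T p.
        fx = f(x)
        slope = grad(x) @ p  # directional derivative; negative for a descent direction
        while f(x + alpha * p) > fx + c * alpha * slope:
            alpha *= rho
        return alpha

    # Example: a simple quadratic with the steepest descent direction.
    f = lambda x: x @ x
    grad = lambda x: 2 * x
    x0 = np.array([1.0, 1.0])
    print(backtracking_line_search(f, grad, x0, -grad(x0)))  # 0.5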
Steepest Descent:
{ Gradient Descent Method } Algorithm: GradientDescentMethod.py (a sketch follows)
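A minimal steepest descent loop with inline Armijo backtracking; names and defaults are illustrative, not GradientDescentMethod.py's actual interface:

    import numpy as np

    def steepest_descent(f, grad, x0, tol=1e-6, max_iter=10000):
        # Move along the negative gradient with a backtracking step.
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) < tol:
                break
            alpha = 1.0
            # Backtrack until the Armijo sufficient-decrease condition holds.
            while f(x - alpha * g) > f(x) - 1e-4 * alpha * (g @ g):
                alpha *= 0.5
            x = x - alpha * g
        return x

    # Example: the minimizer of this quadratic is (1, -2).
    f = lambda x: (x[0] - 1) ** 2 + 2 * (x[1] + 2) ** 2
    grad = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] + 2)])
    print(steepest_descent(f, grad, np.zeros(2)))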
Newton:
{ Newton Method } Algorithm: NewtonMethod.py
{ Cholesky with Added Multiple of the Identity } Algorithm: AddedMultipleOfTheIdentity.py (a combined Newton sketch follows this list)
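A sketch combining Newton's method with the added-multiple-of-the-identity modification (Nocedal & Wright, Algorithm 3.3): when the Hessian is not positive definite, tau*I is added until the Cholesky factorization succeeds. The names are illustrative:

    import numpy as np

    def newton_modified(grad, hess, x0, tol=1e-8, max_iter=100, beta=1e-3):
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) < tol:
                break
            B = hess(x)
            # Initial shift: zero if the diagonal is already positive.
            min_diag = np.min(np.diag(B))
            tau = 0.0 if min_diag > 0 else beta - min_diag
            while True:
                try:
                    L = np.linalg.cholesky(B + tau * np.eye(len(x)))
                    break
                except np.linalg.LinAlgError:
                    tau = max(2 * tau, beta)
            # Solve (B + tau*I) p = -g using the Cholesky factor.
            p = np.linalg.solve(L.T, np.linalg.solve(L, -g))
            x = x + p
        return x

    # Example: converges in one step on a convex quadratic; minimizer is (1, 0).
    grad = lambda x: np.array([2 * (x[0] - 1), 4 * x[1]])
    hess = lambda x: np.array([[2.0, 0.0], [0.0, 4.0]])
    print(newton_modified(grad, hess, np.array([5.0, 5.0])))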
Quasi-Newton:
{ DFP Method } Algorithm: DFP.py
{ BFGS Method } Algorithm: BFGS.py (a sketch follows this list)
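A minimal BFGS sketch that maintains the inverse-Hessian approximation H and updates it from the curvature pairs s = x_{k+1} - x_k, y = g_{k+1} - g_k; the interface is illustrative, not BFGS.py's actual one:

    import numpy as np

    def bfgs(f, grad, x0, tol=1e-6, max_iter=500):
        x = np.asarray(x0, dtype=float)
        n = len(x)
        H = np.eye(n)  # inverse-Hessian approximation
        g = grad(x)
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            p = -H @ g
            alpha = 1.0
            # Armijo backtracking (a full implementation would enforce Wolfe).
            while f(x + alpha * p) > f(x) + 1e-4 * alpha * (g @ p):
                alpha *= 0.5
            x_new = x + alpha * p
            g_new = grad(x_new)
            s, y = x_new - x, g_new - g
            if s @ y > 1e-10:  # skip the update if curvature is not positive
                rho = 1.0 / (s @ y)
                I = np.eye(n)
                H = ((I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s))
                     + rho * np.outer(s, s))
            x, g = x_new, g_new
        return x

    # Example: minimizer is approximately (3, -1).
    f = lambda x: (x[0] - 3) ** 2 + (x[1] + 1) ** 4
    grad = lambda x: np.array([2 * (x[0] - 3), 4 * (x[1] + 1) ** 3])
    print(bfgs(f, grad, np.zeros(2)))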
Conjugate Gradient:
{ Conjugate Gradient Preliminary } Algorithm: CG_Preliminary.py
{ Conjugate Gradient } Algorithm: CG.py (a linear CG sketch follows this list)
{ Preconditioned Conjugate Gradient } Algorithm: Preconditioned_CG.py
{ Fletcher-Reeves Method } Algorithm: FR.py
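A minimal sketch of the linear conjugate gradient iteration for A x = b with A symmetric positive definite (Nocedal & Wright, Algorithm 5.2); the interface is illustrative:

    import numpy as np

    def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
        n = len(b)
        x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
        r = A @ x - b   # residual
        p = -r          # initial search direction
        for _ in range(max_iter or n):
            rr = r @ r
            if np.sqrt(rr) < tol:
                break
            Ap = A @ p
            alpha = rr / (p @ Ap)   # exact minimizer along p
            x = x + alpha * p
            r = r + alpha * Ap
            beta = (r @ r) / rr     # ratio that Fletcher-Reeves generalizes
            p = -r + beta * p
        return x

    # Example: a small SPD system; the result matches np.linalg.solve(A, b).
    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    print(conjugate_gradient(A, b))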
MatrixUtil:
{ Cholesky Factorization: LDL^T } Algorithm: Cholesky_LDL.py (a sketch follows)
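A minimal LDL^T factorization sketch for a symmetric matrix, without pivoting; illustrative, not Cholesky_LDL.py's actual code:

    import numpy as np

    def ldl_factorization(A):
        # Factor symmetric A as L @ D @ L.T with unit lower-triangular L
        # and diagonal D (no pivoting, so it can fail on indefinite A).
        A = np.asarray(A, dtype=float)
        n = A.shape[0]
        L = np.eye(n)
        d = np.zeros(n)
        for j in range(n):
            d[j] = A[j, j] - (L[j, :j] ** 2) @ d[:j]
            for i in range(j + 1, n):
                L[i, j] = (A[i, j] - (L[i, :j] * L[j, :j]) @ d[:j]) / d[j]
        return L, np.diag(d)

    # Example: verify the factorization reproduces A.
    A = np.array([[4.0, 2.0], [2.0, 3.0]])
    L, D = ldl_factorization(A)
    print(np.allclose(L @ D @ L.T, A))  # True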
Inexact Newton Method:
{ Line Search Newton-CG } Algorithm: LineSearchNewton_CG.py
{ Limited-memory BFGS (L-BFGS) } Algorithm: L_BFGS.py (a two-loop recursion sketch follows this list)
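A sketch of the L-BFGS two-loop recursion (Nocedal & Wright, Algorithm 7.4), which computes the quasi-Newton step from the m most recent (s, y) pairs without ever forming the inverse Hessian; names are illustrative:

    import numpy as np

    def lbfgs_direction(g, s_list, y_list):
        # Returns r ~= H_k @ g; the search direction is then -r.
        q = np.asarray(g, dtype=float).copy()
        alphas = []
        for s, y in zip(reversed(s_list), reversed(y_list)):  # newest pair first
            rho = 1.0 / (y @ s)
            a = rho * (s @ q)
            alphas.append(a)
            q = q - a * y
        # Initial scaling H_0 = gamma * I from the most recent pair.
        if s_list:
            gamma = (s_list[-1] @ y_list[-1]) / (y_list[-1] @ y_list[-1])
        else:
            gamma = 1.0
        r = gamma * q
        for (s, y), a in zip(zip(s_list, y_list), reversed(alphas)):  # oldest first
            rho = 1.0 / (y @ s)
            b = rho * (y @ r)
            r = r + (a - b) * s
        return r

    # 1-D check: with one pair from f(x) = x^2 (s=1, y=2), r equals g/2.
    print(lbfgs_direction(np.array([4.0]), [np.array([1.0])], [np.array([2.0])]))  # [2.]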
Demos using the CVXOPT and CVXPY packages:
{ CvxOpt Solve LP } Demo: CvxOptSolveLPDemo.py
{ Cvxpy Solve LP } Demo: CvxpySolveLPDemo.py (a minimal sketch follows this list)
{ Cvxpy Solve NLP } Demo: CvxpySolveNLPDemo.py
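A minimal CVXPY LP sketch; the problem data here are made up for illustration, not taken from CvxpySolveLPDemo.py:

    import numpy as np
    import cvxpy as cp

    # Hypothetical LP: maximize x1 + 2*x2 (minimize the negation)
    # subject to A x <= b and x >= 0.
    c = np.array([-1.0, -2.0])
    A = np.array([[1.0, 1.0], [2.0, 1.0]])
    b = np.array([4.0, 5.0])

    x = cp.Variable(2)
    problem = cp.Problem(cp.Minimize(c @ x), [A @ x <= b, x >= 0])
    problem.solve()
    print(problem.value, x.value)  # optimal value -8 at x = (0, 4)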
References:
Jorge Nocedal and Stephen J. Wright: Numerical Optimization, Second Edition.
Stephen Boyd and Lieven Vandenberghe: Convex Optimization.