Neural networks for multivariable function approximation & classification.
- Neural networks are universal function approximators: given enough parameters, a neural network can approximate any continuous multivariable function on a compact domain to any desired accuracy.
- The hybrid optimization algorithms in this pack are designed specifically for scientific-computing tasks, and on such problems they outperform several state-of-the-art first-order methods such as ADAM.
- More details on MATLAB File Exchange: Surface Fitting using Neural Networks (a data-preparation sketch follows this list).
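As a concrete illustration of surface fitting, the sketch below builds a training set by sampling a scalar surface. The one-sample-per-column layout is an assumption for illustration only; the pack's actual data format is documented in "GeneralGuide.mlx".

% Scattered samples of the surface f(x,y) = sin(x)*cos(y)
% (assumed layout: one sample per column)
N     = 2000;
data  = 4*pi*rand(2,N) - 2*pi;          % inputs: 2-by-N points in [-2*pi, 2*pi]^2
label = sin(data(1,:)).*cos(data(2,:)); % targets: 1-by-N surface values

For such a set, InputDimension is 2 and OutputDimension is 1 in the snippets below.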
"GeneralGuide.mlx" provides a general workflow and detailed instructions on configuring the solver. If you are unfamiliar with numerical optimization or deep learning, you can use the following command to configure the solver automatically.
% Network structure set-up, for regression
LayerStruct=[InputDimension,10,10,10,OutputDimension]; % three hidden layers of width 10
NN=Initialization(LayerStruct);
% Solver set-up
option.MaxIteration=500; % iteration budget
NN=OptimizationSolver(data,label,NN,option);
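Continuing the surface-fitting example, the trained network can then be checked on fresh samples. NN_Evaluate below is a hypothetical placeholder for the pack's prediction routine; its actual name and signature are given in "GeneralGuide.mlx".

% Test the fit on fresh points (NN_Evaluate is a hypothetical placeholder)
test  = 4*pi*rand(2,200) - 2*pi;
truth = sin(test(1,:)).*cos(test(2,:));
pred  = NN_Evaluate(NN,test);           % assumed signature, for illustration only
fprintf('Test RMSE: %.3e\n', sqrt(mean((pred - truth).^2)));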
"DigitRecognition.mlx" utilizes a simple MLP architecture and achieves an accuracy of 97.6% on the testing set of the "MNIST" handwritten digit recognition dataset.
% Network structure set-up, for classification
NN.Cost='Entropy';      % cross-entropy loss
LayerStruct=[InputDimension,128,64,16,OutputDimension];
NN=Initialization(LayerStruct,NN);
% Solver set-up
option.Solver='ADAM';
option.s0=1e-3;         % step size
option.BatchSize=512;
option.MaxIteration=30; % number of epochs
NN=OptimizationSolver(data,label,NN,option);
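Before calling the solver on MNIST, the images must be flattened into vectors and the labels encoded. The sketch below assumes raw 28-by-28 images stacked in a variable images (28-by-28-by-N) with integer labels 0-9 in digits, and uses one-hot targets in a samples-per-column layout; the encoding actually expected by the pack is shown in "DigitRecognition.mlx".

% Flatten images and one-hot encode labels (assumed variables and layout, see above)
N     = size(images,3);
data  = reshape(double(images),[28*28,N])/255;       % 784-by-N inputs scaled to [0,1]
label = zeros(10,N);
label(sub2ind([10,N],double(digits(:)')+1,1:N)) = 1; % one-hot columns, rows = classes
InputDimension  = 28*28;
OutputDimension = 10;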
Supported network architectures:
- Ordinary multilayer perceptron (MLP)
- Residual neural network (ResNet); a sketch of one residual step follows this list
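For reference, a residual network adds an identity skip around each learned transformation, y = x + F(x), which eases training of deeper models. Below is a minimal sketch of one residual forward step with assumed weight variables W and b; the pack's internal representation may differ.

% One residual forward step: y = x + tanh(W*x + b); the identity skip needs square W
x = randn(10,1);
W = 0.1*randn(10,10);  b = zeros(10,1);
y = x + tanh(W*x + b);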
Supported solvers:
- Stochastic Gradient Descent (SGD)
- Stochastic Gradient Descent with Momentum (SGDM)
- Adaptive Moment Estimation (ADAM); one update step is sketched after this list
- Adaptive Moment Estimation with Weight Decay (AdamW)
- Root Mean Square Propagation (RMSprop)
- Broyden-Fletcher-Goldfarb-Shanno method (BFGS)
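To make the first-order updates concrete, one ADAM step on a parameter vector is sketched below. s0 plays the same role as in the classification snippet above, while the decay rates and epsilon are the usual defaults from the Adam paper, not values read from the pack.

% One ADAM step for parameters w given gradient g (standard defaults assumed)
w = randn(5,1); g = randn(5,1);               % toy parameters and gradient
m = zeros(5,1); v = zeros(5,1); t = 1;        % moment estimates and step counter
beta1 = 0.9; beta2 = 0.999; eps0 = 1e-8; s0 = 1e-3;
m = beta1*m + (1-beta1)*g;                    % first-moment (mean) estimate
v = beta2*v + (1-beta2)*g.^2;                 % second-moment estimate
mhat = m/(1-beta1^t);  vhat = v/(1-beta2^t);  % bias correction
w = w - s0*mhat./(sqrt(vhat) + eps0);         % parameter update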
References:
- J. Nocedal and S. J. Wright, Numerical Optimization.
- D. Goldfarb et al., Practical Quasi-Newton Methods for Training Deep Neural Networks.
- Y. Ren et al., Kronecker-factored Quasi-Newton Methods for Deep Learning.