======
- The library automates the creation of a neural network from a user-defined number of input perceptrons, a number of hidden layers (with a number of hidden perceptrons for each layer), and a number of output nodes.
- The library currently accepts time-series data or a plain list of values as input data for the network.
- Input data is preprocessed (log-normalized) before training.
- The library trains the network on a randomized 80% of the data points and tests it on the remaining 20%.
- The neural network is evaluated using the mean squared prediction error (MSPE), which also serves as the stopping criterion for training.
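
The randomized 80/20 split and the MSPE evaluation described above can be sketched as follows. This is a language-neutral Python illustration, not the library's Erlang API; the function names (`split_80_20`, `mspe`) are hypothetical.

```python
import random

def mspe(targets, predictions):
    """Mean squared prediction error over paired target/prediction lists."""
    n = len(targets)
    return sum((t - p) ** 2 for t, p in zip(targets, predictions)) / n

def split_80_20(points, seed=None):
    """Shuffle the data points, then take 80% for training and 20% for testing."""
    shuffled = list(points)
    random.Random(seed).shuffle(shuffled)
    cut = int(len(shuffled) * 0.8)
    return shuffled[:cut], shuffled[cut:]
```

Training would then loop over the 80% partition until the MSPE on the 20% partition falls below a chosen threshold.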
- Design: Single perceptron = single Erlang process
- Data Preprocessing: Log Normalization
- Activation Function: Sigmoid Function
- Learning Algorithm: Backpropagation
- Training Sequence: Training -> Testing -> Error Evaluation
- Evaluation Criteria: Mean Squared Prediction Error (MSPE)
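
As a sketch of the "Log Normalization" preprocessing step: one common form maps positive values through the logarithm and rescales them to [0, 1]. This is an illustrative Python version under that assumption; the library's exact formula may differ.

```python
import math

def log_normalize(values):
    """Log-transform positive values, then min-max rescale to [0, 1].
    (One common form of log normalization; assumed, not the library's exact formula.)"""
    logs = [math.log(v) for v in values]
    lo, hi = min(logs), max(logs)
    if hi == lo:
        # All values equal: map everything to 0 to avoid division by zero.
        return [0.0 for _ in logs]
    return [(x - lo) / (hi - lo) for x in logs]
```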
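
The sigmoid activation and the backpropagation update for a single perceptron (the per-process unit in the design above) can be sketched as follows. This is a Python illustration of the output-layer case of backpropagation with squared error, not the library's Erlang implementation; the name `train_step` and the learning rate are assumptions.

```python
import math

def sigmoid(x):
    """Sigmoid activation: maps any real input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def train_step(weights, inputs, target, lr=0.5):
    """One gradient-descent update for a single sigmoid perceptron.
    For squared error, dE/dnet = (out - target) * out * (1 - out)."""
    net = sum(w * i for w, i in zip(weights, inputs))
    out = sigmoid(net)
    delta = (out - target) * out * (1.0 - out)
    new_weights = [w - lr * delta * i for w, i in zip(weights, inputs)]
    return new_weights, out
```

In the library's design each such perceptron runs as its own Erlang process, exchanging activations and deltas by message passing rather than shared state.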
GNU GPL v3
2013 MAGNUM TEAM. Mindanao State University - Iligan Institute of Technology.