This project explores evolving transfer functions for neural networks using symbolic regression. It is implemented using DeepLearning4Java (DL4J).
The installation procedure can be found here.
- Defines common structures and procedures for the evolution process.
- Evolutionary search algorithms and structures.
- Methods for search and analysis of the initial architecture for a given dataset.
- Custom data structures and pipelines.
- Includes Parser for DPAv2/4 CSV files.
- Demos on the usage of DL4J and TF frameworks.
- Various common utilities.
- Wrapper for the neural network, used to hide unnecessary features.
- Softly prevents unwanted model modifications (for comparable results).
- Defines the common procedures and structures used for model training.
- Ensures comparability of experiment results.
- Defines immutable parameters used for model training.
- Ensures comparability of experiment results.
- Defines model performance on the test set.
- Workhorse for procedures related to storing and loading experiments.
- Defines the storage paths of an experiment.
- Used to run custom experiments using the DL4J model and data APIs.
- Demonstrates the usage of the custom data pipeline.
- Demonstrates training a neural network with DeepLearning4Java on MNIST and custom datasets.
- Demonstrates training a neural network with the TensorFlow Java Ops API.
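The immutable training parameters mentioned above are typically realized as a class whose fields are all final and set once at construction. The sketch below is a hypothetical illustration of that pattern; the class and field names are assumptions, not the project's actual API.

```java
// Hypothetical sketch of an immutable training-parameters holder:
// every field is final and assigned exactly once in the constructor,
// so a stored experiment's configuration cannot drift between runs.
final class TrainParams {
    final int epochs;
    final int batchSize;
    final double learningRate;
    final long seed; // a fixed RNG seed keeps experiment results comparable

    TrainParams(int epochs, int batchSize, double learningRate, long seed) {
        this.epochs = epochs;
        this.batchSize = batchSize;
        this.learningRate = learningRate;
        this.seed = seed;
    }
}
```

Because the class has no setters and all fields are final, reloading a saved experiment always reproduces the exact configuration it was trained with.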
To load data, construct a pipeline from the following components:
- Parser - reads dataset description markers and passes the actual data through.
- Cacher - loads the whole stream into an array (for faster inference), optionally applying data modifiers.
- Batcher - constructs batches from the input stream.
- DatumF - wrapper class used to construct and hold a Tensor pair (batched or not).
- IModifier - interface for modifiers that transform the data (normalization, randomization before batching, etc.).
- Randomizer - randomizes the order of instances in the dataset.
The get() method fetches the next datum; the reset() method resets a stage's internals so the inference procedure can start over.
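The pipeline contract described above (stages chained together, each exposing get() and reset()) can be sketched in plain Java. The stage names below mirror the components listed, but the interface and all signatures are assumptions for illustration, not the project's actual API.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Minimal sketch of the assumed pipeline contract:
// every stage exposes get() and reset().
interface Pipe<T> {
    T get();      // fetch the next datum, or null when the stream is exhausted
    void reset(); // reset internals so iteration can start over
}

// Cacher-like stage: eagerly loads the whole upstream into an array
// so repeated passes over the data are fast.
class Cacher<T> implements Pipe<T> {
    private final List<T> cache = new ArrayList<>();
    private int pos = 0;

    Cacher(Pipe<T> upstream) {
        for (T t = upstream.get(); t != null; t = upstream.get()) {
            cache.add(t);
        }
    }

    @Override public T get() { return pos < cache.size() ? cache.get(pos++) : null; }
    @Override public void reset() { pos = 0; }
}

// Batcher-like stage: groups upstream instances into fixed-size batches;
// the last batch may be smaller than batchSize.
class Batcher<T> implements Pipe<List<T>> {
    private final Pipe<T> upstream;
    private final int batchSize;

    Batcher(Pipe<T> upstream, int batchSize) {
        this.upstream = upstream;
        this.batchSize = batchSize;
    }

    @Override public List<T> get() {
        List<T> batch = new ArrayList<>();
        T t;
        while (batch.size() < batchSize && (t = upstream.get()) != null) {
            batch.add(t);
        }
        return batch.isEmpty() ? null : batch;
    }

    @Override public void reset() { upstream.reset(); }
}
```

A Parser would sit at the head of such a chain as the source stage, and a Randomizer or other IModifier would be applied inside the Cacher before batching; calling reset() on the last stage propagates up the chain so the whole pipeline starts over.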