Code for our paper *Deep Learning and genetic algorithms for cosmological Bayesian inference speed-up*, accepted in Physical Review D and available at https://arxiv.org/abs/2405.03293.
- neuralike/
- `NeuraLike.py`: Main class; gathers all the other classes.
- `NeuralManager.py`: API class; manager for the neural networks that learn the likelihood function over a grid.
- `NeuralNet.py`: Class defining the neural network architecture in PyTorch.
- `RandomSampling.py`: Creates random samples in the parameter space and evaluates the likelihood at them; used to generate the training set for a neural network.
- `pytorchtools.py`: Methods and utilities for PyTorch.
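The training-set generation described for `RandomSampling.py` can be sketched as follows. This is an illustrative, self-contained example, not the repository's API: the toy Gaussian log-likelihood and the function names are hypothetical stand-ins for an expensive cosmological likelihood.

```python
import numpy as np

def toy_loglike(theta):
    """Toy Gaussian log-likelihood; stand-in for an expensive cosmological likelihood."""
    return -0.5 * np.sum(theta ** 2)

def random_training_set(loglike, bounds, n_samples, seed=0):
    """Draw uniform random points inside the parameter box and evaluate loglike at each.

    bounds: sequence of (low, high) pairs, one per parameter.
    Returns (samples, values) suitable as a neural-network training set.
    """
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)
    low, high = bounds[:, 0], bounds[:, 1]
    samples = rng.uniform(low, high, size=(n_samples, len(bounds)))
    values = np.array([loglike(t) for t in samples])
    return samples, values

# Build a training set of 100 points in a 2D parameter box.
X, y = random_training_set(toy_loglike, bounds=[(-5, 5), (-5, 5)], n_samples=100)
```

A network such as the one in `NeuralNet.py` would then be fit to `(X, y)` so that it can replace the exact likelihood during sampling.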
In the neuralike branch of the repository https://github.com/igomezv/simplemc_tests, neuralike is available integrated with the dynesty nested-sampling library inside the SimpleMC cosmological parameter estimation code (https://igomezv.github.io/SimpleMC/).
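The integration idea, replacing the expensive likelihood with a trained surrogate once enough exact evaluations have been collected, can be sketched with a simple wrapper. Everything below is illustrative and hypothetical: the real implementation lives in the neuralike branch and trains a PyTorch network, whereas this sketch uses a nearest-neighbour lookup as a cheap stand-in surrogate.

```python
import numpy as np

class SurrogateLikelihood:
    """Wrap an expensive log-likelihood; after `train_after` exact calls,
    answer from the cache via nearest-neighbour lookup (a stand-in for
    a trained neural network surrogate)."""

    def __init__(self, loglike, train_after=50):
        self.loglike = loglike
        self.train_after = train_after
        self.cache_x, self.cache_y = [], []

    def __call__(self, theta):
        theta = np.asarray(theta, dtype=float)
        if len(self.cache_x) < self.train_after:
            val = self.loglike(theta)          # exact, expensive evaluation
            self.cache_x.append(theta)
            self.cache_y.append(val)
            return val
        X = np.array(self.cache_x)
        i = np.argmin(np.sum((X - theta) ** 2, axis=1))  # nearest cached point
        return self.cache_y[i]

calls = {"n": 0}
def expensive_loglike(theta):
    calls["n"] += 1                            # count exact evaluations
    return -0.5 * np.sum(np.asarray(theta) ** 2)

wrapped = SurrogateLikelihood(expensive_loglike, train_after=10)
rng = np.random.default_rng(1)
for _ in range(100):
    wrapped(rng.uniform(-3, 3, size=2))
# only the first 10 of the 100 calls hit the expensive likelihood
```

A sampler such as dynesty would call `wrapped` in place of the exact log-likelihood; the speed-up comes from the later calls being answered by the surrogate.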
We based our work on the following external codes:
- Philosophy of the method
- Nested sampling library
- Cosmological parameter estimation
- Genetic algorithms library
- Deep learning library
If you use this work in your research, please cite:
@article{neuralike,
title={Deep Learning and genetic algorithms for cosmological Bayesian inference speed-up},
author={G{\'o}mez-Vargas, Isidro and V{\'a}zquez, J Alberto},
journal={arXiv preprint arXiv:2405.03293},
year={2024}
}
If you find our nnogada framework for hyperparameter tuning of neural networks with genetic algorithms useful, please also cite:
@article{nnogada,
title={Neural networks optimized by genetic algorithms in cosmology},
author={G{\'o}mez-Vargas, I. and Andrade, J. B. and V{\'a}zquez, J. A.},
journal={Physical Review D},
volume={107},
number={4},
pages={043509},
year={2023},
publisher={American Physical Society},
doi={10.1103/PhysRevD.107.043509},
url={https://doi.org/10.48550/arXiv.2209.02685}
}