Version 0.6.1 of theanets is now live!
pip install -U theanets
http://pypi.python.org/pypi/theanets
http://theanets.readthedocs.org
http://github.com/lmjohns3/theanets
The biggest change in this release series is a Network/Layer refactor that preserves the existing API but permits much more flexible network layouts. Layers can now output multiple named values; by default most layer types generate an "out" (the traditional layer output) as well as a "pre" (the layer's pre-activation value). There's a short sketch of the new layer specifications after the list below. Other notable changes include:
- The semantics of the "rect:min" and "rect:max" activations have been reversed -- "rect:min" now gives g(z) = min(1, z) and "rect:max" now gives g(z) = max(0, z). The "relu" activation still means g(z) = max(0, z); there's a quick numerical check below.
- Theanets now uses Travis CI and Coveralls.io to run builds and measure test coverage automatically -- see https://travis-ci.org/lmjohns3/theanets and https://coveralls.io/r/lmjohns3/theanets. Test coverage increased from 76% to 91%.
- The documentation has been expanded and, I hope, made clearer. There's always more room for improvement here!
- Activation functions are now first-class objects. New activation functions include Prelu, LGrelu, and Maxout; the first sketch below shows how an activation is named in a layer specification.
- Loading and saving models now goes through the standard pickle module; see the save/load sketch below.
- Almost all of the trainers have moved to a new package; see http://downhill.readthedocs.org. A training sketch appears below.
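To give a feel for the refactored layers, here is a minimal sketch of building a model with explicit layer specifications. The dict-based layer spec and the 'prelu' activation name reflect my reading of the docs, so treat the exact keywords as assumptions rather than the definitive API:

    import theanets

    # A regressor with two hidden layers given as dicts; now that
    # activation functions are first-class objects, each layer can
    # name its activation directly ('prelu' is assumed to select the
    # new Prelu activation).
    net = theanets.Regressor(layers=[
        10,                                  # input layer: 10 variables
        dict(size=100, activation='prelu'),  # hidden layer "hid1"
        dict(size=50, activation='relu'),    # hidden layer "hid2"
        1,                                   # output layer: 1 target
    ])

After the refactor, each layer exposes its values under names that I assume combine the layer name with the output name, e.g. "hid1:out" for the post-activation output and "hid1:pre" for the pre-activation value; this is what makes multi-output layouts possible.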
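The reversed rectification semantics are easy to check numerically. Here is a plain numpy sketch of the three functions exactly as defined above:

    import numpy as np

    def rect_min(z):
        # "rect:min" now gives g(z) = min(1, z)
        return np.minimum(1, z)

    def rect_max(z):
        # "rect:max" now gives g(z) = max(0, z)
        return np.maximum(0, z)

    def relu(z):
        # "relu" is unchanged: g(z) = max(0, z)
        return np.maximum(0, z)

    z = np.array([-1.0, 0.5, 2.0])
    print(rect_min(z))  # [-1.   0.5  1. ]
    print(rect_max(z))  # [ 0.   0.5  2. ]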
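Since serialization now goes through the standard pickle module, a saved model is just a pickle file. The save method shown here existed in earlier releases, but consider this a sketch rather than a guarantee of the exact call:

    import pickle
    import theanets

    net = theanets.Regressor(layers=[10, 50, 1])
    net.save('model.pkl')   # assumed to pickle the network to disk

    # Because it's ordinary pickle data, the standard module can read
    # it back directly.
    with open('model.pkl', 'rb') as handle:
        restored = pickle.load(handle)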
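Training itself still happens through the network object, while the optimization algorithms now live in downhill. A hedged sketch, assuming the train method takes an algo keyword naming a downhill optimizer such as 'nag' (Nesterov's accelerated gradient):

    import numpy as np
    import theanets

    net = theanets.Regressor(layers=[10, 50, 1])

    # Hypothetical toy data: 1000 samples, 10 inputs, 1 target each.
    inputs = np.random.randn(1000, 10).astype('f')
    targets = np.random.randn(1000, 1).astype('f')

    # The 'algo' and 'learning_rate' keyword names are assumptions.
    net.train([inputs, targets], algo='nag', learning_rate=1e-3)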
As a reminder, the 0.7.x release series will incorporate several big changes. Most importantly, recurrent models will reorder the axes for input/output data; see goo.gl/kXB4Db for details.
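For anyone who wants to prepare data ahead of that change: assuming the new layout is time-major, i.e. (num-time-steps, num-examples, num-variables) instead of the current (num-examples, num-time-steps, num-variables), a single transpose converts existing arrays:

    import numpy as np

    # Current layout: 32 sequences, 100 time steps, 8 variables each.
    batch_major = np.zeros((32, 100, 8), dtype='f')

    # Assumed 0.7.x layout: time steps first, then examples/variables.
    time_major = np.transpose(batch_major, (1, 0, 2))
    print(time_major.shape)  # (100, 32, 8)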
As always, I hope you find the library useful! Please file bugs, post on the mailing list, etc. as you run into questions or issues.