Description
Here are five things I would like to see added to neural-fortran:
- Batch normalization layers
- Dropout layers (although for what I'm working on now, batch normalization works much better)
- Adam optimizer (a rough sketch of the update rule follows this list)
- K-fold cross-validation
- Linear layer for output (need this for regression)
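
For concreteness, here is a minimal sketch of the Adam update for a single weight array. It assumes nothing about neural-fortran's internals; the module name, `adam_update`, and the `rk` kind parameter are all placeholders, not existing API:

```fortran
! Hypothetical sketch of the Adam update rule (Kingma & Ba, 2014).
! Names and kinds are placeholders, not neural-fortran API.
module adam_sketch
  implicit none
  integer, parameter :: rk = kind(1.0)  ! would match the library's real kind
contains
  subroutine adam_update(w, grad, m, v, t, lr, beta1, beta2, eps)
    ! One in-place Adam step: update first/second moment estimates,
    ! apply bias correction, then step the weights.
    real(rk), intent(inout) :: w(:), m(:), v(:)
    real(rk), intent(in) :: grad(:)
    integer, intent(in) :: t  ! time step, starting at 1
    real(rk), intent(in) :: lr, beta1, beta2, eps
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    ! bias-corrected moments, folded into the weight update
    w = w - lr * (m / (1 - beta1**t)) / (sqrt(v / (1 - beta2**t)) + eps)
  end subroutine adam_update
end module adam_sketch
```

The per-array state (`m`, `v`, `t`) would need to live alongside each layer's weights, which is the main design question for fitting this into the library.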
Also, FYI: I was able to get neural-fortran to compile and run on a Cray XC40 using Cray's compilers, but I had to do a little code surgery to make it work. Namely, I had to eliminate the dependency on both functional (you're only using the `reverse` function, so pulling in an entire library for one function that's simple to replicate is overkill IMHO) and h5fortran (the CCE compilers did not like some of the unlimited polymorphic logic). I just figured out what was needed and replicated it in a standalone utility (roughly the sketch below).
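
The standalone replacement for functional's `reverse` is essentially a one-liner; shown here for a real array, on the assumption that the element type would be adapted to match whatever neural-fortran actually passes in:

```fortran
! Standalone stand-in for functional's reverse; sketched for real
! arrays, the element type would match the actual call sites.
pure function reverse(x) result(res)
  real, intent(in) :: x(:)
  real :: res(size(x))
  res = x(size(x):1:-1)  ! array section with negative stride
end function reverse
```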
Since I have access to a Cray with tens of thousands of processors, I would like to use it both to train on very large datasets and to use a Keras-generated model to predict thousands of potential values.