
Add batch normalization and Adam optimizer #114

Open
@rweed


Here are five things I would like to see added to neural-fortran:

  1. Batch normalization layers
  2. Dropout layers (although for what I'm working on now batch normalization works much better)
  3. Adam optimizer (see the sketch after this list)
  4. K-fold cross-validation
  5. Linear layer for output (need this for regression)
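For reference on item 3, Adam keeps exponential moving averages of the gradient and of its square, applies bias correction, and scales each parameter's step by the corrected second moment. Below is a minimal Fortran sketch of one update step; the subroutine and argument names are hypothetical, not part of the neural-fortran API:

```fortran
! Hypothetical sketch of a single Adam update step (illustrative only,
! not the neural-fortran API).
subroutine adam_update(w, grad, m, v, t, lr, beta1, beta2, eps)
  implicit none
  real, intent(in out) :: w(:)        ! parameters to update
  real, intent(in)     :: grad(:)     ! gradient of the loss w.r.t. w
  real, intent(in out) :: m(:), v(:)  ! running first and second moment estimates
  integer, intent(in)  :: t           ! 1-based time step
  real, intent(in)     :: lr, beta1, beta2, eps

  real :: m_hat(size(w)), v_hat(size(w))

  ! update biased moment estimates
  m = beta1 * m + (1 - beta1) * grad
  v = beta2 * v + (1 - beta2) * grad**2

  ! bias-corrected estimates
  m_hat = m / (1 - beta1**t)
  v_hat = v / (1 - beta2**t)

  ! parameter update
  w = w - lr * m_hat / (sqrt(v_hat) + eps)
end subroutine adam_update
```

The usual defaults from the Adam paper are lr = 1.0e-3, beta1 = 0.9, beta2 = 0.999, and eps = 1.0e-8.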

Also, FYI: I was able to get neural-fortran to compile and run on a Cray XC40 using Cray's compilers, but I had to do a little code surgery to make it work. Namely:

I had to eliminate the dependency on both functional (you're only using the reverse function, so pulling in an entire library for one simple-to-replicate function is overkill IMHO) and h5fortran (the CCE compilers did not like some of the unlimited polymorphic logic). I just figured out what was needed and replicated it in a standalone utility.
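For reference, a standalone replacement for the one piece of functional that neural-fortran uses can be as small as the sketch below (a real-array version is assumed here; the element type would need to match whatever neural-fortran actually reverses):

```fortran
! Minimal standalone reverse; the real element type is an assumption and
! should be adapted to whatever array neural-fortran passes through it.
pure function reverse(x) result(res)
  implicit none
  real, intent(in) :: x(:)
  real :: res(size(x))
  res = x(size(x):1:-1)
end function reverse
```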

Since I have access to a Cray with tens of thousands of processors, I would like to use it both to train on very large datasets and to run a Keras-generated model to predict thousands of potential values.
