@linusmartensson commented Mar 9, 2018

So, I had a problem with two models that led me to implement this.
In the first, I had a free-form batch shape, which isn't compatible with keras-js. In the second, automatic parameter renaming in newer versions of Keras renamed a node from "class_scores_1" to "class_scores" while keeping the original weight name, "class_scores_1/kernel:0", so the model failed to load.

With this adjustment hook, I can identify such error conditions and manually apply any necessary corrections, without waiting for the inconsistencies to be resolved upstream or juggling different versions:

const detector = new KerasJS.Model({
    filepath: 'detector.bin',
    gpu: true,
    adjustment: (config) => {
        // Pin the free-form batch shape to a fixed 1x360x640 input.
        config.config.layers[0].config.batch_input_shape[0] = 1;
        config.config.layers[0].config.batch_input_shape[1] = 360;
        config.config.layers[0].config.batch_input_shape[2] = 640;
        return config;
    }
});
const segmentation = new KerasJS.Model({
    filepath: 'jnet.bin',
    gpu: false,
    adjustment: (config) => {
        // Restore the node name Keras renamed, so it matches the stored
        // weight name "class_scores_1/kernel:0".
        config.config.layers[33].config.name = "class_scores_1";
        config.config.layers[33].name = "class_scores_1";
        config.config.layers[34].inbound_nodes[0][0][0] = "class_scores_1";
        return config;
    }
});

Having this hook may allow users to work more consistently with the still somewhat experimental API, and I think it could be useful on a broader scale. For me, it was critical to getting the implementation running.

Since keras-js isn't fully featured, and there are some inconsistencies between Keras versions that are not yet handled, this hook lets us modify the modelConfig object during load and resolve such version inconsistencies manually.
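To make the mechanism concrete, here is a minimal sketch of how a loader could apply such a hook to the parsed model config before instantiating layers. The `applyAdjustment` helper and the surrounding structure are assumptions for illustration, not the actual keras-js internals:

```javascript
// Hypothetical loader step: run the user-supplied adjustment hook, if any,
// on the parsed model config before layers are built.
function applyAdjustment(modelConfig, adjustment) {
  if (typeof adjustment !== 'function') {
    return modelConfig;
  }
  const adjusted = adjustment(modelConfig);
  // Tolerate hooks that mutate in place and forget to return the config.
  return adjusted !== undefined ? adjusted : modelConfig;
}

// Example: pin a free-form batch shape, as in the detector model above.
const config = {
  config: { layers: [{ config: { batch_input_shape: [null, null, null, 3] } }] }
};
const fixed = applyAdjustment(config, (c) => {
  c.config.layers[0].config.batch_input_shape[0] = 1;
  c.config.layers[0].config.batch_input_shape[1] = 360;
  c.config.layers[0].config.batch_input_shape[2] = 640;
  return c;
});
```

Returning the (possibly replaced) config, rather than relying on in-place mutation, also leaves room for hooks that build an entirely new config object.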