
Conversation

@IFuentesSR

Each time the fit method was called, the weights were randomly initialized, which prevented a warm start. We therefore added a way to let the model resume training.
Regards

        dtype=np.float64)

        self.vectors_sum_gradients = np.ones_like(self.word_vectors)
        self.biases_sum_gradients = np.ones_like(self.word_biases)


Please move vectors_sum_gradients and biases_sum_gradients inside the if statement:

if initial_epoch == 0:
        ...
        self.vectors_sum_gradients = np.ones_like(self.word_vectors)
        self.biases_sum_gradients = np.ones_like(self.word_biases)

because these accumulators are used to update the learning rate in fit_vectors; resetting them on every call would discard the accumulated gradient history that a warm start needs.
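The suggestion above can be sketched as follows. This is a minimal, hypothetical stand-in (the class name `TinyGlove` and the simplified `fit` signature are assumptions, not the library's actual API): the weights *and* the AdaGrad-style sum-of-gradients accumulators are only (re)initialized on a cold start, so a call with a non-zero `initial_epoch` resumes from the previous state.

```python
import numpy as np

class TinyGlove:
    """Hypothetical sketch of warm-start initialization, not the real implementation."""

    def __init__(self, no_components=10):
        self.no_components = no_components
        self.word_vectors = None
        self.word_biases = None
        self.vectors_sum_gradients = None
        self.biases_sum_gradients = None

    def fit(self, n_words, initial_epoch=0):
        # Only (re)initialize on a cold start. Crucially, the AdaGrad
        # accumulators stay inside this guard too, so a warm start keeps
        # the per-parameter learning-rate history intact.
        if initial_epoch == 0:
            self.word_vectors = ((np.random.rand(n_words, self.no_components) - 0.5)
                                 / self.no_components).astype(np.float64)
            self.word_biases = np.zeros(n_words, dtype=np.float64)
            self.vectors_sum_gradients = np.ones_like(self.word_vectors)
            self.biases_sum_gradients = np.ones_like(self.word_biases)
        # ... the training loop would run here, updating the weights and
        # adding squared gradients into the *_sum_gradients arrays ...
```

Calling `fit(n_words, initial_epoch=k)` with `k > 0` then leaves both the weights and the accumulators untouched, which is exactly why the reviewer asks for the accumulators to live inside the guard.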

