From 1d91886bee04e641be12b2f87b81e0a9074b7c72 Mon Sep 17 00:00:00 2001
From: YangTao848 <31799659+YangTao848@users.noreply.github.com>
Date: Thu, 18 Oct 2018 21:56:47 +0800
Subject: [PATCH] Update Neural+machine+translation+with+attention+-+v3.ipynb

Update: use a = Bidirectional(LSTM(units=n_a, return_sequences=True))(X)
in def model(Tx, Ty, n_a, n_s, human_vocab_size, machine_vocab_size),
so that attention_map works correctly.
---
 .../Week3/Neural+machine+translation+with+attention+-+v3.ipynb | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/Sequence Models/Week3/Neural+machine+translation+with+attention+-+v3.ipynb b/Sequence Models/Week3/Neural+machine+translation+with+attention+-+v3.ipynb
index 1cffcf8..c7c165e 100644
--- a/Sequence Models/Week3/Neural+machine+translation+with+attention+-+v3.ipynb
+++ b/Sequence Models/Week3/Neural+machine+translation+with+attention+-+v3.ipynb
@@ -428,7 +428,7 @@
     "    ### START CODE HERE ###\n",
     "    \n",
     "    # Step 1: Define your pre-attention Bi-LSTM. Remember to use return_sequences=True. (≈ 1 line)\n",
-    "    a = Bidirectional(LSTM(n_a, return_sequences=True))(X)\n",
+    "    a = Bidirectional(LSTM(units=n_a, return_sequences=True))(X)\n",
     "    \n",
     "    # Step 2: Iterate for Ty steps\n",
     "    for t in range(Ty):\n",
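
Note: for context, below is a minimal runnable sketch of the model() function this hunk patches, reconstructed from the assignment's template. The diff itself only shows Step 1; the helper layers (one_step_attention, post_activation_LSTM_cell, output_layer, the densor/activator layers) and the dimension constants are assumptions filled in here so the sketch is self-contained, not the exact notebook code.

import keras.backend as K
from keras.layers import (Activation, Bidirectional, Concatenate, Dense,
                          Dot, Input, LSTM, RepeatVector)
from keras.models import Model

# Dimension constants matching the assignment's defaults (assumptions).
Tx, Ty, n_a, n_s = 30, 10, 32, 64
human_vocab_size, machine_vocab_size = 37, 11

def softmax_over_time(x):
    # Softmax over axis 1 (the Tx axis), as in the notebook's nmt_utils,
    # so the attention weights form a distribution over input time steps.
    e = K.exp(x - K.max(x, axis=1, keepdims=True))
    return e / K.sum(e, axis=1, keepdims=True)

# Shared layers: defined once, globally, so the same weights are reused
# at every one of the Ty decoding steps.
repeator = RepeatVector(Tx)
concatenator = Concatenate(axis=-1)
densor1 = Dense(10, activation="tanh")
densor2 = Dense(1, activation="relu")
activator = Activation(softmax_over_time, name="attention_weights")
dotor = Dot(axes=1)
post_activation_LSTM_cell = LSTM(n_s, return_state=True)
output_layer = Dense(machine_vocab_size, activation="softmax")

def one_step_attention(a, s_prev):
    # Compute one context vector from the Bi-LSTM activations `a`
    # and the previous post-attention decoder state `s_prev`.
    s_prev = repeator(s_prev)              # (m, Tx, n_s)
    concat = concatenator([a, s_prev])     # (m, Tx, 2*n_a + n_s)
    e = densor1(concat)
    energies = densor2(e)                  # (m, Tx, 1)
    alphas = activator(energies)           # attention weights over Tx
    return dotor([alphas, a])              # (m, 1, 2*n_a) context vector

def model(Tx, Ty, n_a, n_s, human_vocab_size, machine_vocab_size):
    X = Input(shape=(Tx, human_vocab_size))
    s0 = Input(shape=(n_s,), name="s0")
    c0 = Input(shape=(n_s,), name="c0")
    s, c = s0, c0
    outputs = []

    # Step 1: pre-attention Bi-LSTM over the whole input sequence.
    # This is the patched line; units is passed as a keyword argument.
    a = Bidirectional(LSTM(units=n_a, return_sequences=True))(X)

    # Step 2: iterate for Ty steps, emitting one output character per step.
    for t in range(Ty):
        context = one_step_attention(a, s)
        s, _, c = post_activation_LSTM_cell(context, initial_state=[s, c])
        outputs.append(output_layer(s))

    return Model(inputs=[X, s0, c0], outputs=outputs)

m = model(Tx, Ty, n_a, n_s, human_vocab_size, machine_vocab_size)
m.summary()

Functionally, LSTM(n_a, ...) and LSTM(units=n_a, ...) build the same layer; the keyword form matters here because the notebook's attention_map plotting step expects the layer to have been constructed that way.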