Cannot implement multiple stacked bidirectional RNNs
I am trying to implement a Seq2Seq variant in TensorFlow that has two encoders and one decoder. The first layer of each encoder is a bidirectional LSTM, so I wrote this helper to build a stack of bidirectional LSTMs with a variable number of layers:



import tensorflow as tf
from tensorflow.contrib.rnn import LSTMCell

def bidirectional_lstm(batch, num_layers=2, hidden_layer=256):
    # One forward and one backward cell per layer; each direction gets
    # half of the hidden units (integer division so num_units is an int).
    forward_lstms = [LSTMCell(num_units=hidden_layer // 2) for _ in range(num_layers)]
    backward_lstms = [LSTMCell(num_units=hidden_layer // 2) for _ in range(num_layers)]

    # Zero initial states for every layer in both directions.
    states_fw = [f_l.zero_state(BATCH_SIZE, tf.float64) for f_l in forward_lstms]
    states_bw = [b_l.zero_state(BATCH_SIZE, tf.float64) for b_l in backward_lstms]

    outputs, final_state_fw, final_state_bw = tf.contrib.rnn.stack_bidirectional_dynamic_rnn(
        forward_lstms,
        backward_lstms,
        batch,
        initial_states_fw=states_fw,
        initial_states_bw=states_bw,
        parallel_iterations=32,
    )

    return outputs


But when I run the lines below:



a = bidirectional_lstm(a_placeholder)

b = bidirectional_lstm(b_placeholder, num_layers=1)


I get this error message:



ValueError: Variable
stack_bidirectional_rnn/cell_0/bidirectional_rnn/fw/lstm_cell/kernel
already exists, disallowed. Did you mean to set reuse=True or
reuse=tf.AUTO_REUSE in VarScope? Originally defined at:
  File "/usr/local/lib/python3.6/dist-packages/tensorflow/contrib/rnn/python/ops/rnn.py",
  line 233, in stack_bidirectional_dynamic_rnn
    time_major=time_major)


I do not want to "reuse" a given stacked bidirectional LSTM. How can I run two separate encoders, each with its own stack of bidirectional LSTMs?










python-3.x tensorflow

asked Nov 21 '18 at 0:03
tdr
1 Answer
Figured it out: the two encoders need to be built in two different variable scopes. That gives each encoder its own set of variables, so their weights no longer collide at graph-construction time and their gradient updates stay separate:



with tf.variable_scope("a"):
    a = bidirectional_lstm(a_placeholder)

with tf.variable_scope("b"):
    b = bidirectional_lstm(b_placeholder)
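
For completeness, a minimal end-to-end sketch of the fix. The placeholder shapes and the names SEQ_LEN and INPUT_DIM are illustrative assumptions, not from the original post; BATCH_SIZE is assumed to be defined as in the question:

import tensorflow as tf

BATCH_SIZE = 32   # assumed value; defined elsewhere in the original code
SEQ_LEN = 50      # hypothetical sequence length
INPUT_DIM = 64    # hypothetical feature dimension

a_placeholder = tf.placeholder(tf.float64, [BATCH_SIZE, SEQ_LEN, INPUT_DIM])
b_placeholder = tf.placeholder(tf.float64, [BATCH_SIZE, SEQ_LEN, INPUT_DIM])

# Each encoder lives in its own variable scope, so its LSTM kernels get
# unique names such as "a/stack_bidirectional_rnn/cell_0/..." and
# "b/stack_bidirectional_rnn/cell_0/..." instead of clashing.
with tf.variable_scope("a"):
    a = bidirectional_lstm(a_placeholder)

with tf.variable_scope("b"):
    b = bidirectional_lstm(b_placeholder, num_layers=1)

# Sanity check: list variable names to confirm the two encoders are separate.
for v in tf.global_variables():
    print(v.name)

Note that reuse=tf.AUTO_REUSE, which the error message suggests, is for the opposite situation, where you deliberately want the two calls to share one set of weights; for two independent encoders, distinct scopes as above are the right fix.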





answered Nov 21 '18 at 8:45
tdr