
Question Positioning of LSTMs in the network architecture

Discussion in 'ML-Agents' started by Gibser, Jul 1, 2023.

Gibser

    Dec 15, 2020
    Hi, I was trying to figure out where the recurrent layer sits in the neural network structure for the policy and value function. Reading the code on GitHub, my understanding is that the recurrent LSTM layer comes right after the network built from the num_layers and hidden_units specified in the .yaml file.
    So, for example, if I have
    • num_layers: 2
    • hidden_units: 128
    • memory:
      • memory_size: 128
      • sequence_length: 64
    I will have a network of 2 layers with 128 neurons each (a LinearEncoder or ConditionalEncoder in the Python code), followed by one recurrent layer (by default, only one recurrent layer is created, if I checked the code correctly) with 64 features (memory_size / 2) in its hidden state, which outputs the actions (in the case of the policy) to be performed. Is this correct, or am I skipping something?