Non-constant Learning Rate graph on Tensorboard explanation

Discussion in 'ML-Agents' started by Evercloud, May 9, 2020.

  1. Evercloud

    Evercloud

    Joined:
    Apr 29, 2013
    Posts:
    15
    Hey everybody! :)

    Just a quick question: I thought the TensorBoard Learning Rate graph was supposed to be constant. Can somebody please help me understand the attached picture?
    Thanks!
     

  2. Evercloud

    Evercloud

    Joined:
    Apr 29, 2013
    Posts:
    15
    I have been told that "learning rate normally starts high and then decays with more steps. if you stop and start training and change some settings you will see this sort of thing".
    Although I can't remember stopping that specific training run so many times, that's already a plausible explanation.
    Yet I don't understand why the spikes are sometimes positive and sometimes negative, almost alternately. The scale of the y-axis itself is always the same; it almost seems like a TensorBoard quirk.
    Any in-depth explanation would still be most appreciated. :)
     
    Last edited: May 11, 2020
  3. andrewcoh_unity

    andrewcoh_unity

    Unity Technologies

    Joined:
    Sep 5, 2019
    Posts:
    162
    ML-Agents supports two learning rate schedules: constant and linear. A constant schedule means that the learning rate remains fixed for the entirety of an experiment. A linear schedule means that the learning rate decays over time. The idea is that the learning rate should decay as the policy converges to an optimal solution. However, it is sometimes the case that the problem changes significantly as the experiment continues (as in self-play), so keeping the learning rate fixed allows an agent to adapt more readily later on in the experiment.
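    The two schedules can be sketched as follows. This is a minimal illustration with hypothetical helper names, not the actual ML-Agents implementation; the `min_lr` floor is an assumption to keep the decayed rate positive.

    ```python
    def linear_lr(initial_lr: float, step: int, max_steps: int,
                  min_lr: float = 1e-10) -> float:
        """Linear schedule: decay from initial_lr toward min_lr over max_steps."""
        # Fraction of training remaining, clamped so it never goes negative
        remaining = max(0.0, 1.0 - step / max_steps)
        return max(min_lr, initial_lr * remaining)

    def constant_lr(initial_lr: float, step: int, max_steps: int) -> float:
        """Constant schedule: the learning rate never changes."""
        return initial_lr
    ```

    In the trainer config YAML, the schedule is selected with the `learning_rate_schedule` hyperparameter (`constant` or `linear`).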

    What you see in that figure is just TensorBoard rendering a constant value badly. Note that the y-axis shows the same value throughout.
     
  4. Evercloud

    Evercloud

    Joined:
    Apr 29, 2013
    Posts:
    15
    Thank you for the explanation, crystal clear!