
Obstacle Avoidance - Observations

Discussion in 'ML-Agents' started by dani_kal, Jun 12, 2020.

  1. dani_kal

    dani_kal

    Joined:
    Mar 25, 2020
    Posts:
    52
    Hello!!!
    I would like to ask: when we want to avoid a static obstacle in the scene, what do we have to include in the CollectObservations() function?

    My aim is to make an agent navigate the scene while avoiding obstacles.
    I have trained my agent to navigate from one initial point to another without obstacles, using these observations:

    Code (CSharp):
    Vector3 dirToTarget = (goal - myobject.transform.position).normalized;
    AddVectorObs(_objects[i].transform.InverseTransformDirection(dirToTarget));
    AddVectorObs(_objects[i].transform.InverseTransformPoint(goal));
    AddVectorObs(_objects[i].transform.InverseTransformPoint(myobject.transform.position));
    When there are static obstacles, I have trouble training the agent successfully.
    Which observations do I have to include?
    I have tried raycasts, but with no success.
    I have this function for when a collision occurs.

    Code (CSharp):
    void OnCollisionEnter(Collision other)
    {
        if (other.collider.CompareTag("Pedestrian"))
        {
            AddReward(-1f);
            Debug.Log("collision between agent-obstacle");
            Done();
        }
    }

    Thank you in advance!!!
     
  2. Zephus

    Zephus

    Joined:
    May 25, 2015
    Posts:
    356
    First of all - you seem to be using an older version of ML-Agents? Is AddVectorObs even part of the API anymore?
    As for solving your problem, you might want to look into Ray Perception 3D. "I have tried the raycast but with no success" could have so many causes that it doesn't really tell us the approach itself wouldn't work.
     
  3. dani_kal

    dani_kal

    Joined:
    Mar 25, 2020
    Posts:
    52
    Hello, and thank you for your answer!
    Yes, I am using ML-Agents 0.10.
    I have added the observations for the obstacle in the CollectObservations() function.
    Code (CSharp):
    const float rayDistance = 1f;
    string[] detectableObjects = { "obstacle" };
    float[] rayAngles = { 20f, 90f, 160f, 45f, 135f, 70f, 110f };
    AddVectorObs(m_RayPer.Perceive(rayDistance, rayAngles, detectableObjects, 0f, 0f));

    I thought that to train my agent to avoid the obstacles, I had to give it these observations, plus the ones I mentioned in the first comment, and a reward of -1 when it collides with an obstacle.
    But unfortunately that has not worked.
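    One thing worth double-checking in the snippet above: rayDistance is only 1 unit, so the rays report an obstacle only when the agent is almost touching it, which may be too late to maneuver. A sketch of the same 0.10-style call with a longer reach (the 20f value is an assumption, not from the thread):

    Code (CSharp):
    const float rayDistance = 20f;  // long enough to detect obstacles early
    string[] detectableObjects = { "obstacle" };
    float[] rayAngles = { 20f, 45f, 70f, 90f, 110f, 135f, 160f };
    AddVectorObs(m_RayPer.Perceive(rayDistance, rayAngles, detectableObjects, 0f, 0f));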
     
  4. Zephus

    Zephus

    Joined:
    May 25, 2015
    Posts:
    356
    Just update to the newest version. I don't know why you're still using one from October 2019. There's an entire component dedicated to what you're trying to do, without having to write any code. I'm pretty sure your rays aren't doing exactly what you want.
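    For reference, a minimal sketch of the component being referred to here, assuming a recent ML-Agents release (1.0+). The class and property names come from the Unity.MLAgents.Sensors namespace; the numeric values and tag names are illustrative assumptions, and the same settings can be made entirely in the Inspector with no code:

    Code (CSharp):
    using System.Collections.Generic;
    using Unity.MLAgents.Sensors;
    using UnityEngine;

    // Attach this to the same GameObject as the Agent.
    public class ObstacleSensorSetup : MonoBehaviour
    {
        void Awake()
        {
            var sensor = gameObject.AddComponent<RayPerceptionSensorComponent3D>();
            sensor.RaysPerDirection = 3;   // 3 left + 3 right + 1 center = 7 rays
            sensor.MaxRayDegrees = 70f;    // fan out to +/-70 degrees from forward
            sensor.RayLength = 20f;        // see obstacles well before contact
            sensor.DetectableTags = new List<string> { "obstacle", "goal" };
            // Observations from this sensor are collected automatically each
            // step; nothing extra is needed in CollectObservations().
        }
    }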
     
  5. dani_kal

    dani_kal

    Joined:
    Mar 25, 2020
    Posts:
    52
    Ok!! I will try it!!!
    Thank you!!!
     
  6. andrzej_

    andrzej_

    Joined:
    Dec 2, 2016
    Posts:
    81
    Also, unless the collision is an episode-ending event, setting the negative reward to -1 is a pretty harsh punishment for the agent if you're only trying to teach it to navigate efficiently. From what I remember, a large negative cumulative reward makes it harder to train.
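    A sketch of what that might look like, adapting the collision handler from the first post (still the 0.10 API used in this thread; the -0.05 value is an illustrative assumption, not a recommendation from the post):

    Code (CSharp):
    void OnCollisionEnter(Collision other)
    {
        if (other.collider.CompareTag("Pedestrian"))
        {
            // Mild penalty only; omitting Done() lets the episode continue,
            // so the agent can still learn to recover and reach the goal.
            AddReward(-0.05f);
            Debug.Log("collision between agent-obstacle");
        }
    }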
     
  7. dani_kal

    dani_kal

    Joined:
    Mar 25, 2020
    Posts:
    52
    Unfortunately, I have not been able to properly train my agent to avoid obstacles. I just want it to make a maneuver and continue on its way to the goal.
    I have tried punishing it with reward = -1 and resetting when it hits the obstacle.
    I have also tried giving a smaller punishment, like -0.1 or -0.25, but with no success. I don't know what else to do..
    If someone could help me, I would appreciate it!
     
  8. TreewoodStudios

    TreewoodStudios

    Joined:
    Feb 17, 2014
    Posts:
    1
    I am having the same issue. I am using the most recent version with a combination of observations, including where the agent should be going, plus the 3D ray sensors, but the agent always tries to go through the obstacle instead of around it.