
Question Adding actions during training

Discussion in 'ML-Agents' started by smhehill, Feb 18, 2022.

  1. smhehill


    Joined: Jan 8, 2022
    Hello everyone,

    I'm currently working on a project where we are trying to control a robotic arm (4 DOF) with reinforcement learning.

    The environment consists of the robot (the agent) and some colored objects. The main goal is to teach it to reach for the object of the right color, pick it up via a pump/suction, and deliver it to the right goal position. For now, I would like to teach it to reach a random target and manipulate it (pick it up and drop it at a certain position).

    To collect observations, I set up several RayPerception sensors at the base and two at the pump. The sensors can detect the target and the platform (to prevent the arm from crashing into it). My target is placed randomly within a certain range, which increases along with the curriculum.
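    A curriculum-driven spawn range like this is usually expressed through environment parameters in the trainer configuration YAML. A minimal sketch, assuming a behavior named `RobotArm` and a hypothetical `target_range` parameter (both names are placeholders, not from the original post):

    ```yaml
    environment_parameters:
      # Hypothetical parameter; read in the Agent's C# code via
      # Academy.Instance.EnvironmentParameters.GetWithDefault("target_range", 0.5f)
      target_range:
        curriculum:
          - name: CloseTargets
            completion_criteria:
              measure: reward
              behavior: RobotArm       # assumed behavior name
              threshold: 0.8
              min_lesson_length: 100
              signal_smoothing: true
            value: 0.5
          - name: FarTargets
            value: 1.5
    ```

    Each lesson advances once the completion criteria are met, widening the spawn range without touching the model.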

    Now I would like it to first learn to reach the target, and once it can do so reliably, allow it to try to pick the target up. So I would need to add an action (suction on/off) at a certain curriculum lesson during training. But I don't think this is even possible, because my model's action space would change significantly at that point.
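    One common way around this is to declare the full action space (including suction) from the start and mask the suction action until the curriculum unlocks it, so the network shape never changes. A minimal sketch using ML-Agents' discrete action masking (release-19-era API); the branch layout and the `suction_enabled` environment parameter are assumptions, not from the original post:

    ```csharp
    using Unity.MLAgents;
    using Unity.MLAgents.Actuators;

    public class ArmAgent : Agent
    {
        // Assumed layout: branch 4 = suction, action 0 = off, action 1 = on
        const int SuctionBranch = 4;

        public override void WriteDiscreteActionMask(IDiscreteActionMask actionMask)
        {
            // Hypothetical curriculum parameter: stays 0 while only reaching is trained
            float suctionEnabled = Academy.Instance.EnvironmentParameters
                .GetWithDefault("suction_enabled", 0f);

            if (suctionEnabled < 0.5f)
            {
                // A masked action is never sampled, so the agent effectively
                // trains without suction while the model stays fixed.
                actionMask.SetActionEnabled(SuctionBranch, 1, false);
            }
        }
    }
    ```

    When the lesson that flips `suction_enabled` to 1 is reached, the mask is lifted and the agent can start exploring the pick-up action with the same trained policy.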

    Is there another way to do it? Has anyone done something similar, or can you give me some other suggestions?
  2. mbaske


    Joined: Dec 31, 2017
    smhehill and ChillX like this.