
Official Introducing ROS 2 Support and the AMR Demo

Discussion in 'Robotics' started by amanda-unity, Aug 12, 2021.

  1. amanda-unity (Unity Technologies)

    Joined: May 29, 2020 | Posts: 19

    [Attachment: goal_0.gif]

    We have some exciting updates to share!

    Our Robotics packages now provide support for ROS 2, showcased with our new Autonomous Mobile Robot (AMR) Example Project. In this example project, you'll see Unity used as the simulation environment for the robot's task of navigating while mapping (SLAM). The project features a fully articulated Turtlebot 3, simulated LIDAR, warehouse environment, and, of course, ROS 2 support, complete with a colcon workspace and Dockerfile to get you started quickly.

    How do I get started?
    You can get started with the AMR Example Project here! The guide covers configuring your development environment, setting up your Unity project, and running the example.

    For general ROS 2 usage, you can get started with our ROS–Unity Integration tutorials here, where you can find tutorials for setting up publishers, subscribers, services, and other networking details, updated with details for both ROS 1 and ROS 2 integration.

    Thanks for the support!
     
    Ivan_br likes this.
  2. Ivan_br

    Joined: Nov 10, 2015 | Posts: 29
    This is really great. Congratulations to the team on all of these updates and examples.

    Are there any plans to create an example of semiautonomous control with other forms of user input? For instance, the user specifies the robot's direction and speed via keyboard/joystick, and the system handles obstacle avoidance while trying to maintain the user's desired path.
     
    Last edited: Sep 14, 2021
  3. amanda-unity (Unity Technologies)
    Hi @Ivan_br, we don't have an example of the semiautonomous control you described, but you could implement this with the same APIs used in the AMR Example project. As a potential implementation, you could write a controller that sends short-ranged /goal_poses in the direction of keyboard/joystick input. When a wall or obstacle is detected, the goal pose can be modified to be parallel to the obstacle instead of directly in the direction of movement (similar to games that allow the character to walk at full speed along a wall rather than cancelling an axis of movement, or stopping altogether).
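The wall-slide goal computation described above can be sketched in a few lines. This is only an illustration of the geometry, not Unity or ROS API code: the function name, the fixed step size, and representing the detected obstacle by its unit normal are all my own assumptions.

```python
import math

def short_range_goal(x, y, heading, input_dir, obstacle_normal=None, step=0.5):
    """Compute a short-range goal pose from user input.

    (x, y, heading): current robot pose; input_dir: desired direction of
    travel in radians (from keyboard/joystick); obstacle_normal: unit
    normal of a detected wall, or None if the path ahead is clear.
    Returns (goal_x, goal_y, goal_heading).
    """
    dx, dy = math.cos(input_dir), math.sin(input_dir)
    if obstacle_normal is not None:
        # Project the desired motion onto the wall plane so the robot
        # "slides" along the obstacle instead of driving into it.
        nx, ny = obstacle_normal
        dot = dx * nx + dy * ny
        dx, dy = dx - dot * nx, dy - dot * ny
        norm = math.hypot(dx, dy)
        if norm < 1e-6:
            # Input points straight into the wall: hold position.
            return x, y, heading
        dx, dy = dx / norm, dy / norm
    return x + step * dx, y + step * dy, math.atan2(dy, dx)
```

A controller loop would call this each tick and publish the result as the next short-range goal pose for the navigation stack to execute.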
     
    Ivan_br likes this.
  4. wechat_os_Qy0_cxJ8yR7FqpemIamz4YH08

    Joined: Sep 8, 2021 | Posts: 4
    Hey Amanda! You're doing a great job on the simulation side. I have a problem, though: I want to transfer the SLAM map in RViz to Unity, but I don't know how to do it. Is there a package like UnityRviz that can show the RViz map in Unity in real time? I'd really appreciate a reply!
     
  5. amanda-unity (Unity Technologies)
    @wechat_os_Qy0_cxJ8yR7FqpemIamz4YH08, not yet, but it's on our roadmap! This feature will be coming out soon. We'll make an announcement here on the forum when it's available.
     
  6. wechat_os_Qy0_cxJ8yR7FqpemIamz4YH08
    @amanda-unity Thanks for replying! Looking forward to seeing your work!
     
  7. wechat_os_Qy0_cxJ8yR7FqpemIamz4YH08
    And I have another question: how long will it take to publish this UnityRviz package? Do you have a timeline?
     
  8. Robotawi

    Joined: Sep 9, 2021 | Posts: 2
    Thank you @amanda-unity

    I have some questions. An essential part of simulation is sensors. I have been using Gazebo for a long time, and it simulates sensors via plugins. I want to know how Unity simulates sensors. Is it like Gazebo? Can I use the Gazebo sensors included in a robot's URDF model in Unity?

    I have a mobile robot equipped with many types of sensors (camera, range sensors, and encoders) and I want to migrate from Gazebo to Unity. What are the possibilities and limitations at the current stage?

    Feel free to recommend something to read.

    Thank you very much!
     
  9. amanda-unity (Unity Technologies)
    @wechat_os_Qy0_cxJ8yR7FqpemIamz4YH08, unfortunately, we’re unable to comment on the timeline, but we’ll make sure to keep you all posted on it.

    @Robotawi, while I can't comment on exact timelines for sensors either, I can tell you that this is also on our roadmap! Please watch for updates on this forum. If you’d like to sign up to be an early user of our products (robotics, AI/ML, CV), you can sign up here!
     
  10. Robotawi
    Thanks, @amanda-unity
    If the ongoing sensor work is open source, I would love to contribute to it.
     
  11. wechat_os_Qy0_cxJ8yR7FqpemIamz4YH08
  12. Ivan_br
    Hi @amanda-unity, sorry for the late reply, and thank you for the information. Do you think it would be better to send goal_poses or velocity (speed and direction) information instead? I assume goal_poses are more useful in this case because each goal_pose can be checked against possible upcoming collisions?

    @Robotawi, I'm not sure if this is what you are looking for, but the sensors are being developed in Unity's URDF-Importer GitHub repository: https://github.com/Unity-Technologies/URDF-Importer

    If you look at the project's branches, you can see information about how the sensors are being developed. This might be a good place to start, as it is open source.

    I'm not sure, though, whether there is a public roadmap of what still needs to be developed, so you may have to ask that team what to help with, perhaps by posting on the project's issues page.

    There are also these two Unity documentation pages on sensors, but I'm not sure whether this part is open source and open to contributions from outside Unity.

    SensorSDK: https://docs.unity3d.com/Packages/com.unity.sensorsdk@1.0/manual/index.html
    SystemGraph: https://docs.unity3d.com/Packages/com.unity.systemgraph@1.0/manual/index.html
     
  13. amanda-unity (Unity Technologies)
    Hey @Ivan_br, yes, that sounds reasonable: sending goal poses lets you check them against collisions rather than potentially doing those calculations yourself from velocity messages. Let us know how it goes if you try this out!
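For anyone trying this, a minimal sketch of that check: before publishing a candidate goal pose, look up its cell in the occupancy grid produced by SLAM. The nav_msgs/OccupancyGrid conventions used here (row-major data, values 0-100, -1 for unknown) are standard ROS, but the helper itself is illustrative and not part of the AMR project.

```python
import math

def goal_is_free(grid, resolution, origin, goal_xy, occupied_threshold=50):
    """Return True if the cell containing goal_xy is known to be free.

    grid: 2D list of occupancy values (0-100, -1 = unknown), row-major,
    as in a nav_msgs/OccupancyGrid; resolution: metres per cell;
    origin: (x, y) world coordinates of the grid's lower-left corner.
    """
    # floor() (not int()) so coordinates just below the origin map to
    # negative indices and are rejected, instead of wrapping to cell 0.
    col = math.floor((goal_xy[0] - origin[0]) / resolution)
    row = math.floor((goal_xy[1] - origin[1]) / resolution)
    if not (0 <= row < len(grid) and 0 <= col < len(grid[0])):
        return False  # outside the mapped area
    value = grid[row][col]
    # Unknown cells (-1) and cells at or above the threshold are rejected.
    return 0 <= value < occupied_threshold
```

Only if the check passes would the controller forward the pose as the next goal; otherwise it can fall back to the wall-slide adjustment discussed earlier in the thread.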
     
    Ivan_br likes this.
  14. Ivan_br
    Hey @amanda-unity, sounds great; that's what I thought. If I have some time to try this out, I'll let you know.