Question Incoming ROS messages seem to get buffered?

Discussion in 'Robotics' started by Haastregt, Jan 11, 2024.

Haastregt


    Feb 25, 2019
    I have a ROS node that is publishing images at a steady rate of ~30Hz (verified with $ rostopic hz [topic]). I then take these images and convert them to a texture that is displayed on a canvas in front of a camera (See attached picture).

    Code (CSharp):

    public class SpectatorCam : MonoBehaviour
    {
        [SerializeField] int imageWidth = 200;
        [SerializeField] int imageHeight = 200;

        private Texture2D texRos;
        public string image_topic_name = "/isaac_ros_camera";
        public RawImage display;

        private ImageMsg img;

        IEnumerator Start()
        {
            texRos = new Texture2D(imageWidth, imageHeight, TextureFormat.RGB24, false);
            display.texture = texRos;
            display.SetNativeSize();

            yield return new WaitUntil(() => GameManager.Instance.ros != null);
            GameManager.Instance.ros.Subscribe<ImageMsg>(image_topic_name, ImageCallback);
        }

        void OnDestroy()
        {
            GameManager.Instance.ros.Unsubscribe(image_topic_name);
        }

        void ImageCallback(ImageMsg msg)
        {
            texRos.LoadRawTextureData(msg.data);
            texRos.Apply();
        }
    }
    In the editor, this works fine: I see a fluid video stream. It also works in a PC build.

    However, I am trying to get this to work on a VR headset. When I run it in VR, the canvas updates at a low rate (it feels like around 10 fps), and the video stream drifts further out of sync as time progresses. If something in the video changes at, say, 3 seconds, the display only updates at e.g. 9 seconds. And when I stop broadcasting image messages after 10 seconds, I still receive another 20 seconds of images in VR before it stops. It is as if the whole video stream is played back in slow motion.

    This is highly confusing for the following reasons:
    - The queue size of the topic is 1, so no messages should be getting buffered and played back at a slower rate. If the hardware simply couldn't keep up, I would expect images to be skipped, not delayed.
    - The frame rate in VR is still a steady 70 fps; it is really just the images on the canvas that update slowly. However, as I understand it, messages are handled in ROSTCPConnector.Update(), so if that were slow I should have noticed a drop in frame rate, right?
    - I timed ImageCallback() and it takes 0-1 ms in the VR build, so the texture update itself should easily be fast enough (the images are only 200x200 pixels anyway).
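
    One pattern worth trying (a hedged sketch, not a verified fix): decouple message receipt from the texture upload by storing only the newest message in the callback and applying it once per frame in Update(). If messages are arriving faster than the render loop consumes them, this makes the behavior latest-wins instead of play-everything-in-order. The GameManager.Instance.ros and ImageMsg names follow the snippet above; the RosMessageTypes.Sensor namespace and msg.data field are assumptions about the setup.

    Code (CSharp):

    using System.Collections;
    using UnityEngine;
    using UnityEngine.UI;
    using RosMessageTypes.Sensor; // ImageMsg (assumed namespace)

    public class SpectatorCamLatest : MonoBehaviour
    {
        [SerializeField] int imageWidth = 200;
        [SerializeField] int imageHeight = 200;
        public string image_topic_name = "/isaac_ros_camera";
        public RawImage display;

        private Texture2D texRos;
        private ImageMsg latest;                    // newest message, overwritten on every callback
        private readonly object gate = new object();

        IEnumerator Start()
        {
            texRos = new Texture2D(imageWidth, imageHeight, TextureFormat.RGB24, false);
            display.texture = texRos;
            display.SetNativeSize();

            yield return new WaitUntil(() => GameManager.Instance.ros != null);
            GameManager.Instance.ros.Subscribe<ImageMsg>(image_topic_name, msg =>
            {
                lock (gate) { latest = msg; }       // keep only the most recent frame
            });
        }

        void Update()
        {
            ImageMsg msg;
            lock (gate) { msg = latest; latest = null; }
            if (msg == null) return;                // nothing new this frame

            texRos.LoadRawTextureData(msg.data);
            texRos.Apply();
        }
    }

    If the slow-motion symptom persists even with this pattern, the backlog would have to be upstream of the callback (in the TCP connector or the OS socket buffer) rather than in the rendering path.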

    Are there any suggestions as to what could be causing these buffered images? Or methods to better debug what is going on in the TCP connector in a build (since it works fine in the editor)?
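
    One debugging option (a sketch under the same assumptions as the code above, with a hypothetical helper class): measure how often ImageCallback actually fires in the build and compare it against rostopic hz. If the publisher sends at ~30 Hz but the callback rate in the VR build is lower, and callbacks keep arriving after the publisher stops, that would confirm messages are queuing upstream of the callback.

    Code (CSharp):

    using UnityEngine;

    // Hypothetical probe: call Tick() at the top of ImageCallback()
    // and watch the logged rate in the build's player log.
    public class CallbackRateProbe
    {
        private int count;
        private float windowStart;

        public void Tick()
        {
            count++;
            float now = Time.realtimeSinceStartup;
            if (now - windowStart >= 1f)
            {
                Debug.Log($"ImageCallback rate: {count / (now - windowStart):F1} Hz");
                count = 0;
                windowStart = now;
            }
        }
    }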