ML Example Scene: RawVideoCapture

Discussion in 'AR' started by smartglasses, Feb 19, 2020.

  1. smartglasses

    Joined:
    Feb 19, 2020
    Posts:
    1
    Hey guys,

    I was wondering if somebody would be able to help me out with an issue I've been having. I've been trying to use one of the included Magic Leap example scenes to learn how to capture and output video in real time on my headset. The scene called RawVideoCapture seems to do just that; however, the video comes out in greyscale. I determined that this was because a shader converts the red-only image to greyscale, but I've disabled that for now since I want colour video. After some research, I found that in the script RawVideoCaptureVisualizer.cs, in the method OnRawCaptureDataReceived, YUV camera data is loaded into a texture and output to the display. It seems that only the Y plane of the YUV data is being loaded, and the texture format is R8, which explains why the raw image appears red.
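
    Just to spell out the pattern I'm describing, this is roughly what that single-plane upload looks like. The class and parameter names here are only placeholders, not the actual Magic Leap callback signature:

    Code (CSharp):
        using UnityEngine;

        // Minimal sketch of the pattern described above: only the Y plane is
        // uploaded, into a single-channel R8 texture, so only luminance ever
        // reaches the material. Names are illustrative, not the real API.
        public class LuminanceOnlyPreview : MonoBehaviour
        {
            private Texture2D _texture;

            // Hypothetical callback: yPlane is the raw Y buffer from the camera.
            public void OnYPlaneReceived(byte[] yPlane, int width, int height)
            {
                if (_texture == null || _texture.width != width || _texture.height != height)
                {
                    // R8 = one byte per pixel, which is why the colour information is lost.
                    _texture = new Texture2D(width, height, TextureFormat.R8, false);
                }

                _texture.LoadRawTextureData(yPlane);
                _texture.Apply();
            }
        }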

    This brings me to my question: does anyone know how to convert the three separate YUV buffers (the Y, U, and V planes) into RGB data that I can load as a single texture?
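
    For reference, this is the kind of conversion I mean. It's a minimal CPU-side sketch that assumes YUV420 planar data (U and V at half the resolution of Y), no row padding/stride, and full-range BT.601 coefficients; on device it would presumably make more sense to do this in a shader, but the maths is the same:

    Code (CSharp):
        using UnityEngine;

        // Rough sketch: combine separate Y, U and V planes into one RGBA32 texture.
        // Assumes YUV420 planar layout with tightly packed rows.
        public static class Yuv420ToRgb
        {
            public static Texture2D Convert(byte[] y, byte[] u, byte[] v, int width, int height)
            {
                var texture = new Texture2D(width, height, TextureFormat.RGBA32, false);
                var pixels = new Color32[width * height];

                for (int row = 0; row < height; row++)
                {
                    for (int col = 0; col < width; col++)
                    {
                        int yIndex = row * width + col;
                        // U and V are subsampled 2x2, so one chroma sample covers four Y samples.
                        int uvIndex = (row / 2) * (width / 2) + (col / 2);

                        float yf = y[yIndex];
                        float uf = u[uvIndex] - 128f;
                        float vf = v[uvIndex] - 128f;

                        // Full-range BT.601 YUV -> RGB conversion.
                        byte r = (byte)Mathf.Clamp(yf + 1.402f * vf, 0f, 255f);
                        byte g = (byte)Mathf.Clamp(yf - 0.344f * uf - 0.714f * vf, 0f, 255f);
                        byte b = (byte)Mathf.Clamp(yf + 1.772f * uf, 0f, 255f);

                        pixels[yIndex] = new Color32(r, g, b, 255);
                    }
                }

                texture.SetPixels32(pixels);
                texture.Apply();
                return texture;
            }
        }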

    If you are able to provide any help on this I'd really appreciate it!

    Thanks,
    -Tyler
     

    Attached Files:
