CameraStream: Get locatable camera video frames in Unity

Discussion in 'VR' started by erics_vulcan, Apr 12, 2017.

  1. erics_vulcan

    erics_vulcan

    Joined:
    Apr 2, 2016
    Posts:
    4
    Our team wanted access to the HoloLens video camera so that we could grab locatable frame data from the camera in real time. We found that, out of the box, Unity only supports saving videos directly to disk through its WSA.WebCam.VideoCapture class.



    I'm sure this functionality will be coming to Unity sometime soon, but until then we created a Unity plugin that will stream locatable video frames to memory as they arrive. We modeled the API on Unity's VideoCapture class so that developers would already be familiar with how to use it.

    CameraStream for HoloLens and Unity

    We just put this up, so we'd love feedback and support in building out some of the missing features! You can get the .unitypackage from the Releases page in the link above.

    We hope this is helpful to some folks.
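    For anyone skimming the thread, here is a minimal sketch of how the plugin is used, pieced together from the example code later in this thread. Treat it as a sketch of the API, not a definitive reference:

    Code (CSharp):
```csharp
using UnityEngine;
using HoloLensCameraStream;

// Minimal sketch: start the locatable camera and receive raw frames in memory.
// CameraStreamHelper and VideoCapture are the plugin types used in the
// example further down this thread.
public class MinimalCameraStream : MonoBehaviour
{
    VideoCapture _videoCapture;
    byte[] _frameBuffer;

    void Start()
    {
        // the helper hands back a VideoCapture instance asynchronously
        CameraStreamHelper.Instance.GetVideoCaptureAsync(OnVideoCaptureCreated);
    }

    void OnVideoCaptureCreated(VideoCapture videoCapture)
    {
        if (videoCapture == null) return; // e.g. running without a camera

        _videoCapture = videoCapture;
        _videoCapture.FrameSampleAcquired += OnFrameSampleAcquired;

        var resolution = CameraStreamHelper.Instance.GetLowestResolution();
        var cameraParams = new CameraParameters
        {
            cameraResolutionWidth = resolution.width,
            cameraResolutionHeight = resolution.height,
            frameRate = Mathf.RoundToInt(
                CameraStreamHelper.Instance.GetHighestFrameRate(resolution)),
            pixelFormat = CapturePixelFormat.BGRA32
        };
        _videoCapture.StartVideoModeAsync(cameraParams,
            result => Debug.Log("video mode started: " + result.success));
    }

    void OnFrameSampleAcquired(VideoCaptureSample sample)
    {
        // reuse one buffer; copy the frame out, then release the sample
        if (_frameBuffer == null || _frameBuffer.Length < sample.dataLength)
            _frameBuffer = new byte[sample.dataLength];
        sample.CopyRawImageDataIntoBuffer(_frameBuffer);
        sample.Dispose();
        // _frameBuffer now holds the latest BGRA32 frame
    }
}
```
    The full example below additionally fetches the camera-to-world and projection matrices per frame; this sketch only shows the start-and-receive flow.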
     
    razielblood, sgandhe and Syntex91 like this.
  2. Syntex91

    Syntex91

    Joined:
    Sep 13, 2017
    Posts:
    3
    Hi,
    thanks for sharing the plugin!
    For my current project (where I need access to raw frame data in video mode) I want to start and stop the video mode using Air-Tap. Starting and stopping it the first time works fine; restarting it without restarting the whole application does not.
    This is the code I'm trying to get to work (it's basically code from here and here, and it just shows the user what the camera sees):

    Code (CSharp):
    1. using UnityEngine;
    2. using UnityEngine.VR.WSA;
    3. using UnityEngine.VR.WSA.Input;
    4.  
    5. using System;
    6. using System.Collections;
    7.  
    8. using HoloLensCameraStream;
    9.  
    10.  
    11. public class VideoModeExample : MonoBehaviour
    12. {
    13.     byte[] _latestImageBytes;
    14.     HoloLensCameraStream.Resolution _resolution;
    15.  
    16.     GameObject _videoPanelUI;
    17.     Renderer _videoPanelUIRenderer;
    18.     Texture2D _videoTexture;
    19.     VideoCapture _videoCapture;
    20.  
    21.     private bool stopVideo;
    22.     private GestureRecognizer _gestureRecognizer;
    23.  
    24.     IntPtr _spatialCoordinateSystemPtr;
    25.  
    26.  
    27.     // struct to store frame related data
    28.     private class SampleStruct
    29.     {
    30.         public float[] cameraToWorldMatrixAsFloat;
    31.         public float[] projectionMatrixAsFloat;
    32.         public byte[] data;
    33.     }
    34.  
    35.     void Awake()
    36.     {
    37.         // create and set the gesture recognizer
    38.         _gestureRecognizer = new GestureRecognizer();
    39.         _gestureRecognizer.TappedEvent += (source, tapCount, headRay) => { Debug.Log("Tapped");   StartCoroutine(StopVideoMode()); };
    40.         _gestureRecognizer.SetRecognizableGestures(GestureSettings.Tap);
    41.         _gestureRecognizer.StartCapturingGestures();
    42.     }
    43.  
    44.  
    45.     void Start()
    46.     {
    47.         // fetch a pointer to Unity's spatial coordinate system if you need pixel mapping
    48.         _spatialCoordinateSystemPtr = WorldManager.GetNativeISpatialCoordinateSystemPtr();
    49.  
    50.         // call this in Start() to ensure that the CameraStreamHelper is already "Awake".
    51.         CameraStreamHelper.Instance.GetVideoCaptureAsync(OnVideoCaptureCreated);
    52.  
    53.  
    54.         _videoPanelUI = GameObject.CreatePrimitive(PrimitiveType.Quad);
    55.         _videoPanelUI.name = "VideoPanelUI";
    56.         _videoPanelUIRenderer = _videoPanelUI.GetComponent<Renderer>() as Renderer;
    57.         _videoPanelUIRenderer.material = new Material(Shader.Find("AR/HolographicImageBlend"));
    58.  
    59.     }
    60.  
    61.     // coroutine to toggle the video on/off
    62.     private IEnumerator StopVideoMode()
    63.     {
    64.         yield return new WaitForSeconds(0.65f);
    65.         stopVideo = !stopVideo;
    66.  
    67.         if (!stopVideo)
    68.         {
    69.             OnVideoCaptureCreated(_videoCapture);            
    70.         }
    71.     }
    72.  
    73.     private void OnDestroy()
    74.     {
    75.         if (_videoCapture != null)
    76.         {
    77.             _videoCapture.FrameSampleAcquired -= OnFrameSampleAcquired;
    78.             _videoCapture.Dispose();
    79.         }
    80.     }
    81.  
    82.     void OnVideoCaptureCreated(VideoCapture videoCapture)
    83.     {
    84.         if (videoCapture == null)
    85.         {
    86.             Debug.LogError("Did not find a video capture object. You may not be using the HoloLens.");
    87.             return;
    88.         }
    89.  
    90.         _videoCapture = videoCapture;
    91.  
    92.         // pass Unity's spatial coordinate system pointer to the plugin so frames can be located in the world
    93.         CameraStreamHelper.Instance.SetNativeISpatialCoordinateSystemPtr(_spatialCoordinateSystemPtr);
    94.  
    95.         _resolution = CameraStreamHelper.Instance.GetLowestResolution();
    96.  
    97.         float frameRate = CameraStreamHelper.Instance.GetHighestFrameRate(_resolution);
    98.  
    99.         videoCapture.FrameSampleAcquired += OnFrameSampleAcquired;
    100.  
    101.         // camera parameters
    102.         CameraParameters cameraParams = new CameraParameters();
    103.         cameraParams.cameraResolutionHeight = _resolution.height;
    104.         cameraParams.cameraResolutionWidth = _resolution.width;
    105.         cameraParams.frameRate = Mathf.RoundToInt(frameRate);
    106.         cameraParams.pixelFormat = CapturePixelFormat.BGRA32;
    107.         cameraParams.rotateImage180Degrees = false;
    108.         cameraParams.enableHolograms = false;
    109.  
    110.  
    111.         UnityEngine.WSA.Application.InvokeOnAppThread(() => { _videoTexture = new Texture2D(_resolution.width, _resolution.height, TextureFormat.BGRA32, false); }, false);
    112.         videoCapture.StartVideoModeAsync(cameraParams, OnVideoModeStarted);
    113.     }
    114.  
    115.     void OnVideoModeStarted(VideoCaptureResult result)
    116.     {
    117.         if (result.success == false)
    118.         {
    119.             Debug.LogWarning("Could not start video mode.");
    120.             return;
    121.         }
    122.  
    123.         Debug.Log("VideoMode started.");
    124.     }
    125.  
    126.     void OnFrameSampleAcquired(VideoCaptureSample sample)
    127.     {
    128.         SampleStruct s = new SampleStruct();
    129.  
    130.         // allocate byteBuffer
    131.         if (_latestImageBytes == null || _latestImageBytes.Length < sample.dataLength)
    132.         {
    133.             _latestImageBytes = new byte[sample.dataLength];
    134.         }
    135.  
    136.         // fill frame struct
    137.         sample.CopyRawImageDataIntoBuffer(_latestImageBytes);
    138.         s.data = _latestImageBytes;
    139.  
    140.  
    141.         // get the cameraToWorldMatrix
    142.         if (sample.TryGetCameraToWorldMatrix(out s.cameraToWorldMatrixAsFloat) == false)
    143.         {
    144.             return;
    145.         }
    146.  
    147.         // get the projectionMatrix
    148.         if (sample.TryGetProjectionMatrix(out s.projectionMatrixAsFloat) == false)
    149.         {
    150.             return;
    151.         }
    152.  
    153.         sample.Dispose();
    154.  
    155.         // right now we pass things across the pipe as a float array then convert them back into UnityEngine.Matrix using a utility method
    156.         Matrix4x4 cameraToWorldMatrix = LocatableCameraUtils.ConvertFloatArrayToMatrix4x4 (s.cameraToWorldMatrixAsFloat);
    157.         Matrix4x4 projectionMatrix = LocatableCameraUtils.ConvertFloatArrayToMatrix4x4(s.projectionMatrixAsFloat);
    158.  
    159.         // this is where we actually use the image data
    160.         UnityEngine.WSA.Application.InvokeOnAppThread(() =>
    161.         {
    162.             _videoTexture.LoadRawTextureData(_latestImageBytes);
    163.             _videoTexture.wrapMode = TextureWrapMode.Clamp;
    164.             _videoTexture.Apply();
    165.  
    166.             _videoPanelUIRenderer.sharedMaterial.SetTexture("_MainTex", _videoTexture);
    167.             _videoPanelUIRenderer.sharedMaterial.SetMatrix("_WorldToCameraMatrix", cameraToWorldMatrix.inverse);
    168.             _videoPanelUIRenderer.sharedMaterial.SetMatrix("_CameraProjectionMatrix", projectionMatrix);
    169.             _videoPanelUIRenderer.sharedMaterial.SetFloat("_VignetteScale", 1.3f);
    170.  
    171.  
    172.             Vector3 inverseNormal = -cameraToWorldMatrix.GetColumn(2);
    173.             // position the canvas object slightly in front of the real world web camera.
    174.             Vector3 imagePosition = cameraToWorldMatrix.GetColumn(3) - cameraToWorldMatrix.GetColumn(2);
    175.  
    176.             _videoPanelUI.gameObject.transform.position = imagePosition;
    177.             _videoPanelUI.gameObject.transform.rotation = Quaternion.LookRotation(inverseNormal, cameraToWorldMatrix.GetColumn(1));
    178.  
    179.         }, false);
    180.  
    181.  
    182.         if (stopVideo)
    183.         {
    184.             _videoCapture.StopVideoModeAsync(onVideoModeStopped);
    185.         }
    186.  
    187.     }
    188.  
    189.     private void onVideoModeStopped(VideoCaptureResult result)
    190.     {
    191.         Debug.Log("VideoMode stopped!");
    192.     }
    193. }
    194.  



    After the video mode stops and I use another Air-Tap to start it again, I get the following error:
    You're integrating from APP thread, call item directly instead.
    when line 111
    Code (CSharp):
    1. UnityEngine.WSA.Application.InvokeOnAppThread(() => { _videoTexture = new Texture2D(_resolution.width, _resolution.height, TextureFormat.BGRA32, false); }, false);
    is executed again, and the app crashes.

    Any idea how to make the start/stop toggling work?
    Thanks in advance!
     
    Last edited: Dec 11, 2017
  3. labellson

    labellson

    Joined:
    Oct 24, 2017
    Posts:
    1
    Hi, I executed your VideoModeExample without getting the error message. Maybe the problem is the Unity version?
    Anyway, I'm not that familiar with Unity threads, but replacing line 111 with the snippet below should work.
    Code (CSharp):
    1. if (_videoTexture == null)
    2. {
    3.    UnityEngine.WSA.Application.InvokeOnAppThread(() => { _videoTexture = new Texture2D(_resolution.width, _resolution.height, TextureFormat.BGRA32, false); }, false);
    4. }
    This is because, the first time through the workflow, OnVideoCaptureCreated is invoked outside the app thread. That's why you need to call

    Code (CSharp):
    1. UnityEngine.WSA.Application.InvokeOnAppThread(() => { _videoTexture = new Texture2D(_resolution.width, _resolution.height, TextureFormat.BGRA32, false); }, false);
    When you Air-Tap for the first time, the camera stops. Again, this happens outside the app thread.
    Code (CSharp):
    1. if (stopVideo)
    2. {
    3.     _videoCapture.StopVideoModeAsync(onVideoModeStopped);
    4. }
    The second time you Air-Tap, to restart the camera, the StopVideoMode coroutine executes and calls line 69, starting the stream again. I think this time OnVideoCaptureCreated is being invoked inside the app thread, and that's why you are seeing this error.

    So I think that replacing that line with the code above should work.
    Hope this helps you.
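    A more defensive variant of the same idea (a sketch — I believe UnityEngine.WSA.Application also exposes RunningOnAppThread(), but please verify that in your Unity version): only marshal when the caller is actually off the app thread, so the texture creation is safe no matter which thread invokes OnVideoCaptureCreated.

    Code (CSharp):
```csharp
using UnityEngine;

// Sketch: wrap the marshalling so it works from either thread.
// RunningOnAppThread() reports whether the current thread is already the
// app thread; if so, run the callback directly instead of calling
// InvokeOnAppThread (which produces the error quoted above).
static class AppThreadHelper
{
    public static void RunOnAppThread(UnityEngine.WSA.AppCallbackItem item)
    {
        if (UnityEngine.WSA.Application.RunningOnAppThread())
            item();
        else
            UnityEngine.WSA.Application.InvokeOnAppThread(item, false);
    }
}

// usage, in place of line 111 of the example:
// AppThreadHelper.RunOnAppThread(() => {
//     _videoTexture = new Texture2D(_resolution.width, _resolution.height,
//                                   TextureFormat.BGRA32, false);
// });
```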
     
    Last edited: Dec 12, 2017
    Syntex91 likes this.
  4. Syntex91

    Syntex91

    Joined:
    Sep 13, 2017
    Posts:
    3
    Heya,

    thank you for your reply. I tried your snippet too, but that yields another exception at line 137
    Code (CSharp):
    1. sample.CopyRawImageDataIntoBuffer(_latestImageBytes);
    System.ObjectDisposedException: 'The object has been closed.'


    Then I tried another Unity version, 2017.1.0f3 (I usually use 5.6.2f1); same problem.

    However, I found the reason:
    it's because I deployed to the HoloLens in Debug instead of Release or Master. I know that is not how Microsoft describes it in their docs, but I hadn't even thought about it, since I had never experienced such a big difference between Debug and Release that it keeps a whole app from working. So far I had only seen performance loss in Debug compared to Release.

    Deployed in Release, everything works fine, independent of the Unity version (I tested both) or your code snippet:

    Code (CSharp):
    1. if (_videoTexture == null)
    2. {
    3.    UnityEngine.WSA.Application.InvokeOnAppThread(() => { _videoTexture = new Texture2D(_resolution.width, _resolution.height, TextureFormat.BGRA32, false); }, false);
    4. }
    FYI: building it in Unity 2017.1.0f3 and deploying in Debug to the HoloLens won't even let the app start properly; it instantly crashes right after line 111 has been called the first time. Doing the same with Unity 5.6.2f1 at least starts and stops the video mode once.
    I can't find an explanation for that...

    However, thank you for your help!
     
    Last edited: Dec 12, 2017
  5. Xyy_1209

    Xyy_1209

    Joined:
    Nov 12, 2017
    Posts:
    6
    When I play the scene in the Unity Editor, there is a problem like this:

    Assertion failed: Failed to initialize IMediaCapture (hr = 0xC00DABE0)

    Is there someone who can help me? Thanks!
     
  6. bdiazvr

    bdiazvr

    Joined:
    Dec 1, 2021
    Posts:
    1

    Unity can't access the HoloLens camera even if it is connected via USB or remoting, so basically it can't find a capture device.
    A workaround for this is to simply connect a webcam to the PC you are working on, and Unity will treat that as the HoloLens webcam.