
How to access raw RGB frames seen by ARCore in Unity

Discussion in 'AR' started by pradyumnp508, Aug 9, 2020.

  1. pradyumnp508

    pradyumnp508

    Joined:
    Aug 9, 2020
    Posts:
    15
    Disclaimer: I am new to Unity and only know a little about it. I have created a basic scene with an ARSession, an ARSession Origin, and an AR Camera tagged as MainCamera. I also have a Cube in the scene in front of the camera, as shown:

    Annotation 2020-08-09 162509.jpg

    Question: What I wish to do is simply get the live raw camera feed as seen by ARCore through my phone's camera and save it into a matrix that I can later use with OpenCV. Also, is it possible to apply that live feed to the Cube as a texture?

    I have the basic code and libraries as below:
    Code (CSharp):

    namespace OpenCvSharp
    {
        using System.Collections;
        using System.Collections.Generic;
        using UnityEngine;
        using OpenCvSharp;
        using UnityEngine.XR.ARFoundation;
        using UnityEngine.Experimental.XR;

        public class Sourcing_Camera_Feed : MonoBehaviour
        {
            static WebCamTexture Cam;
            private ARSessionOrigin arOrigin;
            private ARSession ARSession;

            // Start is called before the first frame update
            void Start()
            {
            }

            // Update is called once per frame
            void Update()
            {
                // Create the device camera texture once and show it on this object's material
                if (Cam == null)
                    Cam = new WebCamTexture();
                GetComponent<Renderer>().material.mainTexture = Cam;

                if (!Cam.isPlaying)
                    Cam.Play();
            }
        }
    }
    In the above program, I was able to get the live feed as a texture in Unity through my webcam, but when I run it with ARCore on my phone, I only see the first frame, like so:

    Screenshot_2020-08-09-16-38-37-350_com.Moving_Objects.Moving_Stuffs.jpg

    I want a live raw feed instead of just the first frame. Can anyone suggest some code?
    Thanks in advance.
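
    For contrast, here is a minimal sketch of the AR Foundation route: instead of WebCamTexture, subscribe to ARCameraManager.frameReceived, convert the CPU image to RGBA32, and assign the resulting Texture2D to the cube's material. This assumes AR Foundation 2.x/3.x (where the call is TryGetLatestImage and the type is XRCameraImage) and requires "Allow 'unsafe' Code" in Player Settings; the class and field names are illustrative, not from this thread.

    Code (CSharp):

    using System;
    using Unity.Collections;
    using Unity.Collections.LowLevel.Unsafe;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARSubsystems;

    // Illustrative sketch: show the ARCore camera feed on this object's material (e.g. the cube).
    public class ArFeedToCube : MonoBehaviour
    {
        public ARCameraManager cameraManager;   // assign the AR Camera's ARCameraManager in the Inspector
        Texture2D _texture;

        void OnEnable()  { cameraManager.frameReceived += OnFrame; }
        void OnDisable() { cameraManager.frameReceived -= OnFrame; }

        unsafe void OnFrame(ARCameraFrameEventArgs args)
        {
            // Grab the latest camera image on the CPU (AR Foundation 2.x/3.x API).
            if (!cameraManager.TryGetLatestImage(out XRCameraImage image))
                return;

            var conversionParams = new XRCameraImageConversionParams(
                image, TextureFormat.RGBA32, CameraImageTransformation.MirrorY);

            int size = image.GetConvertedDataSize(conversionParams);
            var buffer = new NativeArray<byte>(size, Allocator.Temp);

            // Convert the native YUV image to RGBA32, then release the native image.
            image.Convert(conversionParams, new IntPtr(buffer.GetUnsafePtr()), buffer.Length);
            image.Dispose();

            if (_texture == null)
                _texture = new Texture2D(conversionParams.outputDimensions.x,
                                         conversionParams.outputDimensions.y,
                                         TextureFormat.RGBA32, false);

            _texture.LoadRawTextureData(buffer);
            _texture.Apply();
            buffer.Dispose();

            // Put the live feed on this object's material.
            GetComponent<Renderer>().material.mainTexture = _texture;
        }
    }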
     
  2. rich2020

    rich2020

    Joined:
    Aug 31, 2013
    Posts:
    26
  3. pradyumnp508

    pradyumnp508

    Joined:
    Aug 9, 2020
    Posts:
    15
    Thank you so much. I'll try it out.
     
  4. pradyumnp508

    pradyumnp508

    Joined:
    Aug 9, 2020
    Posts:
    15
    I tried looking into this, but I am unable to implement it in my code. "Frame" shows up as an unknown namespace. I am using AR Foundation and ARCore 3.1.0 from Unity's Package Manager.

    Also, I tried looking at this documentation: https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@2.2/manual/cpu-camera-image.html

    But again I can't implement it, as I don't know the structure of the library.

    Can someone help me understand the implementation?

    Also, I am using Visual Studio to write C# scripts in Unity, not Android Studio.
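
    One note that may explain the error: Frame belongs to the separate Google ARCore SDK for Unity (the GoogleARCore namespace), which is not installed when ARCore is used through AR Foundation and the Package Manager. With the AR Foundation packages, the types from that docs page live in the namespaces shown in this minimal sketch (the component name is illustrative):

    Code (CSharp):

    using UnityEngine;
    using UnityEngine.XR.ARFoundation;   // ARCameraManager, ARCameraFrameEventArgs
    using UnityEngine.XR.ARSubsystems;   // XRCameraImage, XRCameraImageConversionParams

    public class CpuImageCheck : MonoBehaviour
    {
        public ARCameraManager cameraManager;

        void Update()
        {
            // Ask AR Foundation (not GoogleARCore.Frame) for the latest CPU-side camera image.
            if (cameraManager.TryGetLatestImage(out XRCameraImage image))
            {
                Debug.Log($"Camera image: {image.width}x{image.height}, format {image.format}");
                image.Dispose();   // always dispose; the image wraps a native resource
            }
        }
    }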
     
  5. pradyumnp508

    pradyumnp508

    Joined:
    Aug 9, 2020
    Posts:
    15
    Can you give me an example of how to use it? Does it return a matrix of the raw RGB data?
    I tried using it, but couldn't get the syntax right. I have been looking for a while for an example of how to get the raw pixel data and store it in a matrix so that I can use it with OpenCV.
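
    For illustration, a sketch of one way to hand the converted pixels to OpenCvSharp, assuming you already have the RGBA32 NativeArray<byte> produced by XRCameraImage.Convert (as in the earlier sketch); the class and method names here are made up for the example:

    Code (CSharp):

    using System;
    using OpenCvSharp;
    using Unity.Collections;
    using Unity.Collections.LowLevel.Unsafe;

    public static class CameraImageToMat
    {
        // Wrap an RGBA32 pixel buffer (e.g. the output of XRCameraImage.Convert) in an
        // OpenCV Mat and convert it to BGR, which is what most OpenCV routines expect.
        public static unsafe Mat ToBgrMat(NativeArray<byte> rgbaBuffer, int width, int height)
        {
            // This Mat constructor wraps the existing buffer without copying it.
            using (var rgba = new Mat(height, width, MatType.CV_8UC4,
                                      new IntPtr(rgbaBuffer.GetUnsafePtr())))
            {
                var bgr = new Mat();
                Cv2.CvtColor(rgba, bgr, ColorConversionCodes.RGBA2BGR);
                return bgr;   // CvtColor allocated new memory, so this stays valid after rgbaBuffer is disposed
            }
        }
    }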