UnityEngine.XR.ARBackgroundRenderer doesn't render the texture image from the ARCore CameraImage.

Discussion in 'ARCore' started by unity_tsr, Jan 28, 2019.

  1. unity_tsr

    Joined:
    Jul 4, 2018
    Posts:
    2
    I'm developing an Android application with ARCore.

    I get a Texture2D from the camera using TextureReader.cs from the ComputerVision example.
    Then I try to render that texture as the background with UnityEngine.XR.ARBackgroundRenderer, but it doesn't work.

    I'm using Unity 2018.3.2f1, ARCore 1.6.0, and a Huawei P20 as the test device.

    There are no errors, just a black screen.

    Why?
    Please help me!

    Code (CSharp):

    Texture2D CameraImage;

    backgroundRenderer.backgroundTexture = CameraImage;
    backgroundRenderer.camera = Camera.main;
    backgroundRenderer.mode = ARRenderMode.MaterialAsBackground;

    Last edited: Jan 28, 2019
  2. tdmowrer

    Unity Technologies

    Joined:
    Apr 21, 2017
    Posts:
    389
    That's a very inefficient way to render the background, as you are taking the CPU camera image and then writing it back to the GPU. Can you provide more details on what you are trying to accomplish?

    As to why this doesn't work, have you applied the background texture?
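    In case it helps, a minimal sketch of driving ARBackgroundRenderer from a CPU-generated texture might look like the following. The callback signature, pixel format, shader choice, and field names are illustrative assumptions, not part of the original post or the SDK:

    Code (CSharp):

    // Minimal sketch: ARBackgroundRenderer in MaterialAsBackground mode draws
    // its backgroundMaterial, with backgroundTexture bound as that material's
    // main texture, so both a material and an uploaded texture are needed.
    using UnityEngine;
    using UnityEngine.XR;

    public class CpuImageBackground : MonoBehaviour
    {
        ARBackgroundRenderer m_Background;
        Texture2D m_Tex;

        void OnEnable()
        {
            m_Background = new ARBackgroundRenderer
            {
                camera = Camera.main,
                // MaterialAsBackground needs a material to draw with.
                backgroundMaterial = new Material(Shader.Find("Unlit/Texture"))
            };
        }

        // Assumed callback: invoked with CPU pixel data per camera frame.
        void OnImageAvailable(byte[] pixels, int width, int height)
        {
            if (m_Tex == null)
                m_Tex = new Texture2D(width, height, TextureFormat.RGBA32, false);

            m_Tex.LoadRawTextureData(pixels);
            m_Tex.Apply(); // upload the CPU data to the GPU

            // Assign after Apply() so the renderer sees an uploaded texture.
            m_Background.backgroundTexture = m_Tex;
            m_Background.mode = ARRenderMode.MaterialAsBackground;
        }
    }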
     
  3. unity_tsr

    Joined:
    Jul 4, 2018
    Posts:
    2
    At first I used the ARCoreBackgroundRenderer from the ARCore SDK for Unity.
    However, the rendered background image did not match Unity's screen coordinates.
    Looking through the SDK code, I found that it renders the camera image directly with a shader.
    So I tried to render the background myself by reading the camera image data on the CPU and generating a Texture2D.
    I got this idea from https://github.com/google-ar/arcore-unity-sdk/issues/221.

    To confirm that the Texture2D is created correctly, I first displayed it on screen as a RawImage, and that worked.
    But a RawImage is drawn in front of every 3D object, so I can't build an AR application that way.

    So now I'm trying to use ARBackgroundRenderer.
    Here is part of my code.
    Code (CSharp):

    private void Awake()
    {
        textureReader = GetComponent<TextureReader>();
        textureReader.OnImageAvailableCallback += OnImageAvailableCallbackFunk;

        if (backgroundRenderer == null)
        {
            backgroundRenderer = new ARBackgroundRenderer();
        }
    }

    private void OnEnable()
    {
        m_Camera = GetComponent<Camera>();
        backgroundRenderer.backgroundTexture = CameraImage;
        backgroundRenderer.camera = m_Camera;
        backgroundRenderer.mode = ARRenderMode.MaterialAsBackground;
    }

    // Use this for initialization
    void Start()
    {
        CameraImage = null;
        screenImage = RawImageObj.GetComponent<RawImage>();
    }

    void ImageToScreen()
    {
        // Create texture as in issue #221
        _tex.Apply();

        CameraImage = _tex;
    }

    // Update is called once per frame
    void Update()
    {
        if (CameraImage != null)
        {
            screenImage = RawImageObj.GetComponent<RawImage>();
            screenImage.rectTransform.sizeDelta = new Vector2(CameraImage.width, CameraImage.height);
            screenImage.texture = CameraImage;

            Resources.UnloadUnusedAssets();
        }
    }
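
    One likely culprit in the snippet above: backgroundTexture is assigned only once, in OnEnable(), at which point CameraImage is still null. A hedged adjustment (names unchanged from the snippet, not a verified fix) would be to re-assign the texture to the renderer after each Apply():

    Code (CSharp):

    void ImageToScreen()
    {
        // ... fill _tex with the latest CPU camera data (as in issue #221) ...
        _tex.Apply(); // upload the new frame to the GPU

        CameraImage = _tex;

        // Re-assign here rather than once in OnEnable(), where CameraImage
        // was still null and the renderer had no texture to draw.
        backgroundRenderer.backgroundTexture = CameraImage;
    }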