
Resolved How to get camera texture in ARFoundation?

Discussion in 'AR' started by kexar66, Aug 4, 2018.

  1. gnp89

    gnp89

    Joined:
    Jun 25, 2012
    Posts:
    36
    The AR camera background texture should already be on the GPU; is there a way I can access it from some other shader? Maybe set it as a global texture for all shaders?
     
  2. tdmowrer

    tdmowrer

    Joined:
    Apr 21, 2017
    Posts:
    605
  3. gnp89

    gnp89

    Joined:
    Jun 25, 2012
    Posts:
    36
    Cool thanks!
     
  4. gnp89

    gnp89

    Joined:
    Jun 25, 2012
    Posts:
    36
    I tried this event and I get the property IDs, but I don't have a way to get the property names. I get two texture IDs, but how am I supposed to know which textures those are and what they mean? I also don't know what to expect when I change platforms; will I get more textures, maybe?
    I still believe there should be an easy way to get the camera output texture living in GPU memory. I want to texture an object using the camera background, and I would like to just set that texture on a material. Any suggestions?

    Edit: I just tried setting those two textures on another material and I get either a red- or a yellow-tinted texture, but not the AR camera output.
     
    Last edited: Nov 26, 2020
  5. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,121
    Different platforms implement the camera video texture differently. For example, ARKit uses the YCbCr format, which is split across two textures.
    To take the video texture directly from the GPU without accessing the camera on the CPU, you should write your own custom shader that uses the same texture names. In the case of ARKit, your shader should use these textures (please refer to ARKitBackground.shader for more info):
    Code (CSharp):
    Properties
    {
        _textureY ("TextureY", 2D) = "white" {}
        _textureCbCr ("TextureCbCr", 2D) = "black" {}
    }
    Then, you can pass textures received from ARCameraManager to your shader with the help of Material.SetTexture(int, Texture)
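    A minimal sketch of that hand-off, assuming a scene reference to the ARCameraManager and a material whose shader declares the platform texture names as above (the class name is hypothetical):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    public class CameraTextureBinder : MonoBehaviour
    {
        public ARCameraManager cameraManager;  // the manager on the AR camera
        public Material targetMaterial;        // material using the platform texture names

        void OnEnable()  { cameraManager.frameReceived += OnFrameReceived; }
        void OnDisable() { cameraManager.frameReceived -= OnFrameReceived; }

        void OnFrameReceived(ARCameraFrameEventArgs args)
        {
            // textures[i] pairs with propertyNameIds[i], so this loop works
            // however many textures the platform provides (two on ARKit, one on ARCore).
            for (int i = 0; i < args.textures.Count; i++)
                targetMaterial.SetTexture(args.propertyNameIds[i], args.textures[i]);
        }
    }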
     
    orangetech likes this.
  6. gnp89

    gnp89

    Joined:
    Jun 25, 2012
    Posts:
    36
    I see, I thought I would be able to get a normal ARGB texture. We'll go with the TryGetLastTexture method because we don't have time right now to dig into that and into platform-specific code and shaders, but I'll definitely look into it later.
    For the record, what would the property name(s) for the texture(s) be on Android?
    Thanks for your answer!
     
    Last edited: Nov 30, 2020
  7. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,121
    ARCore uses just one video texture, named _MainTex (the default texture name). But ARCore has another quirk: it uses the external texture extension in its ARCoreBackground.shader. In my tests, this prevents regular shaders from accessing the camera texture, so you still have to write a custom shader for ARCore if you want to display the camera texture without copying it to the CPU.
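    For reference, the external-texture sampling looks roughly like this in a GLSL fragment shader (a sketch based on ARCoreBackground.shader; the exact pragma and variable names may differ between versions):

    Code (GLSL):
    #version 300 es
    #extension GL_OES_EGL_image_external_essl3 : require
    precision mediump float;

    // The camera image arrives as an Android external (OES) texture, so it
    // must be sampled through samplerExternalOES, not a regular sampler2D.
    uniform samplerExternalOES _MainTex;
    in vec2 uv;
    out vec4 fragColor;

    void main()
    {
        fragColor = texture(_MainTex, uv);
    }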
     
    gnp89 likes this.
  8. mtellezleon46

    mtellezleon46

    Joined:
    Dec 9, 2020
    Posts:
    4
    Hi!!
    Currently I'm using the CPU image from the AR camera of ARFoundation and I'm applying it to a RawImage texture.
    I made a separate scene with only the AR camera from AR Foundation, and in this scene I'm instantiating a sphere at X,Y,Z coordinates with Camera.main.ScreenToWorldPoint(WorldPos);, where WorldPos is assigned a Vector3. This works as expected.
    Now I'm trying to use this same simple code but with the CPU image (using the TryAcquireLatestCpuImage method). I'm making sure everything is assigned, both the prefab I want to instantiate and the scripts I refer to. I'm also using a try/catch block to see if something throws an exception, but the message I keep getting is: "object reference not set to an instance of an object". Does anyone know if I can instantiate an object while I'm using the CPU image? Thank you so much! :D
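    For reference, the usual shape of the CPU-image path (a sketch assuming an ARCameraManager reference; the XRCpuImage must always be disposed):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARSubsystems;

    public class CpuImageReader : MonoBehaviour
    {
        public ARCameraManager cameraManager;
        Texture2D m_Texture;

        void Update()
        {
            // Returns false when no new camera image is available this frame.
            if (!cameraManager.TryAcquireLatestCpuImage(out XRCpuImage image))
                return;

            var conversionParams = new XRCpuImage.ConversionParams(image, TextureFormat.RGBA32);

            if (m_Texture == null)
                m_Texture = new Texture2D(
                    conversionParams.outputDimensions.x,
                    conversionParams.outputDimensions.y,
                    conversionParams.outputFormat, false);

            // Convert straight into the texture's buffer, then release the native image.
            image.Convert(conversionParams, m_Texture.GetRawTextureData<byte>());
            image.Dispose();
            m_Texture.Apply();
        }
    }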
     
  9. mtellezleon46

    mtellezleon46

    Joined:
    Dec 9, 2020
    Posts:
    4

    I've already solved it!
    It wasn't anything to do with the code. I just wasn't referencing the camera properly. Thank you!! :)
     
  10. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,449
    I just used this to grab the camera texture and blit it into a RenderTexture of a size of my choosing; I simply wanted the background image for some very specific UI overlay effects that I couldn't do with UI masks or post-processing, for project design reasons. Very effective if you are making a colouring book or something where you just want the camera image as-is on screen to use in UI or textures.

    I found I needed at least a 16-bit depth/stencil buffer on the render texture for it to render fine on iOS when testing with ARFoundation.

    Code (CSharp):
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    public class Test1 : MonoBehaviour
    {
        public ARCameraBackground m_ARCameraBackground;
        public RenderTexture targetRenderTexture;

        // Update is called once per frame
        void Update()
        {
            Graphics.Blit(null, targetRenderTexture, m_ARCameraBackground.material);
        }
    }
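    If you create the blit target from code rather than as an asset, the depth/stencil requirement mentioned above maps to the RenderTexture constructor's third argument (a hypothetical setup; 16 requests a 16-bit depth buffer):

    Code (CSharp):
    var targetRenderTexture = new RenderTexture(Screen.width, Screen.height, 16);
    targetRenderTexture.Create();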
     
    Last edited: Apr 13, 2021
    orangetech and KyryloKuzyk like this.
  11. marck_ozz

    marck_ozz

    Joined:
    Nov 30, 2018
    Posts:
    107
    hello @ROBYER1!!

    I tried to do the same but I get a NullReference error for the background material, according to the documentation of the ARFoundation, "The Custom Material property is optional, and typically you do not need to set it. The platform-specific packages provided by Unity (e.g., ARCore and ARKit) provide their own shaders for background rendering."

    are you using a custom material for the background?

    Saludos
     
    ROBYER1 likes this.
  12. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,449
    I ended up not needing to use that; see my example above your post for what I was doing. We would take that texture and put it into a shader graph.
     
    marck_ozz likes this.
  13. marck_ozz

    marck_ozz

    Joined:
    Nov 30, 2018
    Posts:
    107
    And did it ever work?

    I'm trying to put the texture on a RawImage just for visualization, like a miniature view, like this:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.UI;
    using UnityEngine.XR.ARFoundation;

    public class SketchDownload : MonoBehaviour
    {
        public ARCameraBackground m_ARCameraBackground;
        private RenderTexture targetRenderTexture;
        public RawImage ARTexture;

        void Update()
        {
            Graphics.Blit(null, targetRenderTexture, m_ARCameraBackground.material);
            ARTexture.texture = targetRenderTexture;
        }
    }
    but I get:
    ArgumentNullException: Value cannot be null.
    Parameter name: mat

    I have the "AR Camera Background" component on my camera, which is under my "AR Session Origin" object.

    Thanks for your response.
     
  14. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,449
    Copy my code exactly and make sure the RenderTexture you are pinging it to exists somewhere in your assets folder; it writes directly to that texture.
    Just before you enter play mode, in the editor, assign that render texture to a RawImage component; any changes to the RenderTexture should show there too. The RenderTexture field needs to be public so you can assign the RenderTexture asset from your project folder to it on the component.
     
  15. marck_ozz

    marck_ozz

    Joined:
    Nov 30, 2018
    Posts:
    107
    Thanks a lot @ROBYER1!!

    I did this as you say.

    Also changed the code like the example that @tdmowrer provided:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.UI;
    using UnityEngine.XR.ARFoundation;

    public class SketchDownload : MonoBehaviour
    {
        public ARCameraBackground m_ARCameraBackground;
        public RenderTexture targetRenderTexture;
        public RawImage ARTexture;
        private Texture2D m_LastCameraTexture;

        void Update()
        {
            Graphics.Blit(null, targetRenderTexture, m_ARCameraBackground.material);

            var activeRenderTexture = RenderTexture.active;
            RenderTexture.active = targetRenderTexture;
            if (m_LastCameraTexture == null)
                m_LastCameraTexture = new Texture2D(targetRenderTexture.width, targetRenderTexture.height, TextureFormat.RGB24, true);
            m_LastCameraTexture.ReadPixels(new Rect(0, 0, targetRenderTexture.width, targetRenderTexture.height), 0, 0);
            m_LastCameraTexture.Apply();
            RenderTexture.active = activeRenderTexture;

            ARTexture.texture = m_LastCameraTexture;
        }
    }
    I'm using:
    Unity 2020.3.3
    ARFoundation sample project
    Tested on a Galaxy Note9, Android 10.



    The result:
    [Screenshot attachment: upload_2021-6-22_17-51-47.jpeg]

    The aspect ratio of the miniature view doesn't match the screen, but that's easy to fix; I just wanted to share the result.

    EDIT:

    it works just with:

    Code (CSharp):
    Graphics.Blit(null, targetRenderTexture, m_ARCameraBackground.material);

    ARTexture.texture = targetRenderTexture;
     
    Last edited: Jun 23, 2021
    ROBYER1 likes this.
  16. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,449
    Nice one!
     
    marck_ozz likes this.
  17. Zulks

    Zulks

    Joined:
    Oct 31, 2022
    Posts:
    2
  18. Thokr

    Thokr

    Joined:
    May 27, 2016
    Posts:
    3
    EDIT: I found that the Graphics.Blit solution does actually give me a valid camera texture, but for some reason it becomes corrupted when I feed it to my shader, so further investigation is needed on my part.

    ---

    I apologize for bumping an old thread, but I'm still having trouble getting camera texture on iOS. My goal is to use the camera texture in another shader much like @gnp89 wanted. Here's what I tried:

    1. Using Graphics.Blit with a RenderTexture and ARCameraBackground material:

    Code (CSharp):
    Graphics.Blit(null, targetRenderTexture, m_ARCameraBackground.material);
    This works great for Android, but unfortunately on iOS I just get a blinking texture, like it's feeding some junk data to the texture. I didn't override the material.

    2. Getting textures from ARCameraFrameEventArgs.textures via the ARCameraManager.frameReceived event. The texture is just black, and the app crashes shortly after. I didn't try to convert it from YCbCr to ARGB yet, but I imagine I should see at least something sensible, even if the colors weren't quite right.

    Any ideas how to do this properly? Basically I just want to render the camera feed into a RenderTexture. I'm using the built-in render pipeline, if that's relevant. My Unity version is 2021.3.8f1 and my ARFoundation version is 5.0.2.
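    For anyone attempting that YCbCr-to-ARGB conversion on the two ARKit textures, the per-pixel math is roughly the following full-range BT.601 transform (a sketch with a hypothetical helper; ARKitBackground.shader does the equivalent on the GPU with a 4x4 matrix):

    Code (CSharp):
    using UnityEngine;

    static class YCbCrConvert
    {
        // y comes from _textureY, (cb, cr) from _textureCbCr; all values in [0, 1].
        public static Color ToRgb(float y, float cb, float cr)
        {
            return new Color(
                y + 1.402f * (cr - 0.5f),
                y - 0.344f * (cb - 0.5f) - 0.714f * (cr - 0.5f),
                y + 1.772f * (cb - 0.5f));
        }
    }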
     
    Last edited: Jan 20, 2023
  19. andyb-unity

    andyb-unity

    Unity Technologies

    Joined:
    Feb 10, 2022
    Posts:
    956
    ROBYER1, KyryloKuzyk and Thokr like this.