
How to get camera texture in ARFoundation?

Discussion in 'Handheld AR' started by kexar66, Aug 4, 2018.

  1. kexar66

    kexar66

    Joined:
    Feb 27, 2013
    Posts:
    35
    Hi,
Is there any example of how to get the camera texture in ARFoundation (not a screenshot)?

Thanks
     
  2. rxmarccall

    rxmarccall

    Joined:
    Oct 13, 2011
    Posts:
    301
  3. tdmowrer

    tdmowrer

    Unity Technologies

    Joined:
    Apr 21, 2017
    Posts:
    543
    Curious for future API planning: what are you trying to do with the texture exactly?
     
  4. kexar66

    kexar66

    Joined:
    Feb 27, 2013
    Posts:
    35
I just want to save a camera image to a file (like taking a photo with the camera app), without any Unity objects, just the plain camera image.

I'm still having trouble doing that. Is there any example code showing how to do it?

    Thank you.
     
  5. tdmowrer

    tdmowrer

    Unity Technologies

    Joined:
    Apr 21, 2017
    Posts:
    543
    Try this:
Code (CSharp):

// Copy the camera background to a RenderTexture
Graphics.Blit(null, renderTexture, m_ARCameraBackground.material);

// Copy the RenderTexture from GPU to CPU
var activeRenderTexture = RenderTexture.active;
RenderTexture.active = renderTexture;
if (m_LastCameraTexture == null)
    m_LastCameraTexture = new Texture2D(renderTexture.width, renderTexture.height, TextureFormat.RGB24, true);
m_LastCameraTexture.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
m_LastCameraTexture.Apply();
RenderTexture.active = activeRenderTexture;

// Write to file
var bytes = m_LastCameraTexture.EncodeToPNG();
var path = Application.persistentDataPath + "/camera_texture.png";
File.WriteAllBytes(path, bytes);
     
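For context, the snippet assumes a surrounding component roughly like the sketch below. Only `m_ARCameraBackground`, `renderTexture`, and `m_LastCameraTexture` come from the snippet itself; the class name and method name here are illustrative, not part of any official API:

```csharp
using System.IO;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical wrapper component for the snippet above.
public class CameraImageSaver : MonoBehaviour
{
    // Assign the ARCameraBackground from the AR camera in the Inspector.
    [SerializeField]
    ARCameraBackground m_ARCameraBackground;

    // A RenderTexture sized to the camera image; as discussed later in this
    // thread, it should be created with no depth buffer.
    [SerializeField]
    RenderTexture renderTexture;

    Texture2D m_LastCameraTexture;

    public void SaveCameraImage()
    {
        // ... body of the snippet above goes here ...
    }
}
```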
    LouisHong and Blarp like this.
  6. michaelmilst

    michaelmilst

    Joined:
    Jul 2, 2018
    Posts:
    5

I'm trying to do the same and used your method, tdmowrer, but it looks like the image produced from the ARCameraBackground's material has Unity objects in it.
     
  7. tdmowrer

    tdmowrer

    Unity Technologies

    Joined:
    Apr 21, 2017
    Posts:
    543
    Have you overridden the ARCameraBackground's material with a custom material? If not, can you explain what you mean by "Unity objects"? Could you post a screenshot?
     
  8. michaelmilst

    michaelmilst

    Joined:
    Jul 2, 2018
    Posts:
    5
I haven't overridden the ARCameraBackground's material with a custom material. By "Unity objects in it", I mean that 3D virtual objects from my scene show up in the image, when I only want an image of what the device's camera sees.

In this example image, which I got using the ARCameraBackground's material, you can see a cube from my scene.
     

    Attached Files:

  9. tdmowrer

    tdmowrer

    Unity Technologies

    Joined:
    Apr 21, 2017
    Posts:
    543
    michaelmilst likes this.
  10. michaelmilst

    michaelmilst

    Joined:
    Jul 2, 2018
    Posts:
    5
    Your example definitely works...
    I'll try and figure out the differences and post what I was doing wrong here.
     
  11. michaelmilst

    michaelmilst

    Joined:
    Jul 2, 2018
    Posts:
    5
    Turns out this was my problem:
    If I blit the ARCameraBackground.material to the RenderTexture in the same method as I ReadPixels() and Apply() to my Texture2D, the texture is completely black. So, I turned the method into a Coroutine and added a yield return new WaitForSeconds so that the texture wouldn't be black. If I put the yield after the RenderTexture.active = renderTexture, the image will have Unity GameObjects in it. If I put the yield before that, the image will only show what the camera sees.

    Also, I tested without setting RenderTexture.active = renderTexture, and just as you said, the image had Unity GameObjects in it, and I'm not sure why that happens. Why would the renderTexture not contain Unity GameObjects when it is set as the active render texture, but contain them when it is not?
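A sketch of the coroutine described above, assuming the `m_ARCameraBackground`, `renderTexture`, and `m_LastCameraTexture` fields from the earlier snippet; `WaitForEndOfFrame` is used here in place of the `WaitForSeconds` mentioned, and the yield placement follows the working order described:

```csharp
using System.Collections;
using UnityEngine;

IEnumerator CaptureCameraOnly()
{
    // Blit the camera background into the RenderTexture first.
    Graphics.Blit(null, renderTexture, m_ARCameraBackground.material);

    // Yield BEFORE making the RenderTexture active; yielding after
    // RenderTexture.active = renderTexture captured scene GameObjects too.
    yield return new WaitForEndOfFrame();

    var activeRenderTexture = RenderTexture.active;
    RenderTexture.active = renderTexture;
    if (m_LastCameraTexture == null)
        m_LastCameraTexture = new Texture2D(renderTexture.width, renderTexture.height, TextureFormat.RGB24, true);
    m_LastCameraTexture.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
    m_LastCameraTexture.Apply();
    RenderTexture.active = activeRenderTexture;
}
```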
     
    Last edited: Aug 19, 2018
  12. tdmowrer

    tdmowrer

    Unity Technologies

    Joined:
    Apr 21, 2017
    Posts:
    543
    Because ReadPixels reads pixels from the RenderTexture.active. If you do not set this (or yield a frame after setting it), then the active RenderTexture is probably the last frame that was rendered.

    The reason the resulting image is black is probably due to a setting in your RenderTexture. Make sure it doesn't have a depth buffer:
(Screenshot attached: the RenderTexture asset's inspector, showing the depth buffer set to "No depth buffer".)
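If the RenderTexture is created in code rather than as an asset, the equivalent is passing 0 for the depth parameter; a minimal sketch (the dimensions and format here are just examples):

```csharp
using UnityEngine;

// The third constructor argument is the depth buffer bit count;
// 0 means no depth buffer, which the Graphics.Blit capture needs.
var renderTexture = new RenderTexture(Screen.width, Screen.height, 0, RenderTextureFormat.ARGB32);
```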
     
    michaelmilst likes this.
  13. michaelmilst

    michaelmilst

    Joined:
    Jul 2, 2018
    Posts:
    5
    Makes sense, thanks for all the help!
     
  14. vrmYavuz

    vrmYavuz

    Joined:
    Mar 7, 2018
    Posts:
    5
Is it possible to blit the camera view to a render texture even if the camera renders to a render texture?
     
  15. Kubic75

    Kubic75

    Joined:
    Jan 2, 2017
    Posts:
    83
  16. tdmowrer

    tdmowrer

    Unity Technologies

    Joined:
    Apr 21, 2017
    Posts:
    543
  17. Kubic75

    Kubic75

    Joined:
    Jan 2, 2017
    Posts:
    83
    Thanks for pointing me to the example. Works pretty well.
     
  18. MarcoElz

    MarcoElz

    Joined:
    Jan 28, 2017
    Posts:
    9
UPDATE:
I finally got reflections working in AR Foundation. The problem was that Real Time Reflections was disabled in the default Quality settings for Android (Project Settings > Quality).

    ----

Hello, I want to get the camera feed into a RenderTexture to use as a reflection.
Thanks to tdmowrer I got the RenderTexture part working, but I cannot make the reflection work.

What I want to do is get the same behaviour as this repo (https://github.com/johnsietsma/ARCameraLighting), but with AR Foundation instead of ARCore.

I tried using the built-in Skybox/Panoramic shader and the Skybox/ARSkybox shader from that repo, but I get a black reflection on the mobile device.

In the Editor I tried another image with both shaders, and both the skybox and my reflective sphere got the texture correctly. But I cannot make it work on the mobile device with the RenderTexture.

I use a Canvas to show the RenderTexture and it works perfectly, so I am capturing the camera correctly in the RenderTexture.
And with a regular skybox my reflective sphere works as expected.

    Any idea?

    Thank you.
     
    Last edited: Jan 25, 2019
    Sebastian_Trick3d likes this.
  19. christougher

    christougher

    Joined:
    Mar 6, 2015
    Posts:
    414
    @MarcoElz, care to share how you were able to convert arcameralighting to work with ARFoundation? I've gotten it working on android but I'm having a hard time with ios.

     
    Sebastian_Trick3d likes this.
  20. Deform-Studio

    Deform-Studio

    Joined:
    Oct 21, 2014
    Posts:
    2
    Same here.
     
  21. Sebastian_Trick3d

    Sebastian_Trick3d

    Joined:
    Dec 6, 2018
    Posts:
    2
Check this out! (I hope this helps; this was a challenge for us.)
Included are a script and a shader, based on the blit code above and the ARCameraLighting project.
Using Unity 2018.3.8f1.

C# code from @tdmowrer and from the repo (https://github.com/johnsietsma/ARCameraLighting):

Code (CSharp):

using System;
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.XR.ARFoundation;

[RequireComponent(typeof(ReflectionProbe))]
public class ARDeviceScreenReflections : MonoBehaviour
{
    [SerializeField]
    private Camera aRCamera = null;

    [SerializeField]
    [Tooltip("The camera background which controls the camera image.")]
    private ARCameraBackground aRCameraBackground = null;

    [SerializeField]
    private Material skyboxMaterial = null;

    [SerializeField]
    [Tooltip("The RenderTexture to blit the camera image to.")]
    private RenderTexture renderTexture = null;

    public bool IsCapturing { get { return isCapturing; } }

    private Material pastSkyboxMaterial = null;
    private CommandBuffer m_blitCommandBuffer = null;
    private CommandBuffer m_releaseCommandBuffer = null;
    private bool isCapturing;

    private void Awake()
    {
        // Get width and height of the target RenderTexture
        int renderTextureWidth = renderTexture.width;
        int renderTextureHeight = renderTexture.height;

        // Clean up any previous command buffer and event hooks
        if (m_blitCommandBuffer != null)
        {
            aRCamera.RemoveCommandBuffer(CameraEvent.BeforeForwardOpaque, m_blitCommandBuffer);
            aRCamera.RemoveCommandBuffer(CameraEvent.AfterSkybox, m_releaseCommandBuffer);
        }

        // Create the blit command buffer
        m_blitCommandBuffer = new CommandBuffer();
        m_blitCommandBuffer.GetTemporaryRT(WORKING_RENDER_TEXTURE_ID, renderTextureWidth, renderTextureHeight, 0, FilterMode.Bilinear);
        m_blitCommandBuffer.name = "Get ARBackground";

        // arCamera.BlitCameraTexture(m_blitCommandBuffer, workingRenderTextureID);
        m_blitCommandBuffer.Blit(null, WORKING_RENDER_TEXTURE_ID, RenderSettings.skybox);

        // Copy over to the target texture.
        m_blitCommandBuffer.Blit(WORKING_RENDER_TEXTURE_ID, renderTexture);

        // Run the command buffer just before opaque rendering
        aRCamera.AddCommandBuffer(CameraEvent.BeforeForwardOpaque, m_blitCommandBuffer);

        // Clean up the temp render textures
        m_releaseCommandBuffer = new CommandBuffer();
        m_releaseCommandBuffer.name = "Release ARBackground";
        m_releaseCommandBuffer.ReleaseTemporaryRT(WORKING_RENDER_TEXTURE_ID);
        aRCamera.AddCommandBuffer(CameraEvent.AfterSkybox, m_releaseCommandBuffer);

        isCapturing = true;
    }

    private void OnEnable()
    {
        pastSkyboxMaterial = RenderSettings.skybox;
        RenderSettings.skybox = skyboxMaterial;
        ARSubsystemManager.cameraFrameReceived += OnCameraFrameReceived;
    }

    private void Update()
    {
        skyboxMaterial.SetMatrix(WORLD_TO_CAMERA_MATRIX_PROP_ID, aRCamera.worldToCameraMatrix);
    }

    private void OnCameraFrameReceived(ARCameraFrameEventArgs eventArgs)
    {
        BlitToRenderTexture(renderTexture, aRCameraBackground);
    }

    private void OnDisable()
    {
        RenderSettings.skybox = pastSkyboxMaterial;
        ARSubsystemManager.cameraFrameReceived -= OnCameraFrameReceived;
        // Clean up any previous command buffer and event hooks
        if (m_blitCommandBuffer != null && aRCamera != null)
        {
            aRCamera.RemoveCommandBuffer(CameraEvent.BeforeForwardOpaque, m_blitCommandBuffer);
            aRCamera.RemoveCommandBuffer(CameraEvent.AfterSkybox, m_releaseCommandBuffer);
        }
        isCapturing = false;
    }

    public static void BlitToRenderTexture(RenderTexture renderTexture, ARCameraBackground cameraBackground)
    {
        if (renderTexture == null)
            throw new ArgumentNullException("renderTexture");

        if (cameraBackground == null)
            throw new ArgumentNullException("cameraBackground");

        // Copy the camera background to a RenderTexture
        Graphics.Blit(null, renderTexture, cameraBackground.material);
    }

    private static readonly int WORLD_TO_CAMERA_MATRIX_PROP_ID = Shader.PropertyToID("_WorldToCameraMatrix");
    private static readonly int WORKING_RENDER_TEXTURE_ID = Shader.PropertyToID("_ARCameraRenderTexture");
}
Shader from the repo (https://github.com/johnsietsma/ARCameraLighting):

Code (Shader):

Shader "AR/ARSkybox"
{
    Properties
    {
        _LightingTex("Render Texture", 2D) = "white" {}
    }

    CGINCLUDE

    #include "UnityCG.cginc"

    struct appdata
    {
        float4 position : POSITION;
        float3 normal : NORMAL;
        float3 texcoord : TEXCOORD0;
    };

    struct v2f
    {
        float4 position : SV_POSITION;
        float2 texcoord : TEXCOORD0;
    };

    // This relies on a RenderTexture of this name being created in ARCoreCameraRenderTexture.cs.
    sampler2D _LightingTex;
    float4x4 _WorldToCameraMatrix;

    float2 SphereMapUVCoords(float3 viewDir, float3 normal)
    {
        // Sphere mapping. Find the reflection and transform it into UV coords.
        // Heavily inspired by https://www.clicktorelease.com/blog/creating-spherical-environment-mapping-shader/
        float3 reflection = reflect(viewDir, normal);
        float m = 2. * sqrt(
            pow(reflection.x, 2.) +
            pow(reflection.y, 2.) +
            pow(reflection.z + 1., 2.)
        );
        return reflection.xy / m + .5;
    }

    v2f vert(appdata v)
    {
        // Create a sphere map with a texture whose center is at the viewDir/sphere intersection.
        // The texture is wrapped around the sphere so that the corners meet directly behind the camera.
        // To do this we could operate in static viewDir (0,0,1) space. We always want to look at the center of the texture.
        // When we move the phone, there is no need to change the view direction.
        // When rendering a skybox, the view direction is altered for each face. Grab the world space view direction to each vert,
        // then reverse the camera's view direction, bringing it back to view space.
        float3 viewDir = -normalize(WorldSpaceViewDir(v.position));
        viewDir = mul(_WorldToCameraMatrix, float4(viewDir, 0));

        v2f o;
        o.position = UnityObjectToClipPos(v.position);
        o.texcoord = SphereMapUVCoords(viewDir, v.normal);

        return o;
    }

    fixed4 frag(v2f i) : COLOR
    {
        return tex2D(_LightingTex, i.texcoord);
    }

    ENDCG

    SubShader
    {
        Tags { "RenderType" = "Background" "Queue" = "Background" }
        Pass
        {
            ZWrite Off
            Cull Off
            Fog { Mode Off }
            CGPROGRAM
            #pragma fragmentoption ARB_precision_hint_fastest
            #pragma vertex vert
            #pragma fragment frag
            ENDCG
        }
    }
}
     
    christougher likes this.
  22. christougher

    christougher

    Joined:
    Mar 6, 2015
    Posts:
    414
Thanks so much for sharing! :D I'm having trouble getting it to work though... :( Does this require anything else from the ARCameraLighting project? I've created and assigned the necessary components for ARDeviceScreenReflections on a Reflection Probe, and I enabled realtime reflections in the settings... I'm stumped.

The only way I've gotten it to work is to set the camera to target the CameraRenderTexture and have that assigned to a Raw Image... not ideal. Could you possibly email me a basic scene file with it set up? christougher@hotmail.com. I'd be happy to detail the instructions here for anyone else. Again, thanks so much for sharing!


     
  23. Sebastian_Trick3d

    Sebastian_Trick3d

    Joined:
    Dec 6, 2018
    Posts:
    2
Okay, I'm not sure, but for us this worked only after fiddling with the reflection probes:
Make sure you have enabled reflection probes in the settings, that a large enough reflection probe is in the scene, that the meshes use the Blend Probes option, that the probe captures at the same resolution as your render texture (both width and height), that the reflection probe is realtime, and that it refreshes every frame. We tested on Unity 2018.3.8f1 on an iPhone 8.

Besides that, check that the environment reflections are also at the same resolution as everything else, AND that the environment reflections source is set to Skybox.
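The probe settings in that checklist can also be applied from code. A rough sketch (the component name and the resolution value are illustrative; match the resolution to your render texture):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Hypothetical helper that applies the checklist above to the
// ReflectionProbe this script is attached to.
[RequireComponent(typeof(ReflectionProbe))]
public class RealtimeProbeSetup : MonoBehaviour
{
    void Awake()
    {
        var probe = GetComponent<ReflectionProbe>();
        probe.mode = ReflectionProbeMode.Realtime;                 // realtime probe
        probe.refreshMode = ReflectionProbeRefreshMode.EveryFrame; // refresh every frame
        probe.resolution = 256;                                    // match your RenderTexture size
    }
}
```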
     
    Last edited: Apr 15, 2019
    christougher likes this.
  24. christougher

    christougher

    Joined:
    Mar 6, 2015
    Posts:
    414
    Thanks so much, I'll check it out!
     
  25. christougher

    christougher

    Joined:
    Mar 6, 2015
    Posts:
    414
Working!!!! Got it going on Android on my Pixel 2. I couldn't figure out what the problem was with my scene; everything was still showing up black. I finally saw that your render texture was set to No Depth Buffer. After making that change, everything worked as it should.
     
    Last edited: Apr 16, 2019
    Blarp and Sebastian_Trick3d like this.
  26. AlCampbell

    AlCampbell

    Joined:
    Jan 8, 2018
    Posts:
    4
Quick question on this: are CameraConfigurations compatible with using XRCameraSubsystem.TryGetLatestImage to get the raw image on the CPU? For some reason, setting the CameraConfiguration prevents us from getting an image, where we otherwise have no problem.
     
  27. Baraneedharan

    Baraneedharan

    Joined:
    Jan 10, 2019
    Posts:
    4
Hi tdmowrer,

I am new to ARFoundation. My requirement is to position 3D content on the pose of a detected QR code, both vertical and horizontal. In ARKit I am using "ARTextureHandles handles = arSession.GetARVideoTextureHandles();" to get the values. Please advise on the equivalent syntax to use in ARFoundation for this scenario.
     
  28. JM_CG

    JM_CG

    Joined:
    Apr 20, 2017
    Posts:
    9
Is blitting the camera texture working in 2019 with AR Foundation? Has it worked for anyone else? I seem to get errors on iOS.
     
  29. eco_bach

    eco_bach

    Joined:
    Jul 8, 2013
    Posts:
    1,357
  30. tdmowrer

    tdmowrer

    Unity Technologies

    Joined:
    Apr 21, 2017
    Posts:
    543
    Sebastian_Trick3d likes this.
  31. eco_bach

    eco_bach

    Joined:
    Jul 8, 2013
    Posts:
    1,357
    Thanks!
So I am successfully grabbing the ARCameraBackground with Graphics.Blit, but how do I do this without actually rendering the source ARCameraBackground? That is, I only want to see the RenderTexture I am targeting.

Is it necessary to modify the ARCameraBackground component, or is there another way?
     
    Last edited: Jun 17, 2019
  32. austintaylorx

    austintaylorx

    Joined:
    Aug 13, 2018
    Posts:
    8
Funnily enough, my issue is the exact opposite: I actually want the Unity objects in my picture, but they're not showing up.

Code (CSharp):

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using System;
using Infrastructure.CoroutineRunner;

namespace Infrastructure.CCSystem
{
    public class CameraCaptureSystem : ICameraCaptureSystem
    {
        public static Texture2D m_Texture;

        private RenderTexture renderTexture;
        private Texture2D lastCameraTexture;

        private ARCameraBackground aRCameraBackground;
        private ICoroutineRunner coroutineRunner;

        public CameraCaptureSystem(
            ARCameraBackground aRCameraBackground,
            RenderTexture renderTexture,
            ICoroutineRunner coroutineRunner)
        {
            this.aRCameraBackground = aRCameraBackground;
            this.renderTexture = renderTexture;
            this.coroutineRunner = coroutineRunner;

            RenderTexture.active = renderTexture;
        }

        public void CapturePhoto()
        {
            Graphics.Blit(null, renderTexture, aRCameraBackground.material);

            var activeRenderTexture = RenderTexture.active;
            RenderTexture.active = renderTexture;
            if (lastCameraTexture == null)
                lastCameraTexture = new Texture2D(renderTexture.width, renderTexture.height, TextureFormat.RGB24, true);
            lastCameraTexture.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
            lastCameraTexture.Apply();
            RenderTexture.active = activeRenderTexture;

            m_Texture = lastCameraTexture;
        }
    }
}
    I'm in Unity 2019.1.6f1
    AR foundation 2.2.0
    AR core 2.1.0

How can I get objects I spawn in AR to show up in a picture?
     
  33. sameel

    sameel

    Joined:
    Dec 4, 2015
    Posts:
    9
I am getting a black screen instead. Here is what I am trying:

Code (CSharp):

Texture2D cameraTexture = new Texture2D((int)width, (int)height, TextureFormat.RGB24, false);
RenderTexture rt = new RenderTexture((int)width, (int)height, 24, RenderTextureFormat.ARGB32);
RenderTexture.active = rt;

arCamera.targetTexture = rt;
arCamera.Render();
cameraTexture.ReadPixels(new Rect(0, 0, width, height), 0, 0);
cameraTexture.Apply();

RenderTexture.active = null;
RenderTexture.ReleaseTemporary(rt);
Destroy(rt);
File.WriteAllBytes(screenShotPath, cameraTexture.EncodeToPNG());
     
  34. tdmowrer

    tdmowrer

    Unity Technologies

    Joined:
    Apr 21, 2017
    Posts:
    543
    Seems like it should work. The docs suggest using Graphics.Blit to achieve a similar result. Also note that ReadPixels is very slow (~20 ms) and should be avoided if possible. There's a separate API for accessing the camera image on the CPU.
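For reference, the CPU path in the AR Foundation packages of this era (the ARCameraManager/XRCameraImage API) looks roughly like the sketch below. Treat it as an illustration against the 2.x package, since these type names changed in later versions, and the component wrapper here is my own:

```csharp
using System;
using Unity.Collections;
using Unity.Collections.LowLevel.Unsafe;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class CpuCameraImageExample : MonoBehaviour
{
    [SerializeField]
    ARCameraManager m_CameraManager;   // the ARCameraManager on the AR camera

    unsafe void Update()
    {
        XRCameraImage image;
        if (!m_CameraManager.TryGetLatestImage(out image))
            return;

        var conversionParams = new XRCameraImageConversionParams
        {
            inputRect = new RectInt(0, 0, image.width, image.height),
            outputDimensions = new Vector2Int(image.width, image.height),
            outputFormat = TextureFormat.RGB24,
            transformation = CameraImageTransformation.MirrorY
        };

        int size = image.GetConvertedDataSize(conversionParams);
        var buffer = new NativeArray<byte>(size, Allocator.Temp);

        // Convert the native camera image into the buffer, then dispose the
        // image as soon as possible to release the native resource.
        image.Convert(conversionParams, new IntPtr(buffer.GetUnsafePtr()), buffer.Length);
        image.Dispose();

        // ... use the RGB24 bytes in 'buffer' (e.g. load them into a Texture2D) ...
        buffer.Dispose();
    }
}
```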
     
  35. x2stone

    x2stone

    Joined:
    Dec 17, 2014
    Posts:
    11
    Hi tdmowrer,
I don't quite understand: which is the most efficient method to get the camera image on iOS? The one you wrote here (which uses the ReadPixels function you say should be avoided), or the "CPU" method you linked in your last post?
I just want to get the iPhone camera feed (so, nothing more than what the camera sees) and encode it to MP4 (while seeing the AR stuff of my application on my iPhone).
Thanks for your explanation.
     
  36. sameel

    sameel

    Joined:
    Dec 4, 2015
    Posts:
    9
Hi @tdmowrer, I tried using Graphics.Blit and there is no AR object in the image that gets saved.
     
  37. tdmowrer

    tdmowrer

    Unity Technologies

    Joined:
    Apr 21, 2017
    Posts:
    543
    It sounds like you need the camera image on the CPU, so you should use the API I linked for accessing the camera image on the CPU.