Resolved: How to get camera texture in ARFoundation?

Discussion in 'AR' started by kexar66, Aug 4, 2018.

  1. kexar66

    kexar66

    Joined:
    Feb 27, 2013
    Posts:
    48
    Hi,
    is there any example of how to get the camera texture in ARFoundation (not a screenshot)?

    Thanks
     
  2. rxmarccall

    rxmarccall

    Joined:
    Oct 13, 2011
    Posts:
    353
  3. tdmowrer

    tdmowrer

    Joined:
    Apr 21, 2017
    Posts:
    605
    Curious for future API planning: what are you trying to do with the texture exactly?
     
  4. kexar66

    kexar66

    Joined:
    Feb 27, 2013
    Posts:
    48
    I just want to save a camera image to a file (like taking a photo with the camera app), without any Unity objects, just the plain camera image.

    Still having trouble doing that. Is there any example code showing how?

    Thank you.
     
  5. tdmowrer

    tdmowrer

    Joined:
    Apr 21, 2017
    Posts:
    605
    Try this:
    Code (CSharp):

    // Assumes: using System.IO; and using UnityEngine; plus fields
    // m_ARCameraBackground (ARCameraBackground), renderTexture (RenderTexture),
    // and m_LastCameraTexture (Texture2D).

    // Copy the camera background to a RenderTexture
    Graphics.Blit(null, renderTexture, m_ARCameraBackground.material);

    // Copy the RenderTexture from GPU to CPU
    var activeRenderTexture = RenderTexture.active;
    RenderTexture.active = renderTexture;
    if (m_LastCameraTexture == null)
        m_LastCameraTexture = new Texture2D(renderTexture.width, renderTexture.height, TextureFormat.RGB24, true);
    m_LastCameraTexture.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
    m_LastCameraTexture.Apply();
    RenderTexture.active = activeRenderTexture;

    // Write to file
    var bytes = m_LastCameraTexture.EncodeToPNG();
    var path = Application.persistentDataPath + "/camera_texture.png";
    File.WriteAllBytes(path, bytes);
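    For context, the snippet assumes renderTexture already exists. A minimal setup sketch (an assumption on my part, not from the original post): a screen-sized RenderTexture created once, with no depth buffer, which turns out to matter later in this thread.

    Code (CSharp):

    // Hypothetical one-time setup for the snippet above, e.g. in Awake().
    // depth = 0 means no depth buffer (see the depth buffer discussion below).
    renderTexture = new RenderTexture(Screen.width, Screen.height, 0, RenderTextureFormat.ARGB32);
    renderTexture.Create();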
     
  6. michaelmilst

    michaelmilst

    Joined:
    Jul 2, 2018
    Posts:
    5

    Trying to do the same, and I used your method, tdmowrer, but it looks like the ARCameraBackground's material has Unity objects in it.
     
  7. tdmowrer

    tdmowrer

    Joined:
    Apr 21, 2017
    Posts:
    605
    Have you overridden the ARCameraBackground's material with a custom material? If not, can you explain what you mean by "Unity objects"? Could you post a screenshot?
     
    mk99park likes this.
  8. michaelmilst

    michaelmilst

    Joined:
    Jul 2, 2018
    Posts:
    5
    I haven't overridden the ARCameraBackground's material with a custom material. By "Unity Objects in it", I mean 3D virtual objects from my scene are shown in the image when I only want an image of what the device's camera sees.

    In this example image which I got from using the ARCameraBackground's material, you can see a cube from my scene.
     

    Attached Files:

    Cudo_Service likes this.
  9. tdmowrer

    tdmowrer

    Joined:
    Apr 21, 2017
    Posts:
    605
    michaelmilst likes this.
  10. michaelmilst

    michaelmilst

    Joined:
    Jul 2, 2018
    Posts:
    5
    Your example definitely works...
    I'll try and figure out the differences and post what I was doing wrong here.
     
  11. michaelmilst

    michaelmilst

    Joined:
    Jul 2, 2018
    Posts:
    5
    Turns out this was my problem:
    If I blit the ARCameraBackground.material to the RenderTexture in the same method as I ReadPixels() and Apply() to my Texture2D, the texture is completely black. So, I turned the method into a Coroutine and added a yield return new WaitForSeconds so that the texture wouldn't be black. If I put the yield after the RenderTexture.active = renderTexture, the image will have Unity GameObjects in it. If I put the yield before that, the image will only show what the camera sees.

    Also, I tested without setting RenderTexture.active = renderTexture, and just as you said, the image had Unity GameObjects in it, and I'm not sure why that happens. Why would the renderTexture not contain Unity GameObjects when it is set as the active render texture, but contain them when it is not?
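    For reference, a minimal coroutine sketch of the ordering michaelmilst describes (field names follow tdmowrer's snippet above; the wait duration is an assumption):

    Code (CSharp):

    // Requires: using System.Collections; plus the fields from the snippet above.
    IEnumerator CaptureCameraImage()
    {
        // Blit the camera background first...
        Graphics.Blit(null, renderTexture, m_ARCameraBackground.material);

        // ...then yield BEFORE touching RenderTexture.active. Yielding after
        // setting it captured scene GameObjects; yielding here captures only
        // what the device camera sees.
        yield return new WaitForSeconds(0.1f); // assumed duration

        var activeRenderTexture = RenderTexture.active;
        RenderTexture.active = renderTexture;
        if (m_LastCameraTexture == null)
            m_LastCameraTexture = new Texture2D(renderTexture.width, renderTexture.height, TextureFormat.RGB24, true);
        m_LastCameraTexture.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
        m_LastCameraTexture.Apply();
        RenderTexture.active = activeRenderTexture;
    }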
     
    Last edited: Aug 19, 2018
  12. tdmowrer

    tdmowrer

    Joined:
    Apr 21, 2017
    Posts:
    605
    Because ReadPixels reads pixels from the active RenderTexture (RenderTexture.active). If you do not set this (or if you yield a frame after setting it), then the active RenderTexture is probably the last frame that was rendered.

    The reason the resulting image is black is probably due to a setting on your RenderTexture. Make sure it doesn't have a depth buffer:
    [Attached image: RenderTexture settings showing "No depth buffer"]
     
    michaelmilst likes this.
  13. michaelmilst

    michaelmilst

    Joined:
    Jul 2, 2018
    Posts:
    5
    Makes sense, thanks for all the help!
     
  14. vrmYavuz

    vrmYavuz

    Joined:
    Mar 7, 2018
    Posts:
    5
    Is it possible to blit the camera view to a render texture even if the camera already renders to a render texture?
     
  15. Kubic75

    Kubic75

    Joined:
    Jan 2, 2017
    Posts:
    83
  16. tdmowrer

    tdmowrer

    Joined:
    Apr 21, 2017
    Posts:
    605
  17. Kubic75

    Kubic75

    Joined:
    Jan 2, 2017
    Posts:
    83
    Thanks for pointing me to the example. Works pretty well.
     
  18. MarcoElz

    MarcoElz

    Joined:
    Jan 28, 2017
    Posts:
    17
    UPDATE:
    I finally got reflections working in AR Foundation. The problem was that Realtime Reflection Probes was disabled in the default Quality settings for Android (Project Settings > Quality).
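    For reference, that Quality setting also has a scripting equivalent, shown in this minimal sketch:

    Code (CSharp):

    // Scripting equivalent of Project Settings > Quality > Realtime Reflection Probes:
    QualitySettings.realtimeReflectionProbes = true;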

    ----

    Hello, I want to get the camera into a RenderTexture to use as a reflection.
    Thanks to tdmowrer I got as far as the RenderTexture, but I cannot make the reflection work.

    What I want to do is get the same behaviour as this repo (https://github.com/johnsietsma/ARCameraLighting), but with AR Foundation instead of ARCore.

    I tried using the built-in Skybox/Panoramic shader and the Skybox/ARSkybox shader from that repo, but I get a black reflection on the mobile device.

    In the Editor I tried with another image and both shaders, and both the skybox and my reflective sphere pick up the texture correctly. But I cannot make it work on the mobile device with the RenderTexture.

    I use a Canvas to show the RenderTexture and it works perfectly, so I am capturing the camera correctly into the RenderTexture.
    And with an ordinary skybox, my reflective sphere works as expected.

    Any idea?

    Thank you.
     
    Last edited: Jan 25, 2019
    Sebastian_Trick3d likes this.
  19. christougher

    christougher

    Joined:
    Mar 6, 2015
    Posts:
    558
    @MarcoElz, care to share how you were able to convert ARCameraLighting to work with ARFoundation? I've gotten it working on Android, but I'm having a hard time with iOS.

     
    Sebastian_Trick3d likes this.
  20. IN_Capital_studios

    IN_Capital_studios

    Joined:
    Oct 21, 2014
    Posts:
    2
    Same here.
     
  21. Sebastian_Trick3d

    Sebastian_Trick3d

    Joined:
    Dec 6, 2018
    Posts:
    3
    Check this out ma peeps! (I hope this helps, this was a challenge for us!)
    Included are a script and a shader, combining the blit snippet from above and the ARCameraLighting project.
    Using 2018.3.8f1

    C# code, from @tdmowrer and from the repo (https://github.com/johnsietsma/ARCameraLighting):

    Code (CSharp):

    using System;
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.XR.ARFoundation;

    [RequireComponent(typeof(ReflectionProbe))]
    public class ARDeviceScreenReflections : MonoBehaviour
    {
        [SerializeField]
        private Camera aRCamera = null;
        [SerializeField]
        [Tooltip("The camera background which controls the camera image.")]
        private ARCameraBackground aRCameraBackground = null;
        [SerializeField]
        private Material skyboxMaterial = null;
        [SerializeField]
        [Tooltip("The RenderTexture to blit the camera image to.")]
        private RenderTexture renderTexture = null;

        public bool IsCapturing { get { return isCapturing; } }

        private Material pastSkyboxMaterial = null;
        private CommandBuffer m_blitCommandBuffer = null;
        private CommandBuffer m_releaseCommandBuffer = null;
        private bool isCapturing;

        private void Awake()
        {
            // Get width and height of the target RenderTexture
            int renderTextureWidth = renderTexture.width;
            int renderTextureHeight = renderTexture.height;

            // Clean up any previous command buffer and event hooks
            if (m_blitCommandBuffer != null)
            {
                aRCamera.RemoveCommandBuffer(CameraEvent.BeforeForwardOpaque, m_blitCommandBuffer);
                aRCamera.RemoveCommandBuffer(CameraEvent.AfterSkybox, m_releaseCommandBuffer);
            }

            // Create the blit command buffer
            m_blitCommandBuffer = new CommandBuffer();
            m_blitCommandBuffer.GetTemporaryRT(WORKING_RENDER_TEXTURE_ID, renderTextureWidth, renderTextureHeight, 0, FilterMode.Bilinear);
            m_blitCommandBuffer.name = "Get ARBackground";

            // arCamera.BlitCameraTexture(m_blitCommandBuffer, workingRenderTextureID);
            m_blitCommandBuffer.Blit(null, WORKING_RENDER_TEXTURE_ID, RenderSettings.skybox);

            // Copy over to the target texture.
            m_blitCommandBuffer.Blit(WORKING_RENDER_TEXTURE_ID, renderTexture);

            // Run the command buffer just before opaque rendering
            aRCamera.AddCommandBuffer(CameraEvent.BeforeForwardOpaque, m_blitCommandBuffer);

            // Clean up the temp render textures
            m_releaseCommandBuffer = new CommandBuffer();
            m_releaseCommandBuffer.name = "Release ARBackground";
            m_releaseCommandBuffer.ReleaseTemporaryRT(WORKING_RENDER_TEXTURE_ID);
            aRCamera.AddCommandBuffer(CameraEvent.AfterSkybox, m_releaseCommandBuffer);

            isCapturing = true;
        }

        private void OnEnable()
        {
            pastSkyboxMaterial = RenderSettings.skybox;
            RenderSettings.skybox = skyboxMaterial;
            ARSubsystemManager.cameraFrameReceived += OnCameraFrameReceived;
        }

        private void Update()
        {
            skyboxMaterial.SetMatrix(WORLD_TO_CAMERA_MATRIX_PROP_ID, aRCamera.worldToCameraMatrix);
        }

        private void OnCameraFrameReceived(ARCameraFrameEventArgs eventArgs)
        {
            BlitToRenderTexture(renderTexture, aRCameraBackground);
        }

        private void OnDisable()
        {
            RenderSettings.skybox = pastSkyboxMaterial;
            ARSubsystemManager.cameraFrameReceived -= OnCameraFrameReceived;
            // Clean up any previous command buffer and event hooks
            if (m_blitCommandBuffer != null && aRCamera != null)
            {
                aRCamera.RemoveCommandBuffer(CameraEvent.BeforeForwardOpaque, m_blitCommandBuffer);
                aRCamera.RemoveCommandBuffer(CameraEvent.AfterSkybox, m_releaseCommandBuffer);
            }
            isCapturing = false;
        }

        public static void BlitToRenderTexture(RenderTexture renderTexture, ARCameraBackground cameraBackground)
        {
            if (renderTexture == null)
                throw new ArgumentNullException("renderTexture");

            if (cameraBackground == null)
                throw new ArgumentNullException("cameraBackground");

            // Copy the camera background to a RenderTexture
            Graphics.Blit(null, renderTexture, cameraBackground.material);
        }

        private static readonly int WORLD_TO_CAMERA_MATRIX_PROP_ID = Shader.PropertyToID("_WorldToCameraMatrix");
        private static readonly int WORKING_RENDER_TEXTURE_ID = Shader.PropertyToID("_ARCameraRenderTexture");
    }
    Shader, from the repo (https://github.com/johnsietsma/ARCameraLighting):

    Code (Shader):

    Shader "AR/ARSkybox"
    {
        Properties
        {
            _LightingTex("Render Texture", 2D) = "white" {}
        }

        CGINCLUDE

        #include "UnityCG.cginc"

        struct appdata
        {
            float4 position : POSITION;
            float3 normal : NORMAL;
            float3 texcoord : TEXCOORD0;
        };

        struct v2f
        {
            float4 position : SV_POSITION;
            float2 texcoord : TEXCOORD0;
        };

        // This relies on a RenderTexture of this name being created in ARCoreCameraRenderTexture.cs.
        sampler2D _LightingTex;
        float4x4 _WorldToCameraMatrix;

        float2 SphereMapUVCoords(float3 viewDir, float3 normal)
        {
            // Sphere mapping. Find reflection and transform into UV coords.
            // Heavily inspired by https://www.clicktorelease.com/blog/creating-spherical-environment-mapping-shader/
            float3 reflection = reflect(viewDir, normal);
            float m = 2. * sqrt(
                pow(reflection.x, 2.) +
                pow(reflection.y, 2.) +
                pow(reflection.z + 1., 2.)
            );
            return reflection.xy / m + .5;
        }

        v2f vert(appdata v)
        {
            // Create a sphere map with a texture whose center is at the viewDir/sphere intersection.
            // The texture is wrapped around the sphere so that the corners meet directly behind the camera.
            // To do this we could operate in static viewDir (0,0,1) space. We always want to look at the center of the texture.
            // When we move the phone, there is no need to change the view direction.
            // When rendering a skybox, the view direction is altered for each face. Grab the world space view direction to each vert,
            //  then reverse the camera's view direction, bringing it back to view space.
            float3 viewDir = -normalize(WorldSpaceViewDir(v.position));
            viewDir = mul(_WorldToCameraMatrix, float4(viewDir, 0));

            v2f o;
            o.position = UnityObjectToClipPos(v.position);
            o.texcoord = SphereMapUVCoords(viewDir, v.normal);

            return o;
        }

        fixed4 frag(v2f i) : COLOR
        {
            return tex2D(_LightingTex, i.texcoord);
        }

        ENDCG

        SubShader
        {
            Tags{ "RenderType" = "Background" "Queue" = "Background" }
            Pass
            {
                ZWrite Off
                Cull Off
                Fog{ Mode Off }
                CGPROGRAM
                #pragma fragmentoption ARB_precision_hint_fastest
                #pragma vertex vert
                #pragma fragment frag
                ENDCG
            }
        }
    }
     
    christougher likes this.
  22. christougher

    christougher

    Joined:
    Mar 6, 2015
    Posts:
    558
    Thx so much for sharing! :D I'm having trouble getting it to work though... :( Does this require anything else from the ARCameraLighting project? I've created and assigned the necessary components for ARDeviceScreenReflections on a Reflection Probe. I allowed realtime reflections in settings... I'm stumped...

    The only way I've gotten it to work is to set the camera to target the CameraRenderTexture and have that assigned to a Raw Image... not ideal. Could you possibly email me a basic scene file with it set up? christougher@hotmail.com. I'd be happy to detail the instructions here for anyone else. Again, thanks so much for sharing!


     
  23. Sebastian_Trick3d

    Sebastian_Trick3d

    Joined:
    Dec 6, 2018
    Posts:
    3
    Okay, I'm not sure, but for us this worked only after fiddling with the reflection probes:
    Make sure you have enabled realtime reflection probes in the settings; that a large enough reflection probe is in the scene; that the meshes use the Blend Probes option; that the probe captures at a resolution equal to your render texture (both width and height); that the reflection probe is realtime; and that the same reflection probe refreshes every frame. We tested on Unity 2018.3.8f1 on an iPhone 8.

    Besides that, check that the environment reflections are also at the same resolution as everything else, AND that the environment reflections source is set to Skybox.
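    If it helps, here is a hedged sketch of applying that checklist from script instead of the Inspector (the component name, size value, and field wiring are assumptions; the checklist above is the authoritative part):

    Code (CSharp):

    using UnityEngine;
    using UnityEngine.Rendering;

    // Hypothetical helper that applies the probe checklist above from script.
    [RequireComponent(typeof(ReflectionProbe))]
    public class RealtimeProbeSettings : MonoBehaviour
    {
        [SerializeField] private RenderTexture renderTexture = null; // the AR camera RenderTexture

        private void Awake()
        {
            var probe = GetComponent<ReflectionProbe>();
            probe.mode = ReflectionProbeMode.Realtime;                 // realtime probe
            probe.refreshMode = ReflectionProbeRefreshMode.EveryFrame; // refresh every frame
            probe.resolution = renderTexture.width;                    // match the render texture (must be a power of two)
            probe.size = new Vector3(100f, 100f, 100f);                // "large enough" volume (assumed value)

            // Environment reflections: source = Skybox, at the same resolution.
            RenderSettings.defaultReflectionMode = DefaultReflectionMode.Skybox;
            RenderSettings.defaultReflectionResolution = renderTexture.width;

            // On each mesh's Renderer, Reflection Probes should be set to Blend Probes,
            // i.e. renderer.reflectionProbeUsage = ReflectionProbeUsage.BlendProbes;
        }
    }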
     
    Last edited: Apr 15, 2019
    christougher likes this.
  24. christougher

    christougher

    Joined:
    Mar 6, 2015
    Posts:
    558
    Thanks so much, I'll check it out!
     
  25. christougher

    christougher

    Joined:
    Mar 6, 2015
    Posts:
    558
    Working!!!! Got it going on Android on my Pixel 2. I couldn't figure out what the problem was with my scene; everything was still showing up black. I finally saw that your RenderTexture was set to No Depth Buffer. After making that change, everything worked as it should.
     
    Last edited: Apr 16, 2019
    Blarp and Sebastian_Trick3d like this.
  26. AlCampbell

    AlCampbell

    Joined:
    Jan 8, 2018
    Posts:
    4
    Quick question on this: are CameraConfigurations compatible with using XRCameraSubsystem.TryGetLatestImage to get the raw image on the CPU? For some reason, setting the CameraConfiguration prevents us from getting an image where we otherwise have no problem.
     
  27. Baraneedharan

    Baraneedharan

    Joined:
    Jan 10, 2019
    Posts:
    4
    Hi tdmowrer,

    I am new to ARFoundation. My requirement is to position 3D content on detected vertical and horizontal QR code poses. In ARKit I am using "ARTextureHandles handles = arSession.GetARVideoTextureHandles();" to get the values. Please advise on the equivalent syntax in ARFoundation for this scenario.
     
  28. JM_CG

    JM_CG

    Joined:
    Apr 20, 2017
    Posts:
    35
    Is blitting the camera texture working in 2019 with AR Foundation? Has it worked for anyone else? I seem to get errors on iOS.
     
  29. eco_bach

    eco_bach

    Joined:
    Jul 8, 2013
    Posts:
    1,601
  30. tdmowrer

    tdmowrer

    Joined:
    Apr 21, 2017
    Posts:
    605
    Sebastian_Trick3d likes this.
  31. eco_bach

    eco_bach

    Joined:
    Jul 8, 2013
    Posts:
    1,601
    Thanks!
    So I am successfully grabbing the ARCameraBackground with Graphics.Blit, but how do I do this without actually rendering the source ARCameraBackground?
    i.e. I only want to see the RenderTexture I am targeting.

    Is it necessary to modify the ARCameraBackground component, or is there another way?
     
    Last edited: Jun 17, 2019
  32. austintaylorx

    austintaylorx

    Joined:
    Aug 13, 2018
    Posts:
    8
    Funnily enough, my issue is the exact opposite: I actually want the Unity objects in my picture, but they're not showing up.

    Code (CSharp):

    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using System;
    using Infrastructure.CoroutineRunner;

    namespace Infrastructure.CCSystem
    {
        public class CameraCaptureSystem : ICameraCaptureSystem
        {
            public static Texture2D m_Texture;

            private RenderTexture renderTexture;
            private Texture2D lastCameraTexture;

            private ARCameraBackground aRCameraBackground;
            private ICoroutineRunner coroutineRunner;

            public CameraCaptureSystem(
                ARCameraBackground aRCameraBackground,
                RenderTexture renderTexture,
                ICoroutineRunner coroutineRunner)
            {
                this.aRCameraBackground = aRCameraBackground;
                this.renderTexture = renderTexture;
                this.coroutineRunner = coroutineRunner;

                RenderTexture.active = renderTexture;
            }

            public void CapturePhoto()
            {
                Graphics.Blit(null, renderTexture, aRCameraBackground.material);

                var activeRenderTexture = RenderTexture.active;
                RenderTexture.active = renderTexture;
                if (lastCameraTexture == null)
                    lastCameraTexture = new Texture2D(renderTexture.width, renderTexture.height, TextureFormat.RGB24, true);
                lastCameraTexture.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
                lastCameraTexture.Apply();
                RenderTexture.active = activeRenderTexture;

                m_Texture = lastCameraTexture;
            }
        }
    }
    I'm on Unity 2019.1.6f1
    AR Foundation 2.2.0
    ARCore 2.1.0

    How can I get objects I spawn in AR to show up in a picture?
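    A hedged sketch of one alternative (untested here, and note the iOS targetTexture issue reported later in this thread): render the AR camera itself into the RenderTexture instead of blitting only the background material, so the frame includes both the camera feed and spawned GameObjects. Field names follow the CameraCaptureSystem above:

    Code (CSharp):

    // Hypothetical variant of CapturePhoto that includes virtual content:
    // Camera.Render() draws the background AND the scene objects into the
    // camera's targetTexture, unlike the background-material blit.
    public void CapturePhotoWithObjects(Camera arCamera)
    {
        var previousTarget = arCamera.targetTexture;
        arCamera.targetTexture = renderTexture;
        arCamera.Render();
        arCamera.targetTexture = previousTarget;

        var activeRenderTexture = RenderTexture.active;
        RenderTexture.active = renderTexture;
        if (lastCameraTexture == null)
            lastCameraTexture = new Texture2D(renderTexture.width, renderTexture.height, TextureFormat.RGB24, true);
        lastCameraTexture.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
        lastCameraTexture.Apply();
        RenderTexture.active = activeRenderTexture;

        m_Texture = lastCameraTexture;
    }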
     
  33. sameel

    sameel

    Joined:
    Dec 4, 2015
    Posts:
    17
    I am getting a black screen instead. Here is what I am trying:

    Code (CSharp):

    Texture2D cameraTexture = new Texture2D((int)width, (int)height, TextureFormat.RGB24, false);
    RenderTexture rt = new RenderTexture((int)width, (int)height, 24, RenderTextureFormat.ARGB32);
    RenderTexture.active = rt;

    arCamera.targetTexture = rt;
    arCamera.Render();
    cameraTexture.ReadPixels(new Rect(0, 0, width, height), 0, 0);
    cameraTexture.Apply();

    RenderTexture.active = null;
    RenderTexture.ReleaseTemporary(rt);
    Destroy(rt);
    File.WriteAllBytes(screenShotPath, cameraTexture.EncodeToPNG());
     
  34. tdmowrer

    tdmowrer

    Joined:
    Apr 21, 2017
    Posts:
    605
    Seems like it should work. The docs suggest using Graphics.Blit to achieve a similar result. Also note that ReadPixels is very slow (~20 ms) and should be avoided if possible. There's a separate API for accessing the camera image on the CPU.
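    For reference, a sketch of that CPU path (using the ARFoundation 2.x-era names XRCameraImage / TryGetLatestImage; later versions rename these to XRCpuImage / TryAcquireLatestCpuImage, so treat the exact identifiers as version-dependent). Requires "Allow 'unsafe' Code" in Player settings:

    Code (CSharp):

    using System;
    using Unity.Collections;
    using Unity.Collections.LowLevel.Unsafe;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARSubsystems;

    public class CpuCameraImageExample : MonoBehaviour
    {
        [SerializeField] private ARCameraManager cameraManager = null; // assumed reference
        private Texture2D m_Texture;

        public unsafe void ReadCameraImage()
        {
            if (!cameraManager.TryGetLatestImage(out XRCameraImage image))
                return;

            var conversionParams = new XRCameraImageConversionParams
            {
                inputRect = new RectInt(0, 0, image.width, image.height),
                outputDimensions = new Vector2Int(image.width, image.height),
                outputFormat = TextureFormat.RGB24,
                transformation = CameraImageTransformation.MirrorY // flip into Unity's texture convention
            };

            var buffer = new NativeArray<byte>(image.GetConvertedDataSize(conversionParams), Allocator.Temp);

            // Convert the native (YUV) camera image to RGB directly into the buffer.
            image.Convert(conversionParams, new IntPtr(buffer.GetUnsafePtr()), buffer.Length);
            image.Dispose(); // release the native image as soon as possible

            var dims = conversionParams.outputDimensions;
            if (m_Texture == null)
                m_Texture = new Texture2D(dims.x, dims.y, TextureFormat.RGB24, false);
            m_Texture.LoadRawTextureData(buffer);
            m_Texture.Apply();

            buffer.Dispose();
        }
    }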
     
  35. x2stone

    x2stone

    Joined:
    Dec 17, 2014
    Posts:
    22
    Hi tdmowrer,
    I don't quite understand: which is the most efficient method to get the camera image on iOS? The one you wrote here (which uses the ReadPixels function you say should be avoided), or the "CPU" method that you link in your last post?
    I just want to get the iPhone camera feed (so, nothing more than what the camera sees) and encode it to MP4 (while seeing the AR stuff of my application on my iPhone).
    Thanks for your explanation.
     
  36. sameel

    sameel

    Joined:
    Dec 4, 2015
    Posts:
    17
    Hi @tdmowrer, I tried using Graphics.Blit and there is no AR object in the image that gets saved.
     
  37. tdmowrer

    tdmowrer

    Joined:
    Apr 21, 2017
    Posts:
    605
    It sounds like you need the camera image on the CPU, so you should use the API I linked for accessing the camera image on the CPU.
     
  38. sameel

    sameel

    Joined:
    Dec 4, 2015
    Posts:
    17
    @tdmowrer, I have tried using the API and also tried using Graphics.Blit, but the result is the same: I am unable to get the AR object in the image that gets saved.
     
  39. tdmowrer

    tdmowrer

    Joined:
    Apr 21, 2017
    Posts:
    605
    The methods discussed here only provide the camera texture; they do not include any virtual content in your scene. What is your use case? Are you just trying to take a screenshot?
     
  40. sameel

    sameel

    Joined:
    Dec 4, 2015
    Posts:
    17
    Yeah, I am trying to take a screenshot from the AR camera and apply an overlay on the final image. So I am assigning a RenderTexture to the AR camera and then using that RenderTexture in another camera to apply the overlay on the screenshot. It works fine on Android, but on iOS the screen turns black as soon as a RenderTexture is assigned to the AR camera.
    Here is what I am trying; it works fine on Android but not on iOS.

    Code (CSharp):

    Texture2D cameraTexture = new Texture2D((int)width, (int)height, TextureFormat.RGB24, false);
    RenderTexture rt = new RenderTexture((int)width, (int)height, 24, RenderTextureFormat.ARGB32);
    RenderTexture.active = rt;
    arCamera.targetTexture = rt;
    arCamera.Render();
    arCamera.targetTexture = null;

    overlayCamera.gameObject.SetActive(true);
    overlayCamera.targetTexture = rt;
    overlayCamera.Render();
    overlayCamera.targetTexture = null;
    overlayCamera.gameObject.SetActive(false);

    cameraTexture.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
    cameraTexture.Apply();
    RenderTexture.active = null;
    RenderTexture.ReleaseTemporary(rt);
    Destroy(rt);
    File.WriteAllBytes(screenShotPath, cameraTexture.EncodeToPNG());
     
  41. sameel

    sameel

    Joined:
    Dec 4, 2015
    Posts:
    17
    The reason I am not using ScreenCapture.CaptureScreenshotIntoRenderTexture(rt) is that I want to save a screenshot at a specific resolution.
     
  42. naokiijobs

    naokiijobs

    Joined:
    Jun 15, 2018
    Posts:
    1
    I'd like to know the relationship between XRCameraIntrinsics.principalPoint and the NativeArray<byte> from XRCameraImage.ConvertAsync.

    According to the documentation, the principal point is relative to the top-left of the image. However, the NativeArray<byte> is vertically flipped (MirrorX) by default.
    https://docs.unity3d.com/Packages/c...ngine.XR.ARSubsystems.XRCameraIntrinsics.html

    When I access the principal point in the NativeArray<byte>, which index is correct?
    Code (CSharp):

    // var x = XRCameraIntrinsics.principalPoint.x;
    // var y = XRCameraIntrinsics.principalPoint.y;
    // Inside the ConvertAsync onComplete callback, indexing the converted
    // NativeArray<byte> data with the conversionParams that were passed in:
    var principalPointData = data[conversionParams.outputDimensions.x * (int)y + (int)x];

    Or,

    Code (CSharp):

    var principalPointData = data[conversionParams.outputDimensions.x * (conversionParams.outputDimensions.y - (int)y) + (int)x];
     
    Last edited: Jan 25, 2020
  43. christougher

    christougher

    Joined:
    Mar 6, 2015
    Posts:
    558
    Sorry if this should be a new topic... but once I've gotten the camera image, is there any way to selectively darken the bright spots of the image? If I have parts of the image that are too white/bright, it messes up my image effects... If I could somehow limit the RGB values to around 150 or 200, that would be perfect... any ideas?

    Here's how I blit to the Rendertexture...

    Code (CSharp):

    void OnCameraFrameReceived(ARCameraFrameEventArgs eventArgs)
    {
        if (renderTexture != null && arCameraBackground != null)
            BlitToRenderTexture(renderTexture, arCameraBackground);
    }

    public static void BlitToRenderTexture(RenderTexture renderTexture, ARCameraBackground cameraBackground)
    {
        // Copy the camera background to a RenderTexture
        Graphics.Blit(null, renderTexture, cameraBackground.material);
    }
    Since I'm targeting a RenderTexture, perhaps editing that afterward would be better? I'm kind of lost here... I'm unfortunately not quite as bright as the camera image ;)
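    One possible direction (a sketch, not something confirmed in this thread): since the camera image already lands in a RenderTexture, you could run a second Graphics.Blit through a small clamp shader that caps the RGB values. The shader name and cap value are assumptions:

    Code (Shader):

    // Hypothetical clamp shader: caps RGB at _MaxValue (~150/255 = 0.59).
    Shader "Hidden/ClampBrightness"
    {
        Properties
        {
            _MainTex("Texture", 2D) = "white" {}
            _MaxValue("Max Value", Range(0, 1)) = 0.6
        }
        SubShader
        {
            Pass
            {
                CGPROGRAM
                #pragma vertex vert_img
                #pragma fragment frag
                #include "UnityCG.cginc"

                sampler2D _MainTex;
                float _MaxValue;

                fixed4 frag(v2f_img i) : SV_Target
                {
                    fixed4 c = tex2D(_MainTex, i.uv);
                    c.rgb = min(c.rgb, _MaxValue); // darken only the over-bright pixels
                    return c;
                }
                ENDCG
            }
        }
    }

    Then blit through a material made from that shader into a second RenderTexture (blitting a texture onto itself is undefined):

    Code (CSharp):

    // After BlitToRenderTexture above; clampMaterial uses Hidden/ClampBrightness.
    Graphics.Blit(renderTexture, clampedRenderTexture, clampMaterial);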
     
    Last edited: Feb 1, 2020
    ROBYER1 likes this.
  44. xan_kent

    xan_kent

    Joined:
    Apr 30, 2017
    Posts:
    1
    I have tried the method without luck. I'm trying to make a coloring book.

    How can I get the AR camera background into a render texture in real time?

    I need to get what my phone camera "sees" into a RenderTexture.

    Inside Unity, with a normal camera, it works without problems, but I can't get the AR background image into a render texture.
     
  45. christougher

    christougher

    Joined:
    Mar 6, 2015
    Posts:
    558
    Hi, as I can't seem to find a way to make AR environment probes update every frame, I'm still trying to use this setup. It still works in ARFoundation 3.0.1, but on ARFoundation 3.1.0, when building and running on an iPhone, I get this error in the Xcode log:

    Metal: Error creating pipeline state (Unlit/ARKitBackground): depthAttachmentPixelFormat is not valid and shader writes to depth

    (null)Internal: JobTempAlloc has allocations that are more than 4 frames old - this is not allowed and likely a leak

    Any ideas how to keep this working? My project relies heavily on constant realtime AR reflections for the visuals to work, and updating them only every ~10 seconds is jarring and counterproductive. For now I'm sticking with ARFoundation 3.0.1, as I'm prioritizing reflections above the other improvements.
     
  46. Jon-at-Kaio

    Jon-at-Kaio

    Joined:
    Oct 17, 2007
    Posts:
    185
    Looking for this (or another) good example of grabbing the camera texture; currently, this is the image I end up with using the code here.

    There don't appear to be any blit examples in the samples repo.
     

    Attached Files:

  47. Eastwing

    Eastwing

    Joined:
    Mar 30, 2013
    Posts:
    26
    Hi austintaylorx,
    Any luck with that?
    Anybody, is there any way to achieve what austintaylorx described?
     
  48. jswy1992

    jswy1992

    Joined:
    Jun 19, 2019
    Posts:
    4
    Hi, can you help me solve my problem in this video? I have created a face geometry in AR (I used ARFoundation) with no texture. How do I get my face texture and map it so it looks like my real face? (See attached screenshot.png.)
     
  49. richard_revesz

    richard_revesz

    Joined:
    Feb 1, 2018
    Posts:
    6
    Hi, I'm having a hard time getting this to work on iOS with URP.
    It works fine on Android, but not on iOS for some reason.

    What I have already tried:

    - I set the RenderTexture to no depth buffer in the Inspector, and also forced it from script in Awake, logging the value before and after. It shows 0 from the beginning as well, so that should be fine.

    - I have also read that for someone the solution was to set the pass to 0 in
    Graphics.Blit(null, renderTexture, m_ARCameraBackground.material, 0);
    I have even created a UI option to set the pass to any value; it does not help.

    - Since OpenGLES is used on Android, I added the (deprecated) OpenGLES API to iOS, below Metal. Does not help.

    - I read on some old Oculus forum that, for some reason, reflection probes and anti-aliasing cannot be used at the same time, so I disabled FXAA on the camera. No use.

    Tested with Unity 2019.3.6f1 + ARFoundation 4.0.0 and 2019.4.1f1 (LTS) + ARFoundation 4.0.2, on an iPhone X.

    Any help is appreciated; I've been struggling with this for a while now.

    Thanks,
    Richard
     
    Last edited: Jun 25, 2020
  50. christougher

    christougher

    Joined:
    Mar 6, 2015
    Posts:
    558
    Hi, if anyone for some reason needs this: the solution was to change the RenderTexture's depth buffer setting from 'No depth buffer' to one of the other settings.