
[SOLVED] Best way to unwrap cubemap to atlas texture?

Discussion in 'Shaders' started by neoshaman, Oct 25, 2019.

  1. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
So basically I'm trying to transfer the content of a temporary cubemap, which captures the scene, into an atlas of flat environment maps (using octahedral mapping).

    My plan was to:
    1. Render a single cubemap at the probe's position.
    2. Read the cubemap and write it, via a custom render texture's shader, to an update zone.
    3. Repeat from 1 until all probes are baked.

    However, documentation and examples about custom render textures are sparse, especially when dealing with update zones.

    This part of the manual says:
    https://docs.unity3d.com/Manual/class-CustomRenderTexture.html
    Code (CSharp):
    customRenderTexture.updateZones = updateZones1;
    customRenderTexture.Update();
    customRenderTexture.updateZones = updateZones2;
    customRenderTexture.Update();
    I'm not sure how to handle that right now; I have no idea if there is support for multiple updates of multiple zones.
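
    What I can tell from the API is that SetUpdateZones() takes a whole array of zones, so at least they can all be declared together. Here is a rough, untested sketch of how I imagine declaring them on a grid (the 4x4 grid and the helper name are just for illustration):
    Code (CSharp):
    // Untested sketch: declare one update zone per tile of a hypothetical 4x4 atlas
    // and push them all with a single SetUpdateZones() call.
    // Zone center/size are in normalized (0..1) texture coordinates.
    CustomRenderTextureUpdateZone[] MakeGridZones(int tilesPerSide)
    {
        var zones = new CustomRenderTextureUpdateZone[tilesPerSide * tilesPerSide];
        float tile = 1f / tilesPerSide;
        for (int y = 0; y < tilesPerSide; y++)
        {
            for (int x = 0; x < tilesPerSide; x++)
            {
                zones[y * tilesPerSide + x] = new CustomRenderTextureUpdateZone
                {
                    updateZoneCenter = new Vector3((x + 0.5f) * tile, (y + 0.5f) * tile, 0f),
                    updateZoneSize   = new Vector3(tile, tile, 0f),
                    rotation  = 0f,
                    passIndex = 0,     // shader pass used for this zone
                    needSwap  = false  // only matters for double-buffered textures
                };
            }
        }
        return zones;
    }
    // usage: IndirectionProbeAtlas.SetUpdateZones(MakeGridZones(4)); IndirectionProbeAtlas.Update();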

    My code currently is:

    Code (CSharp):
    void renderProbe(){
        Camera UVCapture = Capture.GetComponent<Camera>();
        List<CustomRenderTextureUpdateZone> zones = new List<CustomRenderTextureUpdateZone>();
        IndirectionProbeAtlas.GetUpdateZones(zones);

        UVCapture.SetReplacementShader(UV, "RenderType");
        UVCapture.backgroundColor = Color.blue;
        UVCapture.clearFlags = CameraClearFlags.SolidColor;
        int i = 0;
        foreach(GameObject probe in ProbeArray){
            Capture.transform.position = probe.transform.position;
            Capture.transform.rotation = Quaternion.identity;
            UVCapture.RenderToCubemap(SceneCapture);
            //project cubemap to atlas -> in shader:
            //sample the cubemap with per-pixel octahedral mapping, 128 tiles in the atlas

            i++;
        }
        UVCapture.enabled = false;
    }
    Please forgive me for the:
    Code (CSharp):
    int i = 0;    foreach
    It's prototype code.

    I'm not using an array of cubemaps because GLES 2.0 doesn't support them, hence the need to write to an atlas representation to minimize texture fetches.
     
    Last edited: Oct 26, 2019
  2. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Here is my problem: I have a camera that renders to the cubemap. How do I activate it so it counts as rendering a frame, without it rendering to the main window? Where do I hijack the code so my atlas gets filled correctly?

    This is making me mad lol

    Code (CSharp):
    void renderProbe(){
        Camera UVCapture = Capture.GetComponent<Camera>();
        List<CustomRenderTextureUpdateZone> zones = new List<CustomRenderTextureUpdateZone>();
        CustomRenderTextureUpdateZone[] updating = new CustomRenderTextureUpdateZone[1];
        IndirectionProbeAtlas.GetUpdateZones(zones);

        UVCapture.SetReplacementShader(UV, "RenderType");
        UVCapture.backgroundColor = Color.blue;
        UVCapture.clearFlags = CameraClearFlags.SolidColor;

        int i = 0;
        foreach(GameObject probe in ProbeArray){
            //render at position on grid
            Capture.transform.position = probe.transform.position;
            Capture.transform.rotation = Quaternion.identity;
            UVCapture.RenderToCubemap(SceneCapture);

            //project cubemap to atlas -> in shader:
            //sample the cubemap with per-pixel octahedral mapping, 128 tiles in the atlas

            //set the update zone
            updating[0] = zones[i];

            IndirectionProbeAtlas.SetUpdateZones(updating);
            IndirectionProbeAtlas.Update();
            i++;
        }
        UVCapture.enabled = false;
    }
    As expected, I got just one corner...

    upload_2019-10-27_1-40-1.png

    It has ignored all the zones and only cared about the last one!

    How do I make it so it fills everything?
     
  3. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    How do I force a frame?
     
  4. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343
    My honest thought would be ... why bother trying to get custom render textures to do what you want when you could just call Blit() manually on a render texture and get way more control? You can either pass in a scale & offset and clip the UV area outside of 0.0 and 1.0, or you can go even more manual with GL calls and draw your own quad, scaled or using GL.Viewport, to limit the region to render to.
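
    For example, the GL.Viewport route might look something like this (rough sketch typed into the browser, untested; atlas, tileRect, and octMaterial are placeholders):
    Code (csharp):
    // Rough sketch: limit rendering to one tile of the atlas with GL.Viewport,
    // then draw a full-viewport quad with the cubemap-to-octahedron material.
    void BlitToTile(RenderTexture atlas, Rect tileRect, Material octMaterial)
    {
        var prev = RenderTexture.active;
        Graphics.SetRenderTarget(atlas);    // atlas becomes the current render target
        GL.Viewport(tileRect);              // pixel rect of the tile inside the atlas
        GL.PushMatrix();
        GL.LoadOrtho();                     // 0..1 ortho space over the current viewport
        octMaterial.SetPass(0);
        GL.Begin(GL.QUADS);
        GL.TexCoord2(0f, 0f); GL.Vertex3(0f, 0f, 0f);
        GL.TexCoord2(1f, 0f); GL.Vertex3(1f, 0f, 0f);
        GL.TexCoord2(1f, 1f); GL.Vertex3(1f, 1f, 0f);
        GL.TexCoord2(0f, 1f); GL.Vertex3(0f, 1f, 0f);
        GL.End();
        GL.PopMatrix();
        RenderTexture.active = prev;        // restore whatever was bound before
        // if nothing shows up, check the vertex winding or add Cull Off to the shader
    }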
     
  5. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Because I'm converting to an octahedral mapping, and when I evaluated options this was the simplest way to do it. The shader is darn simple with a custom render texture: I just convert the UV position to a normal, read the cubemap, and output the color. I asked in case I missed some nuance on the update stuff.

    Blit seems great if the atlas kept the cubemap faces, but then I'd spend extra space on all 6 faces, which reduces the number of cubemaps I can pack, and octahedral reading is trivial. Unless I missed something. Also, does Blit use more bandwidth? That last point isn't important since updates would be infrequent anyway.

    I'm not too knowledgeable about GL-type stuff; I'll read up on it. Thanks.
     
  6. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343
    A “custom render texture” is just a way to set up automatic Blit() calls to a render texture without having to write any code. As soon as you're having to interact with a custom render texture from script, its usefulness drops quickly. As far as the API calls are concerned, they're identical. A Blit() call is “render to this render texture with this shader pass”.
     
  7. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Okay, it's just that whenever I hear "render", I feel like the whole pipeline gets involved with mesh rendering. I haven't done any low-level shader work, and tutorials never talk about a pixel-only pipeline. To me, custom render textures were the door to pixel-only rendering, texture to texture.

    Do you have any good entry point that would help me?
     
  8. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343
    Code (csharp):
    public class FauxCustomRenderTexture : MonoBehaviour
    {
        public RenderTexture renderTexture;
        public Material material;
        public int shaderPassIndex = -1;

        void Update()
        {
            if (renderTexture == null || material == null)
                return;

            Graphics.Blit(null, renderTexture, material, shaderPassIndex);
        }
    }
    There you go, that implements the basic custom render texture in its entirety.

    For the update zones it appears to be setting some parameters on the material to manipulate the vertex positions in the shader, so it's doing the scale, offset, and rotation there. It just passes in an essentially blank mesh. If your shader is already using the CustomRenderTextureVertexShader as the vertex shader, you could do this instead of the Blit():
    Code (csharp):
    Graphics.SetRenderTarget(renderTexture); // set your render texture as the current render target
    GL.PushMatrix(); // save off the current view & projection matrices
    GL.LoadOrtho(); // set the current view & projection matrices to a basic orthographic matrix
    material.SetPass(0); // set the first pass of your material to be the one that will be rendered
    GL.Begin(GL.TRIANGLE_STRIP); // start setup of triangles
    GL.Vertex3(0f,0f,0f); // 0
    GL.Vertex3(1f,0f,0f); // 1
    GL.Vertex3(1f,1f,0f); // 2
    GL.Vertex3(0f,1f,0f); // 3 these values don't actually matter if you're using the CustomRenderTextureVertexShader
    GL.End(); // actually render the above geometry to the render texture
    GL.PopMatrix();
    One note of warning: the above code won't actually work when using the CustomRenderTextureVertexShader, since that shader assumes a bunch of data is being set that, well, isn't being set.
     
    neoshaman likes this.
  9. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    I'll check all of that; I have also seen the reference recommend CommandBuffer.

    The goal would be to generate the atlas at startup (for now); I wonder if that would work in Start, or if it's tied to the update lifecycle. I'm supposed to try the built-in pipeline before SRP lol, oh well.
     
  10. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343
    It'll work at any time. Awake, Start, Update, whenever.
     
    neoshaman likes this.
  11. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    I'm stumped.
    I can't get anything working.
    Here is my latest code:
    Code (CSharp):
    void setProbeCapture(){
        placeProbe();

        Capture = new GameObject("CaptureScene");
        Capture.AddComponent<Camera>();
        UVCapture = Capture.GetComponent<Camera>();
        UVCapture.SetReplacementShader(UV, "RenderType");
        UVCapture.backgroundColor = Color.blue;
        UVCapture.clearFlags = CameraClearFlags.SolidColor;
        UVCapture.allowMSAA = false;
        //UVCapture.targetTexture = SceneCapture;

        zones = new List<CustomRenderTextureUpdateZone>();
        IndirectionProbeAtlas.GetUpdateZones(zones);
        updating = new CustomRenderTextureUpdateZone[1];
        IndirectionProbeAtlas.Initialize();
    }

    void renderProbe(){
        GameObject probe;
        UVCapture.enabled = true;

        for (int i = 0; i < ProbeArray.Length; i++){
            //render at position on grid
            probe = ProbeArray[i];
            Capture.transform.position = probe.transform.position;
            Capture.transform.rotation = Quaternion.identity;
            UVCapture.RenderToCubemap(SceneCapture);
            //update zone
            updating[0] = zones[i];
            IndirectionProbeAtlas.SetUpdateZones(updating);

            float size = IndirectionProbeAtlas.width / updating[0].updateZoneSize.x;
            Vector2 position = updating[0].updateZoneCenter;

            updatingAtlas(size, position);
            //IndirectionProbeAtlas.Update();
            //Graphics.Blit(null, IndirectionProbeAtlas, atlasTransfer, -1);
        }
        UVCapture.enabled = false;
    }

    void Start(){
        setProbeCapture();
        renderProbe();
    }
    Code (CSharp):
    void updatingAtlas(float size, Vector2 offset){
        Graphics.SetRenderTarget(IndirectionProbeAtlas); // set your render texture as the current render target
        GL.PushMatrix(); // save off the current view & projection matrices
        GL.LoadOrtho(); // set the current view & projection matrices to a basic orthographic matrix
        atlasTransfer.SetPass(0); // set the first pass of your material to be the one that will be rendered
        //float size = 0.0625f; // 1/16
        //float offset = 0f;
        offset = new Vector2(offset.x - (size/2), offset.y - (size/2));
        Vector2 s = new Vector2(size + offset.x, size + offset.y);
        offset /= IndirectionProbeAtlas.width;
        s /= IndirectionProbeAtlas.width;
        Debug.Log(s);
        GL.Begin(GL.TRIANGLE_STRIP); // start setup of triangles
        GL.Vertex3(offset.x, offset.y, 0f); // 0
        GL.Vertex3(s.x,      offset.y, 0f); // 1
        GL.Vertex3(s.x,      s.y,      0f); // 2
        GL.Vertex3(offset.x, s.y,      0f); // 3 these values don't actually matter if you're using the CustomRenderTextureVertexShader
        GL.End(); // actually render the above geometry to the render texture
        GL.PopMatrix();
    }
    What am I doing wrong?

    All I get is a black void. I changed the shader to output debug data:
    Code (CSharp):
    Shader "MAGIC/AtlasTransfer"
    {
        Properties
        {
            _Cube ("Cubemap", CUBE) = "" {}
        }
        SubShader
        {
            Pass
            {
                CGPROGRAM
                //#include "UnityCustomRenderTexture.cginc"
                //#pragma vertex CustomRenderTextureVertexShader
                #pragma vertex vert
                #pragma fragment frag

                #include "UnityCG.cginc"

                struct appdata
                {
                    float4 vertex : POSITION;
                    float2 uv : TEXCOORD0;
                };

                struct v2f
                {
                    float2 uv : TEXCOORD0;
                    float4 vertex : SV_POSITION;
                };

                v2f vert (appdata v)
                {
                    v2f o;
                    o.vertex = UnityObjectToClipPos(v.vertex);
                    o.uv = v.uv; //TRANSFORM_TEX(v.uv, _MainTex);
                    return o;
                }

                samplerCUBE _Cube;

                float3 UnpackNormalFromOct(float2 f)
                {
                    float3 n = float3(f.x, f.y, 1.0 - abs(f.x) - abs(f.y));
                    float t = max(-n.z, 0.0);
                    n.xy += n.xy >= 0.0 ? -t.xx : t.xx;
                    return normalize(n);
                }

                fixed4 frag (v2f i) : COLOR //(v2f_customrendertexture i) : COLOR
                {
                    float2 g = i.uv;
                    float3 normal = UnpackNormalFromOct(g);
                    //float3 normal = i.localTexcoord.xyz;
                    return half4(normal, 1); //texCUBE(_Cube, normal);
                }
                ENDCG
            }
        }
    }
     
  12. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    I have seriously no goddamn idea; I tried every command and it stays black.
     
  13. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    By pushing renderProbe into Unity's Update loop, I got:
    upload_2019-11-4_7-31-33.png

    I'm even more confused. The colored ball at the side is the intended debug color. I tried applying it to the whole texture UV instead of a zone as a debug... nothing works and this is even more puzzling...
     
  14. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Some minor progress, still baffled by the actual rendering ...
    upload_2019-11-4_7-38-2.png
     
  15. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343
    I could have the quad vertices in the wrong order. You might also try adding
    Cull Off
    to your shader to see if it's upside down.
     
  16. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Your code is verbatim the example Unity gives, but I'll try that.
     
  17. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343
    Your mistake was assuming my code (or Unity's documentation) was correct. ;) Also I forgot to add UVs since I was thinking about it from the point of view of the custom render texture vertex shader which constructs UVs in the shader.

    Code (CSharp):
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;

    [ExecuteInEditMode]
    public class CustomBlit : MonoBehaviour
    {
        public RenderTexture renderTexture;
        public Material material;

        // Update is called once per frame
        void Update()
        {
            if (renderTexture == null || material == null)
                return;

            Graphics.SetRenderTarget(renderTexture); // set your render texture as the current render target
            GL.PushMatrix(); // save off the current view & projection matrices
            GL.LoadOrtho(); // set the current view & projection matrices to a basic orthographic matrix
            material.SetPass(0); // set the first pass of your material to be the one that will be rendered
            GL.Begin(GL.TRIANGLE_STRIP); // start setup of triangles

            // 0
            GL.TexCoord(new Vector3(0f,0f,0f));
            GL.Vertex3(0f,0f,0f);

            // 1
            GL.TexCoord(new Vector3(0f,1f,0f));
            GL.Vertex3(0f,1f,0f);

            // 2
            GL.TexCoord(new Vector3(1f,0f,0f));
            GL.Vertex3(1f,0f,0f);

            // 3
            GL.TexCoord(new Vector3(1f,1f,0f));
            GL.Vertex3(1f,1f,0f);

            GL.End(); // actually render the above geometry to the render texture
            GL.PopMatrix();
        }
    }
     
  18. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Doh! Makes sense for the UVs, how did I not notice that :oops:
     
  19. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    I'm completely lost. If I don't run renderProbe() in the update loop, it turns out blank; otherwise it turns out with just a little corner. That corner doesn't have the same distribution of colors, so something is happening...

    upload_2019-11-4_22-33-27.png

    Maybe it's Unity's clip-pos function, I reasoned... let's pass v.vertex straight to o.vertex...
    upload_2019-11-4_22-37-2.png

    :eek:
     
  20. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    I narrowed something down: the winding is definitely off, but so are the UVs... and the size and offset too...

    But it still doesn't run if:
    1. I use it in Start
    2. I do only one call in Update
    :confused:
     
  21. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Took care of the atlasing distribution
    upload_2019-11-4_23-21-11.png

    which took care of the projection issue
    upload_2019-11-4_23-22-48.png


    That's done

    BUT I still have to precompute it once instead of every frame, and there I have no idea for now!
     
  22. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    So I tried this:
    Code (CSharp):
    int f = 0;
    void Update(){
        if (f == 1)
        {
            renderProbe();
            print("done");
        }
        ++f;
        print(f);
    }
    For f == 0 I get a black screen (before the first frame gets rendered). My hypothesis is that it only works once the render phase of the Unity lifecycle is reached, so it behaves like a souped-up custom render texture. Since the created camera can capture the scene from within Start, I guess it might also be a camera thing, i.e. the main camera isn't set up until the first frame, so I need to tie the compute to a camera first in some way.
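
    For now the only workaround I can think of is to defer the bake by one frame, something like this rough, untested sketch (replacing the Start() from my earlier script):
    Code (CSharp):
    // Untested sketch: run Start() as a coroutine so the bake happens only after
    // Unity has rendered the first frame (needs "using System.Collections;").
    IEnumerator Start()
    {
        setProbeCapture();
        yield return new WaitForEndOfFrame(); // wait until the first frame has been rendered
        renderProbe();                        // bake the atlas once, after that frame
    }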
     
  23. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Doesn't HDRP atlas probes? You might want to poke about in there.
     
  24. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    HDRP uses hardware cubemap ARRAYs, i.e. high-end machines only (unless I'm reading it wrong), which aren't accessible in GLES 2.0, so I had to do a manual implementation. That's one step closer to cheap GI, assuming the results are good.

    Also, I haven't checked the new HDRP code either (the octahedral mapping is from the SRP source code).
     
  25. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    What the hell are you doing targeting ES 2.0? It's 21% of devices, largely restricted to third-world markets. You can't even get these phones in Europe or the US any more off the shelf, or at the least you have to dig real deep into bargain bins.

    The people running these phones aren't paying customers. In addition, even if you are convinced it's worth supporting, you will find a way higher incidence of bug reports and support costs. For one guy I think it's pure suicide.
     
  26. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Yeah, that's me, you are spot on :p I'm within that market, that's my phone (it cost me 1/4 of my income, at 50+€)!


    Also, if it runs on GLES 2.0, you can bet it will work MUCH better on higher-end hardware. Doesn't count as a loss.
     
    Last edited: Nov 5, 2019
  27. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343
    Most curious. I would expect RenderToCubemap() to be immediate, like Blit() or DrawMeshNow(), and not wait for the main render loop. I haven't used RenderToCubemap() in a few years, but I seem to remember being able to use the render texture from it in literally the next line of code. Maybe I'm mis-remembering, or something has changed?
     
  28. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    NO, RenderToCubemap() is immediate, always was.
    But the custom GL thingy to bypass the custom render texture is not...
    I need to figure out what the deal is with that.
     
  29. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343
    That makes no sense. The GL.End() function is literally a direct call to the GPU to render immediately. I render with that kind of stuff all the time and something would be very broken if the result wasn't finished by the next line of code. Very puzzled.
     
    neoshaman likes this.
  30. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Just in case someone wants to check that I'm not crazy, here is the current state of the prototype.
     

    Attached Files:

    Last edited: Nov 5, 2019
  31. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Also, I forgot to say I'm on Unity 2019.3.0b4.
     
  32. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Okay, I added a
    Debug.Log(Camera.current);
    in the Update function and it returns null.
    Is that normal? Does GL need a declared camera? I'm still testing stuff.
     
  33. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Okay, that's not camera related... it still returns null in the update loop, after successfully drawing to the texture.
     
  34. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    I tried on 2017.1.1f1 too, and it has the exact same behavior that far back.

    It doesn't make sense to me, but it seems at least one update cycle is needed, and maybe this is only apparent because that's ALL I do in this project, since there is no other object? But even then, I spawn many objects, create a camera and do thousands of camera updates... so the GPU is at least primed... I'm at a loss right now; I'll probably file it as a bug.


    I added a new package for anyone to test, with a scene all set up this time :D
     

    Attached Files:

  35. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    I'm opening a new thread; the atlas part is solved.
     
  36. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    @bgolus
    I found the issue: if I replace the custom render texture with a plain render texture, I don't have the problem. My guess is that custom render textures aren't initialized before the first update...

    First rule of forum questions: if you ask about a problem you've been stuck on for days, you are highly likely to find the answer right after you ask.
     
  37. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343
    Oh, yeah, I didn’t realize you were still trying to use a custom render texture. Yeah, just don’t use those at all. Hell, I wouldn’t even make a render texture asset if the goal is to bake this out to a saveable texture asset. Just make the render textures in the script, render to it, Blit to it, copy the contents to a texture asset (or create a new one) and release the render textures.
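    Roughly something like this (untested sketch; octMaterial and atlasSize are placeholders):
    Code (csharp):
    // Untested sketch: temporary render texture -> Blit with the conversion material
    // -> read back into a Texture2D -> release the temporary.
    Texture2D BakeAtlas(Material octMaterial, int atlasSize)
    {
        RenderTexture rt = RenderTexture.GetTemporary(atlasSize, atlasSize, 0, RenderTextureFormat.ARGB32);
        Graphics.Blit(null, rt, octMaterial);         // render the conversion shader into the temp RT

        RenderTexture prev = RenderTexture.active;
        RenderTexture.active = rt;
        Texture2D baked = new Texture2D(atlasSize, atlasSize, TextureFormat.RGBA32, false);
        baked.ReadPixels(new Rect(0, 0, atlasSize, atlasSize), 0, 0); // copy GPU -> CPU
        baked.Apply();
        RenderTexture.active = prev;

        RenderTexture.ReleaseTemporary(rt);           // done with the temporary
        return baked;
    }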
     
    neoshaman likes this.
  38. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    It's runtime-only, no need to save for now; I could probably have a few "jumpstart" textures in the long run, if the experiment is successful.

    I was trying to find the most convenient way to experiment, and since custom render textures take care of themselves, I thought I would save time figuring out the intermediate steps... how has that turned out so far? lol
    They have a last chance with the light computation :D