
How do I check for Direct3D and flip coordinates in a Compute Shader the correct way?

Discussion in 'Shaders' started by asperatology, Apr 19, 2019.

  1. asperatology

    asperatology

    Joined:
    Mar 10, 2015
    Posts:
    981
    I'm currently using a Compute Shader to render pixels to a Render Texture on a "Direct3D-like platform" (a term used in the Unity documentation). In my case, the Direct3D-like platform is a Windows 10 x64 PC.

    The problem I'm encountering is that, while doing this, the rendering in UV space ends up vertically flipped in Unity.

    In the following Unity Docs article:

    https://docs.unity3d.com/Manual/SL-PlatformDifferences.html#Rendering-in-UV-space

    It suggests that I check _ProjectionParams.x, and if it is less than 0, flip the Y coordinate by doing:
    id.y = 1 - id.y;


    As mentioned in this thread here:

    https://forum.unity.com/threads/global-shader-variables-in-compute-shaders.471211/

    I cannot use or reference _ProjectionParams in a compute shader, because of the following quoted reason:

    Thus, I'm wondering what the proper and correct way is of:
    • Checking whether I'm running the game on a Direct3D-like platform.
    • Flipping the coordinates only if the game is running on a Direct3D-like platform.
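    For the script side of that check, here is a sketch of what I imagine it could look like (GameOfLifeDispatcher, FlipY, and the wiring are my own placeholder names, not a confirmed solution, and SystemInfo.graphicsUVStartsAtTop requires a reasonably recent Unity version):

    ```csharp
    using UnityEngine;

    // Sketch: query the platform on the C# side and hand the result to the
    // compute shader as a uniform, since _ProjectionParams is unavailable there.
    public class GameOfLifeDispatcher : MonoBehaviour
    {
        public ComputeShader shader;
        public RenderTexture result;

        void Start()
        {
            int kernel = shader.FindKernel("GameOfLife");

            // True on Direct3D-like platforms (D3D11/12, Metal),
            // false on OpenGL-like platforms.
            bool flip = SystemInfo.graphicsUVStartsAtTop;
            shader.SetInt("FlipY", flip ? 1 : 0);

            shader.SetTexture(kernel, "Result", result);
            shader.Dispatch(kernel, result.width / 8, result.height / 8, 1);
        }
    }
    ```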
    My Compute Shader code:

    Code (CSharp):
    // Implementation of Conway's Game of Life.
    // 1. Any live cell with fewer than 2 live neighbors dies. Represents "underpopulation".
    // 2. Any live cell with 2 or 3 live neighbors lives on to the next generation. Represents "normal population".
    // 3. Any live cell with more than 3 live neighbors dies. Represents "overpopulation".
    // 4. Any dead cell with exactly 3 live neighbors becomes a live cell. Represents "reproduction".


    // Each #kernel tells which function to compile; you can have many kernels.
    #pragma kernel GameOfLife

    // Size of the Conway's Game of Life canvas.
    float Width;
    float Height;

    // The input texture.
    sampler2D Input;

    // The RenderTexture we output the result to.
    RWTexture2D<float4> Result;

    [numthreads(8, 8, 1)]
    void GameOfLife(uint3 id : SV_DispatchThreadID) {
        // RenderTexture coordinates on Direct3D-like platforms (Windows) will be flipped upside down.
        // Need to re-flip the Y coordinates back around to fix this issue.
        // https://docs.unity3d.com/Manual/SL-PlatformDifferences.html
        // IMPROPER WAY OF DOING THIS. THIS COMPUTE SHADER WILL FAIL!
        //if (_ProjectionParams.x < 0)
        // END IMPROPER WAY.

        // Calculate the current pixel's normalized position.
        float2 position = float2((id.x / Width), (id.y / Height));

        // Calculate the size of one pixel in UV space.
        float2 pixelSize = float2(1.0 / Width, 1.0 / Height);

        // Sample the current pixel's color from the input texture at mip level 0.
        float4 currentPixel = tex2Dlod(Input, float4(position.x, position.y, 0, 0));

        // Accumulate the eight neighboring pixels in the surrounding 3x3 area.
        // Start from the +Y row and go towards the -Y row; the current pixel itself is excluded.
        float4 neighborPixels = float4(0, 0, 0, 0);

        // +Y row
        neighborPixels += tex2Dlod(Input, float4(position.x + pixelSize.x, position.y + pixelSize.y, 0, 0));
        neighborPixels += tex2Dlod(Input, float4(position.x, position.y + pixelSize.y, 0, 0));
        neighborPixels += tex2Dlod(Input, float4(position.x - pixelSize.x, position.y + pixelSize.y, 0, 0));

        // Center row (the current pixel itself is not included).
        neighborPixels += tex2Dlod(Input, float4(position.x + pixelSize.x, position.y, 0, 0));
        neighborPixels += tex2Dlod(Input, float4(position.x - pixelSize.x, position.y, 0, 0));

        // -Y row
        neighborPixels += tex2Dlod(Input, float4(position.x + pixelSize.x, position.y - pixelSize.y, 0, 0));
        neighborPixels += tex2Dlod(Input, float4(position.x, position.y - pixelSize.y, 0, 0));
        neighborPixels += tex2Dlod(Input, float4(position.x - pixelSize.x, position.y - pixelSize.y, 0, 0));

        // Game of Life logic.
        // Check only the red component of the pixel color against the accumulated neighbor value.
        if (currentPixel.r > 0.5) {
            // Cell is alive; check Rule 2.
            if (neighborPixels.r > 1.5 && neighborPixels.r < 3.5) {
                // Roughly 2 to 3 neighbors, with a margin of error of +/-0.5 (2 - 0.5, 3 + 0.5).
                // Cell stays alive, so set the final pixel to white.
                Result[id.xy] = float4(1, 1, 1, 1);
            }
            else {
                // Cell dies (Rules 1 and 3), so set the final pixel to black.
                Result[id.xy] = float4(0, 0, 0, 1);
            }
        }
        else {
            // Cell is dead.
            if (neighborPixels.r > 2.5 && neighborPixels.r < 3.5) {
                // Roughly equal to 3, with a margin of error of +/-0.5 (3 - 0.5, 3 + 0.5).
                // The cell becomes alive thanks to Rule 4.
                Result[id.xy] = float4(1, 1, 1, 1);
            }
            else {
                // Cell stays dead; set the color to black.
                Result[id.xy] = float4(0, 0, 0, 1);
            }
        }
    }
     
    Last edited: Apr 19, 2019
  2. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,329
  3. asperatology

    asperatology

    Joined:
    Mar 10, 2015
    Posts:
    981
    Can the check mentioned above be put in the Compute Shader?

    If not, and knowing that I would have to do the check in the C# code, how should I flip the texture's coordinates in the compute shader?
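    One pattern I can imagine for this (a sketch with placeholder names — FlipY and the uniform setup are my own, not something confirmed in this thread) is doing the check in C# and passing the result into the compute shader as an int uniform, then branching on it in the kernel:

    ```
    // Compute shader side (sketch).
    // FlipY would be set from C# with computeShader.SetInt("FlipY", ...),
    // e.g. based on SystemInfo.graphicsUVStartsAtTop.
    int FlipY;
    float Width;
    float Height;

    [numthreads(8, 8, 1)]
    void GameOfLife(uint3 id : SV_DispatchThreadID)
    {
        float2 uv = float2(id.x / Width, id.y / Height);
        if (FlipY == 1)
        {
            // Mirror the Y coordinate so sampling matches the
            // un-flipped orientation.
            uv.y = 1.0 - uv.y;
        }
        // ... sample Input with uv and write to Result as before ...
    }
    ```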
     
  4. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,329
  5. asperatology

    asperatology

    Joined:
    Mar 10, 2015
    Posts:
    981
    Ok, the check actually does work now.

    Code (CSharp):
    #if SHADER_API_D3D11

    #elif

    // Using these macros like #if guards.

    #else

    #endif
    So now the remaining question is how to flip the texture inside the compute shader.

    I found out that the compute shader in the OP is using the thread ID values as coordinates. I guess this makes everything much harder to do and to manipulate...
     
  6. asperatology

    asperatology

    Joined:
    Mar 10, 2015
    Posts:
    981
    Found the answer myself, and it's just... jarring to say?

    In Unity, probably dating back to 4.x or even earlier, the default Plane primitive has its mesh UV coordinates mapped backwards along the Z axis. I found this in the following blog post:

    https://ericeastwood.com/blog/20/texture-tiling-based-on-object-sizescale-in-unity

    So, setting the texture tiling to (1, -1) with offset (0, 1) is required to get the render texture displayed according to the OpenGL-style coordinate convention Unity assumes, when rendered onto Unity's default Plane primitive. I don't know why it's like this. Maybe there's a good reason why the UV coordinates are mapped in reverse along the Z axis?
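    For anyone else hitting this, the tiling/offset tweak can also be applied from script. A sketch (assuming the material's main texture uses the default _MainTex property; FlipPlaneUV is my own name):

    ```csharp
    using UnityEngine;

    // Sketch: flip the texture vertically on Unity's default Plane primitive
    // by tiling Y with -1, then offsetting Y by 1 so the [0,1] range is kept.
    public class FlipPlaneUV : MonoBehaviour
    {
        void Start()
        {
            Material mat = GetComponent<Renderer>().material;
            mat.mainTextureScale = new Vector2(1f, -1f);
            mat.mainTextureOffset = new Vector2(0f, 1f);
        }
    }
    ```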
     
  7. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,329
    Mesh UVs are a completely different topic from render texture alignment, and they don't change based on the platform.

    DirectX and OpenGL use different UV coordinate conventions for textures: DirectX puts 0,0 at the top left, while OpenGL (and almost everyone else) puts it at the bottom left. Unity solves this by uploading textures to the GPU upside down when using DirectX.

    This might make you think that if the texture UVs start at the top left for DirectX, then surely the top left corner is where rendering starts for DirectX render textures too ... and you'd be wrong! Well, sometimes. DirectX is inconsistent about which corner the 0,0 pixel is in for render textures & buffers, which is why Unity has checks for Direct3D, UNITY_UV_STARTS_AT_TOP, and _ProjectionParams.x in various places in the code to try to correct for that.

    But, like I said before, mesh UVs are something else. That's completely arbitrary. There's no reason why UVs need to be aligned to the world orientation. There's nothing wrong or "upside down" about that. It's unlikely it was even a conscious choice. Most likely someone a decade ago made a plane in some random 3D modelling application and imported it into the project, and whatever default orientation that program had is what got used. And really, the UVs for the plane mesh are rotated 180 degrees from the world xz coordinate alignment; the z isn't flipped by itself. If Unity used a right-handed coordinate system, there would be a good "reason" why they don't match: if they did match, the texture would be mirrored, since Unity assumes the OpenGL UV convention.
     
  8. asperatology

    asperatology

    Joined:
    Mar 10, 2015
    Posts:
    981
    Ok. Going back to the original question about flipping the rendering orientation from upside down to right side up, without relying on texture tiling and texture offsets: is the following the correct way of doing it?

    Assuming SV_DispatchThreadID gives each thread an ID based on its (X, Y) coordinates, which can be used to address a texel, and the (x, y) values in each thread are always positive...

    I first need to convert (x, y) into the range [0...1], and then subtract the Y coordinate from 1.

    Code (CSharp):
            float3 position = float3(id.x / TextureWidth, id.y / TextureHeight, 0.0);
    #if UNITY_UV_STARTS_AT_TOP
            position.y = 1 - position.y;
    #endif
    #if UNITY_REVERSED_Z
            position.z = 1 - position.z;
    #endif
    Or I just need only the top part?
     
  9. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,329
    This is unrelated to UV coordinates. This has to do with depth buffers, projection matrices, and clip space. For compute shaders, unless you’re reading or writing depth, this isn’t something you need to worry about.


    You’re on the right track, but this doesn’t necessarily work out properly. It’s much simpler to just do this:
    Code (csharp):
    #if UNITY_UV_STARTS_AT_TOP
    id.y = TextureHeight - id.y - 1;
    #endif
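    Plugged into the kernel from the OP, that suggestion might look something like this (a sketch only; the OP declares Height as a float, hence the cast, and I haven't verified it against that exact project):

    ```
    [numthreads(8, 8, 1)]
    void GameOfLife(uint3 id : SV_DispatchThreadID)
    {
    #if UNITY_UV_STARTS_AT_TOP
        // Flip the row index up front, so every later read and write
        // addresses the un-flipped orientation.
        id.y = (uint)Height - id.y - 1;
    #endif
        float2 position = float2(id.x / Width, id.y / Height);
        // ... rest of the kernel unchanged ...
    }
    ```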
     
  10. asperatology

    asperatology

    Joined:
    Mar 10, 2015
    Posts:
    981
    Ahhh, I see. Thank you for pointing me in the right direction.