Question: Combining multiple textures together using a compute shader.

Discussion in 'Shaders' started by Alzarothie, Feb 13, 2024.

  1. Alzarothie

    Joined:
    Jun 30, 2021
    Posts:
    2
    Hey, as the title says, I'm trying to create a compute shader that combines multiple textures into a single result texture. Each texture has a corresponding world-space size and position that's taken into account during the calculations. It mostly works, but small changes in resultCanvasPosWS cause some periodic variation in the results. I think this is caused by the input and result textures having different resolutions, and I need to quantize my input position or UV based on those resolutions, but I just can't figure out how to do it.
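    To make the symptom concrete, here is a small standalone sketch (plain Python, not Unity code; the resolution and world-size values are assumptions) of why sub-texel movement causes periodic variation: as the result canvas shifts, each output pixel's sampling point drifts across the input texel grid, so the bilinear blend weight cycles with a period of exactly one input texel in world units.

```python
import math

# Illustrative assumptions, not values from the actual project.
INPUT_RES = 257                        # input texture resolution (one axis)
INPUT_SIZE_WS = 100.0                  # world-space size of the input canvas
TEXEL_WS = INPUT_SIZE_WS / INPUT_RES   # world size of one input texel

def blend_weight(canvas_pos_ws: float) -> float:
    """Fractional texel coordinate sampled at a fixed output pixel,
    i.e. the bilinear blend weight between two neighboring input texels."""
    uv = (canvas_pos_ws % INPUT_SIZE_WS) / INPUT_SIZE_WS  # 0..1 across the input
    texel = uv * INPUT_RES
    return texel - math.floor(texel)

# Shifting the canvas by one full input texel brings the blend weight back
# to where it started, so the artifact repeats with period TEXEL_WS, while
# a half-texel shift lands on a very different weight.
print(abs(blend_weight(0.10) - blend_weight(0.10 + TEXEL_WS)) < 1e-6)  # True
```

    This is why quantizing the position to the texel grid removes the variation: it pins the blend weight to a fixed phase.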
    Code (CSharp):
    #pragma kernel CSMain

    // Input textures
    int textureCount;
    StructuredBuffer<float3> canvasPosWS;
    StructuredBuffer<float3> canvasSizeWS;
    Texture2DArray<float4> inputTextures;
    float2 inputTextureDims; // set from C# (initializers on uniforms are ignored); currently unused

    // Texture that we are drawing on.
    float3 resultCanvasPosWS;
    float3 resultCanvasSizeWS;
    RWTexture2D<float4> resultTexture;

    SamplerState samplerLinearClamp;

    [numthreads(8, 8, 1)]
    void CSMain(uint3 id : SV_DispatchThreadID)
    {
        uint outWidth, outHeight;
        resultTexture.GetDimensions(outWidth, outHeight);
        // Skip threads outside the texture when its dimensions aren't a multiple of 8.
        if (id.x >= outWidth || id.y >= outHeight)
            return;

        // Pixel position in [-0.5, 0.5] relative to the result canvas center.
        const float2 resultNormalizedCoords = float2(id.x / (float)outWidth, id.y / (float)outHeight) - 0.5f;

        float4 resultColor = float4(0, 0, 0, 0);
        for (int i = 0; i < textureCount; i++)
        {
            // World-space pixel position mapped into this input canvas's UV range.
            const float2 canvasRelativePos = (float2(resultCanvasPosWS.x, resultCanvasPosWS.z) +
                resultNormalizedCoords * float2(resultCanvasSizeWS.x, resultCanvasSizeWS.z) -
                float2(canvasPosWS[i].x, canvasPosWS[i].z)) / float2(canvasSizeWS[i].x, canvasSizeWS[i].z);
            // Sample and accumulate only if the pixel lies inside this canvas.
            if (canvasRelativePos.x >= 0 && canvasRelativePos.x <= 1 && canvasRelativePos.y >= 0 && canvasRelativePos.y <= 1)
            {
                resultColor += inputTextures.SampleLevel(samplerLinearClamp, float3(canvasRelativePos, i), 0) + (canvasPosWS[i].y / canvasSizeWS[i].y);
            }
        }

        resultTexture[id.xy] = resultColor;
    }
  2. Alzarothie

    Joined:
    Jun 30, 2021
    Posts:
    2
    I managed to fix it:
    Code (CSharp):
            // Texels per world unit along each axis of the result texture.
            float factorX = resultTexture.width / resultWorldSize.x;
            float factorY = resultTexture.height / resultWorldSize.z;
            // Snap the position to the result texture's texel grid.
            position.x = Mathf.FloorToInt(position.x * factorX) / factorX;
            position.z = Mathf.FloorToInt(position.z * factorY) / factorY;
    Instead of passing a continuous world position to the compute shader, I first snap the position according to the final texture's resolution and the world-space area it is supposed to cover.
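    The snapping can be checked in isolation. A minimal sketch (plain Python rather than Unity C#; the texture resolution and world size are made-up values) showing that any two positions inside the same result texel quantize to an identical shader input:

```python
import math

# Illustrative assumptions, standing in for resultTexture.width and
# resultWorldSize.x from the snippet above.
RESULT_WIDTH = 512
RESULT_SIZE_X = 100.0

def snap(pos_x: float) -> float:
    """Quantize a world-space coordinate to the result texture's texel grid."""
    factor = RESULT_WIDTH / RESULT_SIZE_X  # texels per world unit
    return math.floor(pos_x * factor) / factor

# Sub-texel movement no longer changes the value handed to the shader,
# so the sampling pattern stays fixed until the canvas crosses a texel boundary.
texel_ws = RESULT_SIZE_X / RESULT_WIDTH  # world size of one result texel
print(snap(10.0) == snap(10.0 + 0.5 * texel_ws))  # True
```

    Moving by a full texel does change the snapped value, which is exactly when the output should change.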