Sharing texture data between kernels in compute shader does not work in Linux?

Discussion in 'Linux' started by DMirauta, Apr 6, 2021.

  1. DMirauta

    DMirauta

    Joined:
    Apr 6, 2021
    Posts:
    2
    Hi, I'm new to compute shaders (and also to this forum, apologies if this is in the wrong place, or if I've jumped the gun in this or any other aspect).

    I have been trying to get the following project to run on my computer all day:
    https://github.com/SebLague/Slime-Simulation
    as I wanted to play around with it, but it has been a nightmarish experience. In retrospect, I should have accepted earlier that no fixes ought to have been required of me, since it would be unlikely for the GitHub repository to contain a non-functional project, but here we are: I've spent all day tweaking that code to make it work, or generally trying to debug it.

    This, as well as the more minimal example I will include below, does not work on my Pop!_OS 20.10 with NVIDIA driver version 460.67 and CUDA version 11.2, either in the editor or in a compiled build. I have also tried Windows as a build target: it does not work in Wine, but it runs just fine if I boot into native Windows 10.

    Code (CSharp):
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.Experimental.Rendering;

    public class Main : MonoBehaviour
    {
        RenderTexture tex1, tex2;
        public ComputeShader shader;

        // Start is called before the first frame update
        void Start()
        {
            tex1 = new RenderTexture(80, 80, 1, GraphicsFormat.R8G8B8A8_UNorm);
            tex2 = new RenderTexture(80, 80, 1, GraphicsFormat.R8G8B8A8_UNorm);

            tex1.enableRandomWrite = true;
            tex2.enableRandomWrite = true;

            tex1.Create();
            tex2.Create();

            transform.Find("Quad").GetComponentInChildren<MeshRenderer>().material.mainTexture = tex1;
            transform.Find("Quad 2").GetComponentInChildren<MeshRenderer>().material.mainTexture = tex2;

            shader.SetTexture(0, "Result1", tex1);

            shader.SetTexture(1, "Result1", tex1);
            shader.SetTexture(1, "Result2", tex2);

            shader.Dispatch(0, 10, 10, 1);
            shader.Dispatch(1, 10, 10, 1);
        }
    }
    Code (HLSL):
    #pragma kernel CSMain
    #pragma kernel CSMain2

    RWTexture2D<float4> Result1;
    RWTexture2D<float4> Result2;

    [numthreads(8,8,1)]
    void CSMain (uint3 id : SV_DispatchThreadID)
    {
        Result1[id.xy] = float4(((float) id.x) / 80.0, ((float) id.y) / 80.0, 0, 1);
    }

    [numthreads(8,8,1)]
    void CSMain2 (uint3 id : SV_DispatchThreadID)
    {
        Result2[id.xy] = Result1[id.xy];
    }
    All I'd like to do is compute some data in the first kernel and use it for further computations in the second. Here I'm just copying the data to confirm I can access it, which it seems I cannot. It would be trivial to combine these two kernels in this minimal example, but less so in the code from Sebastian Lague (GitHub link above).
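    For what it's worth, the combined single-kernel version of this minimal example would look something like the following (a sketch of my own, and `CSMainCombined` is just a name I made up). It sidesteps the cross-kernel sharing entirely, since nothing has to survive between dispatches:

    ```hlsl
    #pragma kernel CSMainCombined

    RWTexture2D<float4> Result1;
    RWTexture2D<float4> Result2;

    [numthreads(8,8,1)]
    void CSMainCombined (uint3 id : SV_DispatchThreadID)
    {
        // Compute the gradient and copy it within the same invocation,
        // so no texture data needs to be shared between separate dispatches.
        float4 value = float4(((float) id.x) / 80.0, ((float) id.y) / 80.0, 0, 1);
        Result1[id.xy] = value;
        Result2[id.xy] = value;
    }
    ```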

    A little more info: the first texture draws fine, but the second is all black. When running in Wine, both are white (and in Windows they both render...). In the code from the GitHub repository (or the version I eventually butchered, anyway), some data can still be accessed, but it seems to be squeezed into the first half of the x index, and the values need to be scaled up by roughly 100x before they become visible.

    Again, sorry if I'm venting in the wrong place, I somewhat suspect this could be a Linux video driver issue, but then again, who knows, definitely not me.
     
  2. DMirauta

    DMirauta

    Joined:
    Apr 6, 2021
    Posts:
    2
    In case it helps anyone: this seems to be an issue when Unity uses the OpenGL Core graphics API. Switching the project to Vulkan fixes it, hallelujah!
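    For anyone else hitting this, the API can be changed under Edit > Project Settings > Player > Other Settings, by unticking "Auto Graphics API for Linux" and moving Vulkan to the top of the list. To confirm which API a build actually ended up on, something like this can be dropped on any GameObject (a sketch; `GraphicsApiLogger` is just a name I made up, but `SystemInfo.graphicsDeviceType` is a real Unity API):

    ```csharp
    using UnityEngine;
    using UnityEngine.Rendering;

    public class GraphicsApiLogger : MonoBehaviour
    {
        void Start()
        {
            // Reports the API Unity is actually running on (e.g. Vulkan
            // vs OpenGLCore), regardless of what the settings request.
            Debug.Log("Graphics API: " + SystemInfo.graphicsDeviceType);

            if (SystemInfo.graphicsDeviceType == GraphicsDeviceType.OpenGLCore)
            {
                Debug.LogWarning("Still on OpenGL Core; the texture sharing issue may reappear.");
            }
        }
    }
    ```

    I believe standalone players also accept a `-force-vulkan` command-line switch, if you want to test without rebuilding.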
     