Can anyone use Compute Shaders through the Linux editor?

Discussion in 'Linux' started by CellGames, Dec 11, 2015.

  1. CellGames

    CellGames

    Joined:
    Dec 11, 2015
    Posts:
    3
    I'm new to Unity, and was looking to use Compute Shaders to do some procedural texture generation.
    I'm using the Linux Editor, and I'm having trouble getting any compute shaders to run. Even example scripts fail (as far as I can tell), both in the editor and after building.
    Here's what I'm trying to do:

    I have a Quad using the default (white) material, with a script attached that calls Dispatch on a ComputeShader, which I expect to change the Quad's texture to red.
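
    The script and shader are roughly like this (a minimal sketch; the kernel name, texture size, and thread counts are just what I happened to use):

    // KernelExample.cs -- minimal sketch of the setup described above
    using UnityEngine;

    public class KernelExample : MonoBehaviour
    {
        public ComputeShader shader; // the .compute asset, assigned in the Inspector

        void Start()
        {
            // Create a texture the compute shader is allowed to write to.
            RenderTexture tex = new RenderTexture(256, 256, 0);
            tex.enableRandomWrite = true;
            tex.Create();

            int kernel = shader.FindKernel("CSMain");
            shader.SetTexture(kernel, "Result", tex);
            shader.Dispatch(kernel, 256 / 8, 256 / 8, 1); // matches [numthreads(8,8,1)]

            GetComponent<Renderer>().material.mainTexture = tex;
        }
    }

    // KernelExample.compute -- fills the texture with red
    #pragma kernel CSMain

    RWTexture2D<float4> Result;

    [numthreads(8,8,1)]
    void CSMain (uint3 id : SV_DispatchThreadID)
    {
        Result[id.xy] = float4(1, 0, 0, 1);
    }
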
    In the editor, no matter how simple the Compute Shader is, I get this error:

    Platform does not support compute shaders
    UnityEngine.ComputeShader:Dispatch(Int32, Int32, Int32, Int32)
    KernelExample:Start() (at Assets/KernelExample.cs:9)

    I was told that building and running the application should work, since Unity translates the HLSL shader into GLSL and runs that, but the Quad stays white.

    Is anyone able to help me diagnose the issue and get a compute shader off the ground? A 'confirmed working' Script/ComputeShader example would also be great.

    In another attempt, I load the TextureExample.unity scene from here; the texture container does pop up, but it looks something like this...

    https://imgur.com/a/sZILS

    The first four images are from the editor after hitting the 'Play' button (it can display differently each run), and the last two are the results from running the build. The brown square appeared only once, though.

    My config:

    Unity Editor 5.2.2f1
    Ubuntu 14.04.3 LTS
    GPU: AMD 390x

    product: Hawaii XT [Radeon R9 290X]
    vendor: Advanced Micro Devices, Inc. [AMD/ATI]
    configuration: driver=fglrx_pci latency=0

    OpenGL vendor string: Advanced Micro Devices, Inc.
    OpenGL version string: 4.4.13374 Compatibility Profile Context 15.20.1013
    Build Info / Player Settings:

    https://imgur.com/a/lQYQf
     
    Last edited: Dec 11, 2015
  2. LukaKotar

    LukaKotar

    Joined:
    Sep 25, 2011
    Posts:
    394
    Unity uses an outdated version of OpenGL at the moment (2.1, if I'm not mistaken). I'm pretty sure compute shaders are not supported. We'll just have to wait for 4.x support.

    Edit: So I did a bit of research: Unity 5.3 added support for OpenGL 3.2/4.1; however, 5.2.2 (the current Linux editor version) uses version 2.1, which doesn't support compute shaders.

    Edit 2: Documentation page for OpenGL Core:
    Which means it supports OpenGL up to version 4.5 as well (compute shaders need 4.3). It may not make a difference for this thread, but I just wanted to correct myself. Hopefully the Unity 5.3 editor gets ported to Linux soon.
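
    If you want to double-check what your editor or player is actually running on, a quick (untested) sketch like this will print it:

    using UnityEngine;

    public class GfxInfo : MonoBehaviour
    {
        void Start()
        {
            // e.g. "OpenGL 2.1 ..." on the current Linux editor
            Debug.Log(SystemInfo.graphicsDeviceVersion);
            // false on the GL2 backend; compute needs GL 4.3+
            Debug.Log(SystemInfo.supportsComputeShaders);
        }
    }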
     
    Last edited: Dec 11, 2015
  3. CellGames

    CellGames

    Joined:
    Dec 11, 2015
    Posts:
    3
    Damn. Compute shaders are the one feature I absolutely need for my project. Does anyone know if I can file a feature request, or find out whether OpenGL 3.2/4.1 support is in the pipeline?
     
  4. cician

    cician

    Joined:
    Dec 10, 2012
    Posts:
    233
    While nothing has been said explicitly about the Linux editor yet, compute shaders should now be supported on the Linux target in 5.3 with the glcore backend. I too am eager to use compute shaders. For now I'm doing my GPGPU stuff the old-school way with render-to-texture, but shader model 3 is pretty... limited.
     
  5. CellGames

    CellGames

    Joined:
    Dec 11, 2015
    Posts:
    3
    Interesting, can you go into a little more detail regarding 'render to texture'? What are its capabilities? Is there any example code using it for GPGPU stuff, or a tutorial I should look at?

    I saw this:
    but it wasn't clear how I could use it for procedural generation or GPGPU stuff (e.g. Conway's Game of Life).
     
  6. cician

    cician

    Joined:
    Dec 10, 2012
    Posts:
    233
    There are lots of resources on the net, though not specific to Unity. You just need to search for GPGPU and limit yourself to veeery old tutorials/articles/papers.

    Basically, your inputs are one or more of meshes, textures, and uniform values; you use the usual GPU pipeline to process the data in shaders and output to a RenderTexture off-screen. I recommend familiarizing yourself with CommandBuffers from the start and avoiding the old "immediate mode" way (that is, the Graphics.* and GL.* stuff). If the purpose is to learn, then I recommend trying it out; I only understood some compute shader concepts after I tried it the RenderTexture way. A rough sketch of the basic ping-pong pattern is below.
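
    Something like this is the basic pattern (untested sketch; all names here are made up for illustration, and the step material is an ordinary fragment shader that computes the next state from the current one):

    using UnityEngine;
    using UnityEngine.Rendering;

    // One GPGPU step per frame via render-to-texture, ping-ponging two buffers.
    public class PingPongGpgpu : MonoBehaviour
    {
        public Material stepMaterial; // fragment shader reading _MainTex, writing the next state
        RenderTexture stateA, stateB;

        void Start()
        {
            stateA = NewState();
            stateB = NewState();
        }

        RenderTexture NewState()
        {
            RenderTexture rt = new RenderTexture(256, 256, 0);
            rt.filterMode = FilterMode.Point; // exact texel reads, no filtering
            rt.Create();
            return rt;
        }

        void Update()
        {
            // Read the current state from A, write the next state into B.
            CommandBuffer cmd = new CommandBuffer();
            cmd.Blit(stateA, stateB, stepMaterial);
            Graphics.ExecuteCommandBuffer(cmd);
            cmd.Release();

            // Swap so B becomes the current state, and show it.
            RenderTexture tmp = stateA; stateA = stateB; stateB = tmp;
            GetComponent<Renderer>().material.mainTexture = stateA;
        }
    }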

    Note that there are things you can do this way, albeit less easily and less efficiently than with compute shaders, and things you simply cannot do at all.
    For example, I used GPGPU to calculate the lookup textures in my Pre-Integrated Skin shader. People have even done ray tracing on shader model 2/3 hardware back in the day, but don't ask me how :p
    The lack of native integer support makes things difficult but can be worked around for most problems, while the lack of random writes, counters, and indexed array writes makes many problems plain impossible to solve.
    Also, for runtime use you'll probably want the async buffer read available from 5.3, IIRC, unless all your data is meant to stay in GPU memory.
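
    For the Game of Life example asked about above, the step material's fragment shader just counts live neighbours and applies the rules. A rough sketch (untested, names made up):

    Shader "Hidden/LifeStep"
    {
        Properties { _MainTex ("State", 2D) = "black" {} }
        SubShader
        {
            Pass
            {
                CGPROGRAM
                #pragma vertex vert_img
                #pragma fragment frag
                #include "UnityCG.cginc"

                sampler2D _MainTex;
                float4 _MainTex_TexelSize; // xy = 1/width, 1/height

                fixed4 frag (v2f_img i) : SV_Target
                {
                    // Count live neighbours (a cell is alive if its red channel > 0.5).
                    float n = 0;
                    for (int y = -1; y <= 1; y++)
                        for (int x = -1; x <= 1; x++)
                            if (x != 0 || y != 0)
                                n += tex2D(_MainTex, i.uv + float2(x, y) * _MainTex_TexelSize.xy).r > 0.5 ? 1 : 0;

                    float alive = tex2D(_MainTex, i.uv).r > 0.5 ? 1 : 0;
                    // Conway's rules: survive with 2-3 neighbours, birth with exactly 3.
                    float next = alive > 0.5 ? ((n == 2 || n == 3) ? 1 : 0) : (n == 3 ? 1 : 0);
                    return fixed4(next, next, next, 1);
                }
                ENDCG
            }
        }
    }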
     
  7. cician

    cician

    Joined:
    Dec 10, 2012
    Posts:
    233
    Today I was delighted to discover that tessellation and compute shaders are actually working in the 5.3 build. Contrary to the release notes, the glcore backend is used. If it's not, you can try forcing it from the Player Settings (uncheck the automatic graphics API and put OpenGLCore first) or, IIRC, launching with -force-glcore. :)
     
    LukaKotar likes this.