
What exactly are shaders?

Discussion in 'Shaders' started by MikeyJY, Jul 17, 2020.

  1. MikeyJY

    MikeyJY

    Joined:
    Mar 2, 2018
    Posts:
    530
    I know that a shader tells the PC how to draw things on the screen, but that's vague. I want to learn more about the rendering process. First I searched about .shader files and discovered that they use a variant of HLSL, a shader language developed by Microsoft for DirectX; there's also GLSL for OpenGL, and Nvidia's Cg.

    First I want to know what DirectX and OpenGL are, and then what a shader is.
    I've heard about some ways to render, like ray tracing and ray marching, but I don't know what they are. And of course, what is PBR?
    I want to learn about graphics and writing shaders, but I don't know where to start, so I decided to ask about some things I've heard of but don't know what they actually are.
     
    Last edited: Jul 17, 2020
  2. Invertex

    Invertex

    Joined:
    Nov 7, 2013
    Posts:
    1,546
    That's a lot of topics you're asking about there. I'd recommend just googling the "basics" of each of those things and reading up. For normal shaders, this is a decent resource to read:

    https://catlikecoding.com/unity/tutorials/rendering/part-2/

    DirectX and OpenGL are graphics APIs. A given major version of one is also a specification of what features a GPU must have to qualify as supporting that version of the API, like DX9, 11, 11.1, 12, etc. GPU specification boxes will at least list what versions of DirectX and OpenGL the card supports. OpenGL is a more "open", cross-platform API, while DX is proprietary to Microsoft products.

    Shaders are basically the programs that run on the GPU. They are data-oriented programs.
     
    MikeyJY likes this.
  3. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,336
    The catlike coding tutorials are still fairly technical, intended for someone who wants to learn to write shaders or do basic graphics programming. Trying to get through even the basic tutorial might be overly complex for someone just looking for an answer to "what are shaders".

    The short version of what shaders are: they're simple programs that run on GPUs, taking some input data (mesh vertices, texture data, transform matrices, and other arbitrary numerical data) and outputting a single color value per screen / render target pixel.
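    Just to make that concrete, here's a rough sketch of the two most common shader stages in Unity-style HLSL (simplified; it assumes Unity's UnityCG.cginc include, which provides UnityObjectToClipPos):

    Code (HLSL):
    #include "UnityCG.cginc"

    sampler2D _MainTex; // texture data fed in by the material

    struct appdata { float4 vertex : POSITION; float2 uv : TEXCOORD0; };
    struct v2f     { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };

    // vertex shader: runs once per mesh vertex, applies the transform matrices
    v2f vert (appdata v)
    {
        v2f o;
        o.pos = UnityObjectToClipPos(v.vertex);
        o.uv  = v.uv;
        return o;
    }

    // fragment shader: runs once per covered pixel, outputs one color value
    fixed4 frag (v2f i) : SV_Target
    {
        return tex2D(_MainTex, i.uv);
    }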

    GLSL / HLSL are the shader programming languages for OpenGL and Direct3D respectively. HLSL stands for "High Level Shader Language", and GLSL stands for "OpenGL Shading Language". (The GL in OpenGL stands for "Graphics Library", but no one really says that anymore.) These are intended to be human readable / writable programming languages, similar to C, for describing what you want to do with the data being passed in. This eventually gets compiled into GPU assembly code the GPU actually runs, just like any other C program would need to be compiled to run on a CPU.

    Nvidia's Cg is a long dead shader language & cross compiler intended to compile directly for OpenGL or Direct3D, and is nearly identical to HLSL. Unity used to use it when they first added support for Direct3D (Unity was originally OpenGL only), and a lot of code and documentation still references the term "Cg" in the context of shaders, but Unity has not used Cg for a long time now. The confusion comes from the fact that they switched from Cg to HLSL, but because the two programming languages are so similar they didn't really have to change anything for it all to keep working; the Cg shaders they had all compiled without issue when fed into an HLSL compiler. For OpenGL, as well as Vulkan (basically OpenGL 5.0), Metal (Apple), and several other proprietary graphics APIs for consoles, Unity uses its own cross compilers & code translators that convert HLSL into the forms needed for those platforms. But now I'm probably getting too technical.


    Terms like ray tracing & ray marching describe ways of rendering.

    Ray tracing is the idea that you shoot a ray through the scene and stop at the first thing you hit (or potentially hit several things and figure out which one was the closest). For simple planar geometry, this is pretty simple from a mathematical perspective, but for complex scenes this gets very hard and expensive. And some kinds of things you might want to render don't have a surface to "hit".
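    To give a feel for why the planar case is the easy one, the entire ray vs. infinite plane test is a few lines of math (illustrative HLSL, not tied to any particular renderer):

    Code (HLSL):
    // returns the distance t along the ray to the hit, or -1.0 for a miss
    float RayPlane (float3 rayOrigin, float3 rayDir,    // rayDir normalized
                    float3 planePoint, float3 planeNormal)
    {
        float denom = dot(rayDir, planeNormal);
        if (abs(denom) < 1e-6)
            return -1.0; // ray is parallel to the plane, no hit

        float t = dot(planePoint - rayOrigin, planeNormal) / denom;
        return t >= 0.0 ? t : -1.0; // only count hits in front of the ray
    }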

    Ray marching means stepping a ray through the scene and at each step asking "am I close to / hitting / inside of something?" Think about trying to render a cloud or some other volumetric thing. There's no surface to "hit", so instead you have to take small steps through the data and calculate what color / opacity it is at each point. This is also very expensive.
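    A bare-bones version of that loop looks something like this (SampleDensity is a made-up stand-in for whatever volumetric data you'd actually be rendering, and there's no lighting, just a white "cloud"):

    Code (HLSL):
    // march along the ray in fixed small steps, accumulating color & opacity
    float4 MarchRay (float3 origin, float3 dir, float stepSize, int maxSteps)
    {
        float4 result = 0; // accumulated color (rgb) and opacity (a)
        float3 p = origin;
        for (int i = 0; i < maxSteps; i++)
        {
            float a = SampleDensity(p) * stepSize; // opacity of this step
            // front-to-back blend: each sample is dimmed by what's in front
            result.rgb += (1.0 - result.a) * a;
            result.a   += (1.0 - result.a) * a;
            if (result.a > 0.99) break; // early out once nearly opaque
            p += dir * stepSize;        // take the next small step
        }
        return result;
    }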

    Most GPUs don't do ray tracing yet. Ray marching gets used in limited forms, usually manually done in a shader for certain kinds of effects. But all modern GPUs use rasterization instead. This is a very fast way to take triangles and figure out what pixels they cover on a grid ... like the pixels of a screen.
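    The core of that "which pixels does this triangle cover" question is just a sign test on three edge functions. The real thing is fixed-function hardware, not shader code, but written out it would look something like:

    Code (HLSL):
    // twice the signed area of triangle (a, b, p); the sign says which side
    // of the edge a->b the point p is on
    float EdgeFn (float2 a, float2 b, float2 p)
    {
        return (p.x - a.x) * (b.y - a.y) - (p.y - a.y) * (b.x - a.x);
    }

    bool PixelInsideTriangle (float2 p, float2 v0, float2 v1, float2 v2)
    {
        float e0 = EdgeFn(v0, v1, p);
        float e1 = EdgeFn(v1, v2, p);
        float e2 = EdgeFn(v2, v0, p);
        // inside if all three tests agree in sign (handles either winding)
        return (e0 >= 0 && e1 >= 0 && e2 >= 0) ||
               (e0 <= 0 && e1 <= 0 && e2 <= 0);
    }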


    While this video still goes a bit into the weeds, as it is again intended for someone who wants to learn how to write them, I think it's a little easier to follow for people unfamiliar with shaders:
     
    Last edited: Jul 17, 2020
  4. MikeyJY

    MikeyJY

    Joined:
    Mar 2, 2018
    Posts:
    530
    Thank you for your answer! I'm trying to learn more about rendering and writing shaders. I started a really ambitious project on the default render pipeline, so I couldn't use Shader Graph when it was released. I tried to upgrade my project to URP and HDRP, but the project was destroyed: everything was that annoying pink, and even when I tried Unity's method to upgrade the project materials to HDRP shaders it still wasn't working. So I decided to stay on the non-scriptable render pipeline, which means I have to write my own shaders.
     
  5. MikeyJY

    MikeyJY

    Joined:
    Mar 2, 2018
    Posts:
    530
    And I have a question: if you use Visual Studio, you probably know that it marks compiler errors in your C# code, auto-completes loops and brackets, and indents the lines under if/while/for, etc. Is there any way to set up Visual Studio like that for HLSL? Currently the compiler errors appear only in the Unity console; I wish Visual Studio would place a little "x" in front of the line with the error. Another thing Visual Studio does badly with shaders: some instructions don't end with ";", and when I hit Enter after that kind of line, Visual Studio indents the next line, because it's set up for C, where the only lines not ending in ";" are if, while, for, foreach, etc. It treats .shader files like .cs files and adds a tab where it shouldn't. So I'd like to know if I can make Visual Studio treat HLSL separately from C# and pre-compile shaders to tell me the errors.
     
  6. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,336
    Visual Studio doesn't understand HLSL, and especially doesn't understand Unity's shader file syntax. Unity's shader files are called Shader Lab, and they encapsulate shader code as well as render state and material setup information in a single file. There are things you can download for Visual Studio & VS Code to add syntax highlighting for Shader Lab. Search around for "ShaderLab Visual Studio" or "Unity Shader Visual Studio" and you'll find something.
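    For reference, this is the rough shape of a .shader file. The material properties and render state live in Shader Lab syntax, and the actual HLSL sits between the CGPROGRAM / ENDCG (or HLSLPROGRAM / ENDHLSL) markers:

    Code (ShaderLab):
    Shader "Custom/Example"
    {
        Properties
        {
            _Color ("Tint", Color) = (1,1,1,1) // material setup info
        }
        SubShader
        {
            Tags { "RenderType"="Opaque" }
            Cull Back // render state lives out here, not in the HLSL
            Pass
            {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                fixed4 _Color;

                // the embedded HLSL is the only part the shader compiler sees
                float4 vert (float4 vertex : POSITION) : SV_POSITION
                {
                    return UnityObjectToClipPos(vertex);
                }

                fixed4 frag () : SV_Target
                {
                    return _Color;
                }
                ENDCG
            }
        }
    }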

    Personally I use Sublime Text with basic syntax highlighting and that's it. I don't find this to be a problem, but I'm also aware I'm not an average user. Unfortunately this is about as good as it gets.

    There are also tools for doing basic error checking on HLSL, but be warned those will not work with Unity's shader files, as they expect specific file layouts and file extensions that Unity doesn't use. Plus they don't know how to parse Shader Lab's XML-like structure to find where the HLSL shader code actually is. You're stuck with writing shaders and tabbing back to Unity to see errors.
     
  7. MikeyJY

    MikeyJY

    Joined:
    Mar 2, 2018
    Posts:
    530
  8. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,336
    Visual Studio is going to be doing all of that based on the file extension. If you load up a .txt file or an extension it's unfamiliar with, it's not going to do any of that. That's just adding some features to .shader and probably .cginc files. It probably won't do anything for .hlsl files, which Unity's latest stuff uses instead of .cginc (they're both just straight HLSL), or .compute files.
     
    MikeyJY likes this.
  9. MikeyJY

    MikeyJY

    Joined:
    Mar 2, 2018
    Posts:
    530
    Why are UVs represented by a red-green-yellow-black image? I mean, yellow is 255 green and 255 red, so it makes sense between the two, but the blue is missing.


    I also found UVs represented by this image where green is missing and it shows red, blue, and purple; again, purple is 255 red and 255 blue.
    I couldn't find the image on Google, but look at 2:07 in this video and you will see what I mean:


    Doesn't that mean there should also be an image where red is missing, with blue in the bottom-right, white in the top-right, green in the top-left, and a combination of blue and green in between?
     
  10. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,336
    A 2D texture's UV is just a float2 position coordinate. For Unity, (0.0, 0.0) is the bottom left, (1.0, 1.0) is the top right, and (0.5, 0.5) is the center. This is true regardless of the texture's resolution. Also note that on a repeating texture (which all textures default to being), (0.0, 0.0) and (1.0, 1.0) are actually the same position on the texture, as are (1.0, 0.0) and (0.0, 1.0). Understand these aren't the positions of the corner pixels; they're the positions between the 4 corner pixels.

    Red / green is just an easy way to represent that 0.0 to 1.0 range. Understand that the 0-255 range you're used to seeing for color values is just the raw byte value used to represent a floating point value between 0.0 and 1.0 in a 24 bit color format (8 bits per RGB channel). The shader never sees or writes "255"; it's reading and writing "1.0", and the GPU converts that floating point value to whatever the current render target's format requires to store it.
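    That's all those debug images are doing: writing the UV straight into the red and green channels, something like this (v2f here is any vertex-to-fragment struct carrying an interpolated float2 uv):

    Code (HLSL):
    struct v2f { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };

    fixed4 frag (v2f i) : SV_Target
    {
        // bottom-left (0,0) = black, bottom-right (1,0) = red,
        // top-left (0,1) = green, top-right (1,1) = red + green = yellow
        return fixed4(i.uv.x, i.uv.y, 0.0, 1.0);
    }

    When that gets written to an 8 bit per channel render target, the GPU stores round(saturate(value) * 255.0) for each channel, which is where the 255 you're used to seeing comes from.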

    So, for normal texture UVs, there's no blue because there are only x and y coordinates, so the blue is left at 0.0. The example in the video is representing the raw screen coordinates, which are a float4 in which all 4 components have meaningful data. It's not just a UV; in fact the raw screen coordinates are not a UV at all. It's a specially encoded 4 dimensional position that you can use to calculate a normalized screen space position in a perspective correct way.

    ... try not to worry about that right now as that's getting deep into perspective transforms and homogeneous clip space.
     
  11. MikeyJY

    MikeyJY

    Joined:
    Mar 2, 2018
    Posts:
    530
    Thanks. That's the last question in this thread.