
Question: Shader function same input, but different results?

Discussion in 'Shaders' started by NSDuo, Jul 28, 2022.

  1. NSDuo

    Joined: Jul 7, 2018
    Posts: 7
    Hello again Unity Forum,

    I've been working on a water system for my game, and it's going well, but I've run into a problem that I can't seem to work around.

    Basically, I have a shader function that gives different results when given the same input from different sources (weird, right?).

    This is happening in my wave function, which lives in an HLSL file that is included in the shaders for the water surface, the underwater surface, and the effect plane that is placed in front of the camera and handles underwater rendering. The wave function uses a noise texture (offset over time) as the UV for a gradient texture and looks like this:
    Code (CSharp):
    half colorCycle(sampler2D noise, sampler2D gradient, half speed, half offset, half2 uv)
    {
        half noiseSample = tex2Dlod(noise, half4(uv, 0, 0)).r;
        // I also implemented unity's gradient noise from shader graph as a noise source
        //half noiseSample = gradientNoise(uv, 20);

        half timeModifier = frac((_Time.y + offset) * speed);
        noiseSample += timeModifier;

        return tex2Dlod(gradient, half4(noiseSample, 0.5, 0, 0)).r;
    }
    This creates a flexible and decent-looking wave effect that works well from a visual standpoint. However, my problem occurs when I plug the world space coordinates into each shader. For some reason I can't understand, giving the world x and z coordinates to the wave function in the three shaders gives me results that are close, but don't match.

    Here the wave on the effect plane is coming up too low on the left side, completely ruining the illusion. Sometimes it comes up higher than the water surface too.

    The effect plane determines where the water surface is with a point-to-plane operation, then adjusts the result with the colorCycle function to find the y offset.
    Code (CSharp):
    // calculate point-to-plane and find the water surface
    half waterLine = colorCycle(_WaveTexture, _WaveGradient, _CycleSpeed, _CycleOffset, i.positionWS.xz * _WaveTexture_ST.xy + _WaveTexture_ST.zw);
    half3 sdf = dot(i.positionWS, half3(0, 1, 0)) - (_EdgeAdjust + waterLine * _WaveHeight);
    clip(-sdf);
    In the water surface, the result of the colorCycle function is simply added to the y value of the vertex in the vertex shader.
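
    For reference, the vertex stage looks roughly like this (a simplified sketch, not my exact shader; the struct fields and matrix boilerplate are just generic built-in pipeline code, and the tiling/offset is assumed to match the effect plane snippet above):
    Code (CSharp):
    struct appdata { float4 vertex : POSITION; float2 uv : TEXCOORD0; };
    struct v2f { float4 pos : SV_POSITION; float3 positionWS : TEXCOORD0; float2 uv : TEXCOORD1; };

    v2f vert(appdata v)
    {
        v2f o;

        // world position of the vertex, also used as the wave uv
        float3 positionWS = mul(unity_ObjectToWorld, float4(v.vertex.xyz, 1.0)).xyz;

        // same world xz -> uv mapping as the effect plane (assuming matching tiling/offset)
        half2 waveUV = positionWS.xz * _WaveTexture_ST.xy + _WaveTexture_ST.zw;
        half wave = colorCycle(_WaveTexture, _WaveGradient, _CycleSpeed, _CycleOffset, waveUV);

        // displace the vertex vertically by the wave height
        positionWS.y += wave * _WaveHeight;

        o.positionWS = positionWS;
        o.pos = mul(UNITY_MATRIX_VP, float4(positionWS, 1.0));
        o.uv = v.uv;
        return o;
    }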

    I figured that because the water and the effect plane both call the above colorCycle function with the fragments' world positions passed in as the UVs, the effect plane would at least be clipped close to the water surface.

    I wanted to try one more method that I thought of before asking for help here. I made a new scene and new shaders to replicate the functions of the original ones. In this scene I left the water surface the same, with the vertices adjusted in the vertex stage, but for the effect plane I took the world position again, created a new position offset by the colorCycle result, and then compared the new position with the old one.

    This did not work. I knew that, due to the resolution difference between performing calculations per vertex and per fragment, there would be texels that would not affect the vertex height. However:

    On closer inspection, I found that the colors generated on each object don't match each other. This is why the waves don't match up, but I'm confused as to why similar world positions can give wildly different results.

    So now I'm completely confused. I was absolutely sure that using world position coordinates would yield at least somewhat consistent results, as well as accounting for rotation, but now I don't know how to make this work.

    I still feel that the point-to-plane method has the potential to work, as it has occasionally given me somewhat correct results, but what can I do to get the correct height from the wave function?
     


  2. kruskal21

    Joined: May 17, 2022
    Posts: 68
    I am a bit curious about this line:
    half3 sdf = dot(i.positionWS, half3(0, 1, 0)) - (_EdgeAdjust + waterLine * _WaveHeight);
    I haven't fully grasped what this line does, but are you certain "dot(i.positionWS, half3(0, 1, 0))" is what you mean to do here? The dot product is usually used to calculate the similarity between two unit-length direction vectors, and using a position vector here doesn't make sense in most cases.
     
  3. NSDuo

    Joined: Jul 7, 2018
    Posts: 7
    Yes, this is my point-to-plane operation. I learned about the method from this video:


    This line is what allows my effect plane to know where the water plane is in world space. My idea is that the wave function, when given the world space x and z coords, can raise the y value consistently with the matching wave function in the water surface object.
     
  4. bgolus

    Joined: Dec 7, 2012
    Posts: 12,350
    This is equivalent to i.positionWS.y. Nothing particularly crazy there.
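    (Expanding it out: dot((x, y, z), (0, 1, 0)) = x*0 + y*1 + z*0 = y, so that line just measures how far the fragment's world height is above the adjusted water line, and clip(-sdf) discards everything above it.)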

    I'd output the value you get back from the noise texture / colorCycle to see if those line up. But I suspect the issue is that the world y input is different and that's what's causing the confusion.
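
    For example, something along these lines in both fragment shaders, using the names from the snippets above (just a sketch, not your exact code):
    Code (CSharp):
    // debug: output the wave value as grayscale instead of the normal color,
    // so the water surface and the effect plane can be compared directly
    half wave = colorCycle(_WaveTexture, _WaveGradient, _CycleSpeed, _CycleOffset, i.positionWS.xz * _WaveTexture_ST.xy + _WaveTexture_ST.zw);
    return half4(wave.xxx, 1);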
     
    Last edited: Jul 29, 2022
  5. NSDuo

    Joined: Jul 7, 2018
    Posts: 7
    I almost got it. Checking the outputs from colorCycle was where I was getting hung up. I was looking only at the output in my test scene, and I neglected to check the inputs. After making the inputs consistent, I now have this:
    (screenshot attached)
    The way the effect plane blends with the water surface is exactly what I'm looking for. However, in my main scene I double-checked the inputs and they were all the same, and at a quick glance the code in the shaders looked correct as well.

    I think what I'm going to do next is view the wave output in my main shaders and see what happens.
     
  6. NSDuo

    Joined: Jul 7, 2018
    Posts: 7
    I think I figured it out. I set my shaders up to view the output of the colorCycle function in my main scene, and got everything lined up, but I was still getting gaps in the water line. I realized that the dip of the wave was going underneath the edges between the polygons of the water surface whenever the camera was at an oblique angle.

    One solution I came up with was to divide the wave height on the effect plane and raise the water level a little, so that the gaps get covered up. What I think I'm going to do next is implement tessellation, or tiles with an LOD system, to increase the polygon resolution of the water.
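
    For the first option, the effect plane's water line ends up looking something like this (just a sketch: _WaveFudge and the 0.5 divisor are illustrative names/values, not the actual ones I'll use):
    Code (CSharp):
    // point-to-plane as before, but flatten the wave on the effect plane
    // and nudge the whole water line up so it always covers the per-vertex surface
    half waterLine = colorCycle(_WaveTexture, _WaveGradient, _CycleSpeed, _CycleOffset, i.positionWS.xz * _WaveTexture_ST.xy + _WaveTexture_ST.zw);
    half sdf = i.positionWS.y - (_EdgeAdjust + _WaveFudge + waterLine * _WaveHeight * 0.5); // _WaveFudge is a hypothetical small lift value
    clip(-sdf);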