Moving Atan2 and Length from Frag to Vert problem.

Discussion in 'Shaders' started by resetme, Sep 14, 2017.

  1. resetme

    Joined:
    Jun 27, 2012
    Posts:
    204
    Hello,

    I'm trying to optimize my shaders, so I moved most of my code and calculations into the vertex pass and send the results to the fragment shader.

    The problem I have is that atan2 and length do not work properly at all. Sin, cos, tan, etc. work fine.

    What is the problem? Why do those two calculations give me problems inside the vertex shader?


    Simple code:

    Code (CSharp):
    float4 frag(VertexOutput i) : COLOR {
        // remap the scaled, offset UV into a -1..1 range around the center
        float2 subtract = (float2(_sX, _sY) * (float2(_X, _Y) + i.uv0)) * 2.0 - 1.0;
        // polar coordinates: radius from length(), angle from atan2() remapped to 0..1,
        // plus time-based scroll and rotation offsets
        float2 pl = float2(length(subtract.xy), (atan2(subtract.y, subtract.x) / 6.28) + 0.5)
                    + float2(_Time.y * _Speed, _Time.y * _Rotation);

        float4 _BaseTEX = tex2D(_Base, pl);
        float3 emissive = _BaseTEX.rgb;
        return fixed4(emissive, 1);
    }
    Now if I do:

    Code (CSharp):
    VertexOutput vert (VertexInput v) {
        // same math as above, moved into the vertex shader
        float2 subtract = (float2(_sX, _sY) * (float2(_X, _Y) + o.uv0)) * 2.0 - 1.0;
        float2 pl = float2(length(subtract.xy), (atan2(subtract.y, subtract.x) / 6.28) + 0.5)
                    + float2(_Time.y * _Speed, _Time.y * _Rotation);

        o.fx = pl;

    // inside frag
    float4 _BaseTEX = tex2D(_Base, i.fx);

    The UV effect is broken.

    Help!
     
  2. Marco-Sperling

    Joined:
    Mar 5, 2012
    Posts:
    620
    Is this the complete vertex shader?
    You are not initializing the output struct to any default values, nor are you assigning anything meaningful to o.uv0 before actually using it.
    Your VertexInput struct v should hopefully contain something like v.texcoord; that's typically your vertex UV coordinate.
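
    Something along these lines, as a minimal sketch (assuming a standard unlit setup; the struct members and semantics here are illustrative, not from your actual shader):

    Code (CSharp):
    struct VertexInput {
        float4 vertex : POSITION;
        float2 texcoord : TEXCOORD0;
    };

    struct VertexOutput {
        float4 pos : SV_POSITION;
        float2 uv0 : TEXCOORD0;
        half4 fx : TEXCOORD1;
    };

    VertexOutput vert (VertexInput v) {
        VertexOutput o;
        o.pos = UnityObjectToClipPos(v.vertex);
        o.uv0 = v.texcoord; // copy the mesh UV into the output before anything reads it
        o.fx = 0;           // initialize everything the fragment shader will read
        return o;
    }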
     
  3. resetme

    Joined:
    Jun 27, 2012
    Posts:
    204
    Hi there, thank you for taking the time.

    No, it's not the complete vertex shader.

    I do initialize the o struct and then send the calculation out through o.fx (a half4 in a second texcoord).

    I read the value inside the pixel shader as i.fx (I pack my rotation, scroll, polar and other effects into X, Y, Z, W).
    Everything works like a charm, but the values of my polar coordinate effects are wrong (atan2 and length).

    So to fix the polar coordinate effect I put the atan2 and length inside the pixel shader and everything is back to normal.

    Right now, for optimization, the polar effect I'm doing is:

    float2 subtract = (float2(_sX, _sY) * (float2(_X, _Y) + o.uv0)) * 2.0 - 1.0; // INSIDE VERTEX
    float2 movrot = float2(_Time.y * _Speed, _Time.y * _Rotation); // INSIDE VERTEX
    o.fx = half4(subtract, movrot);

    Then inside the pixel shader:
    float2 pl = float2(length(i.fx.xy), atan2(i.fx.y, i.fx.x) / _PolarAmount) + i.fx.zw;

    It works and I'm using it like this, but...

    Why can I not do the same inside the vertex shader?
    length(subtract.xy) // this inside the vertex gives me strange values
    atan2(subtract.y, subtract.x) // this inside the vertex gives me strange values

    I would like to go from this inside my pixel shader:
    float2 pl = float2(length(i.fx.xy), atan2(i.fx.y, i.fx.x) / _PolarAmount) + i.fx.zw;
    to this:
    float2 pl = float2(i.fx.x, i.fx.y / _PolarAmount) + i.fx.zw;


    It feels like atan2 and length have some relation to the 4 vertices of my simple mesh plane, and it causes my UVs not to produce the full polar coordinate effect.

    Sorry for my bad explanation.
     
  4. bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    Both length and atan2 give you exactly the same results in the vertex shader and the fragment shader. The problem you're having is that those values cannot be linearly interpolated. Any calculation done in the vertex shader is perfectly accurate for that vertex, but the fragment shader only receives the barycentric linear interpolation of the results from the triangle's three vertices.

    Let's think of a simple case: calculating the length of a value along a line. Imagine we have 4 vertices in a row with these values:
    1.5
    0.5
    -0.5
    -1.5

    That places the 0.0 halfway between the two center vertices. If we take any point along the line, we can call length() on that point's interpolated value and get an accurate distance from that point to the middle of the two center vertices.

    If we instead do that same length() in the vertex shader before passing the result to the fragment shader, then we get this:
    1.5
    0.5
    0.5
    1.5

    Between the outer two pairs of vertices the length is still accurately interpolated, but between the two middle vertices every single point gets the value 0.5! You no longer know the distance to that center position!
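
    To put that middle edge in code (the 0.5 and -0.5 are the two center vertices from the list above; this is just an illustration, not shader code you'd ship):

    Code (CSharp):
    // interpolate the raw value first, then take length() per pixel
    float correct = length(lerp(0.5, -0.5, 0.5)); // length(0.0) = 0.0

    // take length() per vertex, then interpolate the scalar results
    float broken = lerp(length(0.5), length(-0.5), 0.5); // lerp(0.5, 0.5, 0.5) = 0.5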

    If you expand this out to 2D or 3D space it gets worse. In the example of the line the distance is always linear, so while the area between the middle two vertices gives the wrong interpolated value, all the other positions are fine. In 2D space this doesn't hold true.

    Let's say you have two vertices with these values:
    (0,4)
    (3,4)

    The lengths of those are 4 and 5 respectively. A point halfway between them has the interpolated value (1.5, 4), whose actual length is about 4.272, not the 4.5 you would get by interpolating the two lengths!
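
    Spelled out with the same numbers (again just an illustration):

    Code (CSharp):
    float2 mid = lerp(float2(0, 4), float2(3, 4), 0.5); // = (1.5, 4)
    float correct = length(mid);                        // sqrt(1.5^2 + 4^2) = ~4.272
    float broken = lerp(length(float2(0, 4)),           // = 4
                        length(float2(3, 4)), 0.5);     // = 5, so lerp gives 4.5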

    The atan2 function has an additional problem: it wraps around at some point. In the fragment shader this results in a hard discontinuity between two pixels, similar to frac() or a modulo.
    ... 0.7, 0.8, 0.9, 0.0, 0.1, 0.2 ...
    When this break happens between two pixels there's no real problem. When it happens between vertices, you're now interpolating between that 0.9 and 0.0!
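
    A sketch of what happens across that wrapping edge (hypothetical values, in the same 0 to 1 range as the remapped atan2 above):

    Code (CSharp):
    // two neighbouring vertices straddling the wrap
    float a0 = 0.9; // angle just below the wrap
    float a1 = 0.0; // angle just past the wrap

    // the interpolator doesn't know these are only 0.1 apart on the circle;
    // halfway along the edge the fragment shader receives:
    float mid = lerp(a0, a1, 0.5); // = 0.45, the far side of the circle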
     
    jvo3dc likes this.
  5. Marco-Sperling

    Joined:
    Mar 5, 2012
    Posts:
    620
    Nice explanation, bgolus.
    Depending on the accuracy needed it might still be possible to do the above calculations in the vertex shader and simply throw a more highly tessellated mesh at it; the errors might be barely noticeable once the mesh has just enough vertices to more or less hide the artifacts. Or wouldn't they?
     
  6. bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    It would help, but there's not really a good reason to do that. Unless you tessellate the mesh to the point of one vertex per pixel, the wrap on the atan2 will always be obvious. And if you are tessellating to almost one vertex per pixel, it's significantly more expensive to render than not tessellating and doing everything in the fragment shader. For length it'll be a bit better, depending on how you use the data.

    It's basically the old Gouraud vs. Phong decision.

    http://www.csc.villanova.edu/~mdamian/Past/csc8470sp15/notes/shading.htm

    Per-vertex can be faster, but you have to contend with the artifacts. Gouraud specular highlights are limited by the mesh resolution; the entire highlight spot might be smaller than the area between vertices. Phong is more expensive, but is only limited by the pixel resolution, which is usually finer than the vertex resolution.
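
    As a rough sketch of that difference (hypothetical helper functions using standard Blinn-Phong terms, not code from this thread):

    Code (CSharp):
    // Gouraud-style: call this in the vertex shader and interpolate the result.
    // A highlight smaller than a triangle can shrink or vanish entirely.
    half GouraudSpec (float3 normal, float3 halfDir, half gloss) {
        return pow(saturate(dot(normalize(normal), halfDir)), gloss);
    }

    // Phong-style: pass the normal along instead and do the same math per pixel,
    // limited only by screen resolution.
    half PhongSpec (float3 interpolatedNormal, float3 halfDir, half gloss) {
        return pow(saturate(dot(normalize(interpolatedNormal), halfDir)), gloss);
    }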

    One extra benefit of doing the expensive calculations in the fragment shader instead of the vertex shader: every vertex of a mesh being drawn has to be calculated if any part of the mesh is visible, even if those particular vertices aren't. Meanwhile, every pixel that's off screen or occluded by another object is not calculated at all! That means on a highly tessellated sphere you might be calculating more vertices than pixels!


    As a side tangent, Reyes rendering is basically that same idea of tessellating a mesh until it's one vertex per pixel. ILM and Pixar used it for several decades when CGI first started being used, and only really moved away from it fully in the last 5 years. It has benefits for depth sorting and anti-aliasing which have since been solved (or ignored) in different ways for real-time rendering. When every triangle is about the size of a pixel you can sort the triangles before rendering and get perfect depth sorting. When it takes 48 hours to render a frame, spending an hour or two tessellating and sorting polygons isn't a big deal. Today it wouldn't take that long to tessellate and sort, but it's still longer than the 1/30th of a second we have!
     
  7. resetme

    Joined:
    Jun 27, 2012
    Posts:
    204
    Thank you all for the answers. I did more tests, but distorting the UVs using polar coordinate functions inside the vertex shader is a no-go for me right now.

    I will buy a coffee as my totem of appreciation.
     
  8. bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    Yes, sorry. That was kind of the main point of my original post: you can't use those kinds of functions in a vertex shader without visual problems. I got sidetracked by @Marco-Sperling's question of whether increasing the mesh resolution fixes the problem. The short answer is yes, but it's way slower than doing it in the fragment shader.

    TL;DR: You should be doing it in the fragment shader. What you're doing in your second post is the correct solution.