
Resolved Decal/Projection Spotlight Shader And Vertex Transformations

Discussion in 'Shaders' started by AcChosen, Sep 8, 2020.

  1. AcChosen

    Joined:
    Aug 19, 2018
    Posts:
    2
    Hello all!~

    Okay, so disclaimer: I'm dumb and don't really understand coordinate spaces in shaders apparently.

    What I have created here is an intersection-based projection/decal shader that uses a cone mesh to simulate the effect of a spotlight, using the depth buffer and calculated world normals. As you can see, it works very well in world space when transformed and moved around in the scene with the transform component:



    The issue arises when I attempt to make these same transformations in local space using vertex rotation matrices:



    Essentially, I apply a series of rotation matrices to the vertices of the cone mesh that creates the projection. That part works fine; all vertex transformations happen in object/local space as intended. The problem is that the projection does not follow these transformations, even though they are applied before the projection process even begins in the vertex shader (via a ray). I believe the problem lies in the vertex shader, specifically in the transfer between spaces, since any attempt to apply the same transforms in the fragment shader warps the image dramatically.

    Here is a simplified rundown of what is happening in the vertex shader.


    Code (CSharp):

    v2f vert(appdata v)
    {
        v2f o;

        // Calculate rotations for the verts. A series of rotation matrix
        // multiplications happens in this function; the input is any float4.
        v.vertex = CalculateRotations(v.vertex);

        // Send a copy of the verts to clip space.
        o.vertex = UnityObjectToClipPos(v.vertex);

        // Get the screen-space position of the verts.
        o.screenPos = ComputeScreenPos(o.vertex);

        // Convert the transformed vertex locations into view space to
        // generate a ray to them from the camera.
        o.ray = UnityObjectToViewPos(v.vertex).xyz;

        // Invert the z axis so that it projects from the camera properly.
        o.ray *= float3(1, 1, -1);

        return o;
    }
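    For context on what the ray feeds into downstream (the fragment-shader side is not shown in this post): in this style of depth-based decal, the interpolated ray is typically scaled by the sampled scene depth to reconstruct the view-space position of the surface behind the cone. A minimal numeric sketch of that reconstruction in plain Python, with a made-up far-plane distance and surface point standing in for the real camera values:

    ```python
    # A view-space point on some scene surface (what we want to reconstruct).
    surface = (2.0, -1.0, 5.0)

    far = 100.0  # hypothetical far clip plane distance

    # The "ray" the vertex shader outputs: the view-space vertex position.
    # Any point along the camera->surface line works, so scale the surface
    # point arbitrarily to stand in for a cone vertex in front of it.
    ray = (surface[0] * 0.5, surface[1] * 0.5, surface[2] * 0.5)

    # Fragment-shader side: stretch the ray out to the far plane, then scale
    # it by the linear 0..1 depth sampled from the depth buffer.
    ray_far = tuple(c * (far / ray[2]) for c in ray)
    depth01 = surface[2] / far  # linear 0..1 depth of the surface
    reconstructed = tuple(c * depth01 for c in ray_far)

    print(reconstructed)  # matches `surface` up to float error
    ```

    The key property is that the reconstruction only works if the ray and the rasterized position describe the same point, which is exactly what breaks below.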
    Things I have tried:
    1) Applying the transforms to the ray in view space. The image simply becomes distorted and still shows no sign of following the vertex transforms.

    2) Applying the transforms in the pixel shader. This also distorts the image, producing an almost particle-like effect, and the image still shows no sign of following the vertex transforms during this distortion.

    3) Disabling batching. No change occurs when batching is disabled, even when there are multiple copies of the object/shader/material.

    4) This is the most promising approach so far, but it is still problematic: I store a copy of the verts before applying the rotation transforms and send that copy to view space.



    Here is a simplified version of that code:

    Code (CSharp):

    v2f vert(appdata v)
    {
        v2f o;

        // Store the vertex location before calculating the rotation.
        float4 pretransverts = v.vertex;

        // Calculate rotations for the verts. A series of rotation matrix
        // multiplications happens in this function; the input is any float4.
        v.vertex = CalculateRotations(v.vertex);

        // Send a copy of the verts to clip space.
        o.vertex = UnityObjectToClipPos(v.vertex);

        // Get the screen-space position of the verts.
        o.screenPos = ComputeScreenPos(o.vertex);

        // Using the vertex position from before the transformation seems to
        // move the projection somewhat correctly, but it is still wrong...?
        o.ray = UnityObjectToViewPos(pretransverts).xyz;

        // Invert the z axis so that it projects from the camera properly.
        o.ray *= float3(1, 1, -1);

        return o;
    }
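    A quick numeric illustration of why attempt 4 only "somewhat" works: once the verts are rotated, a rotated vertex and its pre-rotation copy are simply different view-space points, so a ray aimed at the unrotated copy no longer passes through the pixel being rasterized. Sketch in Python, using an arbitrary 90-degree Y rotation:

    ```python
    import math

    def rotate_y(p, deg):
        # Standard rotation about the Y axis (column-vector convention).
        s, c = math.sin(math.radians(deg)), math.cos(math.radians(deg))
        x, y, z = p
        return (c * x + s * z, y, -s * x + c * z)

    v = (1.0, 0.0, 0.0)      # pre-rotation vertex (what the ray points at)
    v_rot = rotate_y(v, 90.0)  # rotated vertex (what actually gets rasterized)

    print(v_rot)  # a different point from `v`, so ray and pixel disagree
    ```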
    To restate the goal here: I want to transform the projection in local space in the same manner that I can in world space with the transform component using my matrix transformations.

    Yes, I am aware that it would be easier to do the transformations in world space with C#, but this is for a certain Unity VR application that currently only allows basic shader code with forward rendering, so my hands are tied and this is the best method I've been able to come up with...

    I'm honestly at a loss here. I feel the answer is very simple, but my lack of understanding of how the matrix transforms work in the different spaces is hindering me. Any answers/guidance/hints would be welcome. I am also happy to post the code for the
    CalculateRotations(float4 input)
    function to clarify which rotation matrices I'm using, as well as the rest of the fragment shader if needed.



    Sorry for the long post and thank you in advance!
     
  2. AcChosen

    Joined:
    Aug 19, 2018
    Posts:
    2
    Never mind, we figured it out. :)
    Just needed to apply the inverted versions of the rotation matrices to the calculated intersection position in object space in the fragment shader. Here's the code for the inversion:


    Code (CSharp):

    float4 InvertRotations(float4 input)
    {
        float sX, cX, sY, cY;

        sX = sin(radians(_FixtureRotationX));
        cX = cos(radians(_FixtureRotationX));
        float4x4 rotX = float4x4(1,   0,  0, 0,
                                 0,  cX, sX, 0,
                                 0, -sX, cX, 0,
                                 0,   0,  0, 1);

        sY = sin(radians(_FixtureBaseRotationY));
        cY = cos(radians(_FixtureBaseRotationY));
        float4x4 rotY = float4x4( cY, sY, 0, 0,
                                 -sY, cY, 0, 0,
                                   0,  0, 1, 0,
                                   0,  0, 0, 1);

        float4x4 combinedRot = mul(rotX, rotY);
        input = mul(combinedRot, input);
        return input;
    }
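    The reason this works: a pure rotation matrix's inverse is its transpose, so un-rotating the reconstructed object-space intersection point puts it back in the mesh's original (un-rotated) frame, where the projection is computed. A Python sketch of the round trip, where the forward order (Y then X) and the angles are assumptions standing in for _FixtureBaseRotationY / _FixtureRotationX:

    ```python
    import math

    def mat_mul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
                for i in range(3)]

    def mat_vec(m, v):
        return tuple(sum(m[i][k] * v[k] for k in range(3)) for i in range(3))

    def transpose(m):
        return [[m[j][i] for j in range(3)] for j in range(3)]

    def rot_x(deg):
        s, c = math.sin(math.radians(deg)), math.cos(math.radians(deg))
        return [[1, 0, 0], [0, c, -s], [0, s, c]]

    def rot_y(deg):
        s, c = math.sin(math.radians(deg)), math.cos(math.radians(deg))
        return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

    # Hypothetical fixture angles.
    ax, ay = 30.0, 70.0

    # Forward transform as the vertex shader might apply it: Y first, then X.
    forward = mat_mul(rot_x(ax), rot_y(ay))

    # Inverse: transposed matrices composed in the reverse order
    # (for pure rotations, inverse == transpose).
    inverse = mat_mul(transpose(rot_y(ay)), transpose(rot_x(ax)))

    p = (0.3, -1.2, 2.5)
    round_trip = mat_vec(inverse, mat_vec(forward, p))
    print(round_trip)  # recovers `p` up to float error
    ```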
     
    Last edited: Sep 9, 2020