
Vertex Shader Input UV1

Discussion in 'Shaders' started by HWDKoblenz, Jun 30, 2018.

  1. HWDKoblenz

    Joined:
    Apr 3, 2017
    Posts:
    19
    Hello everyone,

    For my project there is one last question (I hope so). At the moment I have a quite simple sphere, and for each vertex I also have a Vector2 for its curvature direction.

    My first try was to get these directions into my deferred light shader using the second, often unused, TEXCOORD1.

    In my C# program I calculate all curvatures and put them into a Vector2 List/Array.

    Code (CSharp):
    List<Vector2> BuffCurvature = new List<Vector2>();

    //..calc Curvature and put Vectors into List...

    //try to set UV1 with:
    mymesh.SetUVs(1, BuffCurvature);

    //OR:
    mymesh.uv2 = BuffCurvature.ToArray();
    I printed the UV1 channel and the data is correct. *happy*
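    (For reference, a minimal self-contained sketch of this setup; the class name CurvatureSetup and the zero placeholders are illustrative, not from the thread, and the actual curvature calculation is omitted:)

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;

    public class CurvatureSetup : MonoBehaviour
    {
        void Start()
        {
            // Work on the mesh instance the renderer actually draws.
            Mesh mymesh = GetComponent<MeshFilter>().mesh;

            // One curvature entry per vertex; the list should match the vertex count.
            List<Vector2> BuffCurvature = new List<Vector2>();
            for (int v = 0; v < mymesh.vertexCount; v++)
                BuffCurvature.Add(Vector2.zero); // ...calc curvature here...

            // Write the list into the second UV set (TEXCOORD1 / mesh.uv2).
            mymesh.SetUVs(1, BuffCurvature);
        }
    }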

    So now I was quite naive and thought that I could access it in my shader with:

    float2 curves : TEXCOORD1;

    I read it here: http://wiki.unity3d.com/index.php/Shader_Code

    Code (CSharp):
    struct VertexData
    {
        float4 vertex : POSITION;
        float3 normal : NORMAL;
        float2 curves : TEXCOORD1;
    };

    struct Interpolators
    {
        float4 pos : SV_POSITION;
        float4 uv : TEXCOORD0;
        float2 curves : TEXCOORD1;
    };
    Then I pass them to the fragment program... but it seems that the float2 curves is empty.

    Where is my mistake? :-D Can you help me?
     
  2. bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,229
    Are you copying the data over from the vertex data struct to the interpolator struct in the vertex function?
     
  3. HWDKoblenz

    Joined:
    Apr 3, 2017
    Posts:
    19
    Hello bgolus again :-D. Thanks for your support.

    To your question: yes, I'm passing these coordinates in my vertex program, inside my deferred light shader:

    Code (CSharp):
    Interpolators VertexProgram(VertexData v)
    {
        Interpolators i;
        i.texcoord1 = v.texcoord1;
        i.pos = UnityObjectToClipPos(v.vertex);
        i.texcoord = ComputeScreenPos(i.pos);
        i.vertex_w = mul(unity_ObjectToWorld, v.vertex);
        //needed for the distance to the fragment
        i.ray = v.normal;

        return i;
    }
    I changed the name to texcoord1 instead of curves.

    Other ideas would be:
    1) Passing the curvature array/list to the attached material of my object (maybe you know a good practice for this; see the sketch after this list).
    2) Filling one GBuffer texture and then using it in my light shader.
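    (For idea 1, a minimal sketch of handing the array to the material; the property name _Curvature and the Vector4 packing are assumptions, and shader-side vector arrays have a small fixed size limit, so this only suits low vertex counts:)

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;

    public class CurvatureToMaterial : MonoBehaviour
    {
        public List<Vector2> BuffCurvature = new List<Vector2>();

        void Start()
        {
            // Materials only accept vector arrays of Vector4, so widen the Vector2s.
            Vector4[] packed = new Vector4[BuffCurvature.Count];
            for (int i = 0; i < BuffCurvature.Count; i++)
                packed[i] = BuffCurvature[i];

            // The shader would need a matching float4 _Curvature[...] array and some
            // way to index it per vertex, which is clumsier than using a UV channel.
            GetComponent<Renderer>().material.SetVectorArray("_Curvature", packed);
        }
    }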

    Thank you :).
     
  4. bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,229
    I honestly can't think of any reason why what you're doing wouldn't work. My only guess is it's a bug somewhere in your code on the C# side ... like you're modifying a mesh, but not assigning it onto the mesh filter? The second UV might also be getting blown away if the mesh is static or otherwise batched, as the secondary UVs are used for light mapping. Batching also seems to sometimes strip unused vertex data from a mesh if the default material assigned to it doesn't use that data. You could also store your curvature in the second two components of the first UV. As is, your shader doesn't appear to be using the first UV channel at all, so you could alternatively just do SetUVs(0, curvature) and use TEXCOORD0 and see if that works.
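    (To test the "bug on the C# side" guess, a quick diagnostic sketch, not from the thread, that reads the second UV set back from the mesh the renderer is actually using:)

    Code (CSharp):
    using UnityEngine;

    public class Uv2Check : MonoBehaviour
    {
        void Start()
        {
            // The mesh the MeshFilter currently points at; if uv2 is empty here,
            // the data was lost before rendering (wrong mesh instance, batching, ...).
            Mesh rendered = GetComponent<MeshFilter>().sharedMesh;
            Debug.Log("uv2 entries: " + rendered.uv2.Length + ", vertices: " + rendered.vertexCount);
        }
    }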

    If you're using the UVs from the first UV set elsewhere, you can do something like this:

    List<Vector4> meshUVs0 = new List<Vector4>();
    myMesh.GetUVs(0, meshUVs0);
    // add curvature to the z and w components of each Vector4 in meshUVs0
    myMesh.SetUVs(0, meshUVs0);


    Then in the shader use:

    // vertex data struct
    float4 uv : TEXCOORD0;

    // vertex shader
    float2 curvature = v.uv.zw;
     
  5. HWDKoblenz

    Joined:
    Apr 3, 2017
    Posts:
    19
    Hello bgolus,

    Thanks for your answer. Yes, I already tried this, but it doesn't work. I'm also quite sure that the change of the uv or uv2 on the C# side is working, because if I change the uv.xy the textures are aligned differently. Actually, my workaround is:

    I use the COLOR semantic of the shader to transport my curvature, even in 3D.
    But I have to put this vector into a GBuffer texture inside the attached deferred shader. This was the only way to get my curvature to my deferred light shader, to compute the rotation of my hatching textures.
     
  6. bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,229
    Ah, I see. You did actually literally mean your deferred light shader ... yeah, by that point the scene mesh data is gone and the lights are just sampling full screen textures that have the data about the objects in view rendered into them. If you need more data from the mesh you need to render the data into a full screen texture and pass it along to the light shader, like you ended up doing.
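    (As a rough sketch of "pass it along to the light shader"; the global name _CurvatureBuffer is an assumption, and the RenderTexture is presumed to be filled elsewhere, e.g. by the GBuffer approach described above:)

    Code (CSharp):
    using UnityEngine;

    public class CurvatureBufferBinder : MonoBehaviour
    {
        // Full-screen texture that already contains the per-pixel curvature data.
        public RenderTexture curvatureBuffer;

        void Update()
        {
            // Expose the texture globally so a custom deferred light shader can
            // sample it via a sampler2D named _CurvatureBuffer.
            Shader.SetGlobalTexture("_CurvatureBuffer", curvatureBuffer);
        }
    }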
     
  7. HWDKoblenz

    Joined:
    Apr 3, 2017
    Posts:
    19
    Yes, I realized this... okay, thanks. Then I will do my calculations inside the deferred shader and not in the light shader. Thank you very much!