
Can I get the scale in the transform of the object I attach a shader to? If so, how?

Discussion in 'Shaders' started by MagicCancel, Jul 26, 2016.

  1. MagicCancel


    Joined:
    Jul 30, 2015
    Posts:
    25
    Preface: I am pretty damn new to shaders.

    Hello, I'm trying to write a shader that tiles its graphic based on the scale in the transform of the object it's attached to. I got this effect working by giving the shader editable values to represent the scale and attaching a script that constantly updated those values from the object's scale, but this seemed to create a new material every time. I don't think that's a good thing, so I'm wondering if it's possible to get the scale from within the shader code. That way, every object that uses this effect could share the same shader without needing any values edited.

    If this is possible and someone can explain how, it would be greatly appreciated!
     
  2. bgolus


    Joined:
    Dec 7, 2012
    Posts:
    12,352
    Yes, this is possible ... sort of. It's also very expensive to do it.

    Here's the basic code for Unity 5.4:
    Code (CSharp):
    float3 worldScale = float3(
        length(float3(unity_ObjectToWorld[0].x, unity_ObjectToWorld[1].x, unity_ObjectToWorld[2].x)), // scale x axis
        length(float3(unity_ObjectToWorld[0].y, unity_ObjectToWorld[1].y, unity_ObjectToWorld[2].y)), // scale y axis
        length(float3(unity_ObjectToWorld[0].z, unity_ObjectToWorld[1].z, unity_ObjectToWorld[2].z))  // scale z axis
    );
    For Unity 5.3 and prior, replace unity_ObjectToWorld with _Object2World.

    Generally this will be similar to transform.lossyScale, but not always. Sometimes the mesh itself has a hidden scale from import that Unity doesn't show on the gameObject and instead corrects for in the unity_ObjectToWorld transform.
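    To see why taking the length of each column recovers the scale even when the object is rotated, here's a minimal sketch of the same math in plain Python (not Unity or shader code; the rotation angle and scale values are made up for illustration):

```python
import math

def rot_z(theta):
    # 3x3 rotation about the z axis.
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def scale3(sx, sy, sz):
    # 3x3 non-uniform scale matrix.
    return [[sx, 0.0, 0.0], [0.0, sy, 0.0], [0.0, 0.0, sz]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def column_lengths(m):
    # The same operation as the shader snippet: the length of each
    # column of the rotation*scale 3x3 is that axis's scale.
    return [math.sqrt(sum(m[r][c] ** 2 for r in range(3))) for c in range(3)]

# An arbitrary rotation combined with scale (2, 3, 0.5), mimicking
# the 3x3 part of an object-to-world matrix.
m = matmul(rot_z(0.7), scale3(2.0, 3.0, 0.5))
print(column_lengths(m))  # ~[2.0, 3.0, 0.5], regardless of the rotation
```

    Because the rotation columns are unit length, multiplying them by the scale leaves the scale as the only thing the column lengths can measure.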

    Now, having multiple materials isn't great, but it also isn't really that bad. The main "bad" thing about them is that they break batching, but batching also breaks getting the mesh's scale in the shader anyway: with batching, multiple meshes are combined into a single mesh, and each original sub-mesh's scale gets pre-applied to the vertices. If this is a value you're changing frequently, look into MaterialPropertyBlocks. I've written about them elsewhere; searching the forums for MaterialPropertyBlock should turn up some useful posts.
     
    Wappenull, atomicjoe and Trungdv like this.
  3. MagicCancel

    Thanks for the info, I'll look into MaterialPropertyBlock to see if it helps with my situation. To further explain my goal: the idea was to make scaled actors that could fill parts of levels with a repeating tile effect, without oversaturating the actor list. In editor mode the scale values could be changing constantly, but in game mode they should never change once set. Does that make any sense?
     
  4. bgolus

    Depending on your use case, world-space texture mapping / triplanar mapping might work for you instead.
     
  5. tsangwailam


    Joined:
    Jul 30, 2013
    Posts:
    280
    @MagicCancel If you don't use the vertex colors, you can write the calculated UVs into the mesh's colors. That way you don't need to access the material or do the expensive calculation in the shader. But this only applies if you have one mesh per material.
     
  6. bgolus

  7. tsangwailam

    Recently I've been doing something similar and ran into the same problems as @MagicCancel. I tried calculating the UVs from the bounds inside the shader, but it was expensive.

    In the end I attached a script that writes the calculated UVs into the vertex stream at runtime. Since the UVs never change after being set, it isn't worth calculating them inside the shader.
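    As a rough illustration of that bake step (plain Python with a hypothetical quad and an assumed world scale, not the actual script): scaling each UV once on the CPU produces the same tiling the shader math would otherwise compute per vertex, every frame.

```python
# Hypothetical bake: a unit quad's UVs multiplied once by the object's
# world-space scale, so the texture tiles once per world unit.
world_scale = (4.0, 2.0)  # assumed x/y scale of the object
quad_uvs = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]

baked_uvs = [(u * world_scale[0], v * world_scale[1]) for (u, v) in quad_uvs]
print(baked_uvs)  # [(0.0, 0.0), (4.0, 0.0), (4.0, 2.0), (0.0, 2.0)]
```

    In Unity the baked values would then be written into a spare vertex channel (colors or an extra UV set), which is exactly what makes the per-frame shader calculation unnecessary.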
     
  8. MagicCancel

    I'm back. I tried to make a shader that works off the global position, got frustrated, and looked to see if anyone had already made one. I found this:

    Code (CSharp):
    Shader "Sprites/WorldTile"
    {
        Properties
        {
            _MainTex ("Texture", 2D) = "white" {}
        }
        SubShader
        {
            Tags { "RenderType"="Opaque" }
            LOD 100

            Pass
            {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                struct appdata
                {
                    float4 vertex : POSITION;
                    float2 uv : TEXCOORD0;
                };

                struct v2f
                {
                    float2 uv : TEXCOORD0;
                    float4 vertex : SV_POSITION;
                };

                sampler2D _MainTex;
                float4 _MainTex_ST;

                v2f vert (appdata v)
                {
                    v2f o;
                    o.vertex = mul(UNITY_MATRIX_MVP, v.vertex);

                    // Gets the xy position of the vertex in worldspace.
                    float2 worldXY = mul(_Object2World, v.vertex).xy;
                    // Use the worldspace coords instead of the mesh's UVs.
                    o.uv = TRANSFORM_TEX(worldXY, _MainTex);

                    return o;
                }

                fixed4 frag (v2f i) : SV_Target
                {
                    fixed4 col = tex2D(_MainTex, i.uv);
                    return col;
                }
                ENDCG
            }
        }
    }
    But I think it's only getting 0,0 for the world coordinate? Can anyone help me with this?
     
  9. bgolus

    Nothing looks wrong that I can see. I don't know what orientation your meshes are in, so you might need to try .xz or .yz instead and see if those show you anything.
     
  10. ifurkend


    Joined:
    Sep 4, 2012
    Posts:
    350
    @bgolus: So it's better for performance to just write a C# script that reads the transform.localScale value and sets it on the desired shader properties of the material. Is that it?
     
  11. bgolus

    Probably. The scale you would get in the shader won't necessarily match transform.localScale or even transform.lossyScale, which may or may not be a good thing. The scale you get in the shader from the object-to-world matrix would be more akin to:

    Vector3 scale = renderer.localToWorldMatrix.MultiplyVector(Vector3.one);
     
  12. bitinn


    Joined:
    Aug 20, 2016
    Posts:
    961
    Sorry to revive this old thread, but could you explain why this is very expensive, besides the obvious length calculation?

    Or was that relatively speaking? Because the scale value is a uniform anyway, should we just pass it in via a property?

    If we do this in the vertex stage and use the shader on what's basically a quad, is it so bad? (I assume this is the primary reason people want this value: to tile something while ignoring world-space position/rotation/non-uniform scaling...)
     
  13. bgolus

    A) This was in the context of 4 years ago, when I was doing early mobile & console VR and thinking about supporting low-end GPUs that are now 9 years old (I usually treat PC support as a 5-year window).
    B) It was really just the three length() calls themselves. It's totally fine to do today, especially if you're only doing it in the vertex shader.
     
    sewy and bitinn like this.
  14. Acegikmo


    Joined:
    Jun 23, 2011
    Posts:
    1,294
    Object scale! (in case someone finds this old thread through Google like I did)

    Code (CSharp):
    half3x3 m = (half3x3)UNITY_MATRIX_M;
    half3 objectScale = half3(
        length( half3( m[0][0], m[1][0], m[2][0] ) ),
        length( half3( m[0][1], m[1][1], m[2][1] ) ),
        length( half3( m[0][2], m[1][2], m[2][2] ) )
    );
     
    Last edited: Jun 30, 2020
  15. bgolus

    That compiles to the same shader as the code I posted above, just using a different notation for selecting matrix elements. UNITY_MATRIX_M is a macro that redirects to unity_ObjectToWorld, and matrix[0][0] is the same as matrix[0].x, which is the same as matrix._m00, which is the same as matrix._11.

    These days I use this bit of code:
    Code (csharp):
    float3 scale = float3(
        length(unity_ObjectToWorld._m00_m10_m20),
        length(unity_ObjectToWorld._m01_m11_m21),
        length(unity_ObjectToWorld._m02_m12_m22)
    );
    But, again, they're all the same in the compiled shader.
     
  16. Nyphur


    Joined:
    Jan 29, 2016
    Posts:
    98
    Found this old thread through Google. It works perfectly to get the object scale, but it doesn't work for batched models, as it returns the scale of the entire batch. Attached is a screenshot of the function used in a shader designed to keep textures scale-invariant, with 3 copies of the same object at different scales nested inside each other to demonstrate. As far as I know, the only way to solve this is to manually insert the mesh's world scale into a vertex data channel before batching and use that in the shader.
    batched.jpg
     
  17. bgolus

    Yep, batching breaking it is mentioned in the second post. Really, if you're getting to the point of modifying the mesh data, you might as well directly modify the mesh UVs so you don't need a special shader.
     
  18. Nyphur

    That'd be optimal in most cases, but there are use cases it's not suitable for. I blend a scale-invariant detail texture with a standard base texture, for example, so I need both the standard UVs and a scale-invariant set. A shader-based approach should also work with arbitrary realtime batching and with scale changing at runtime, and it should work with existing tools out of the box.
     
  19. bgolus

    Dynamic batching has the same problem, though. You'd have to edit the mesh data and manually batch via scripting.

    The real universal solution is to use a shader that uses world space triplanar UVs. Those will work on everything.
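    For reference, the core idea of triplanar mapping is blending three planar texture projections by the surface normal. A minimal sketch of the blend-weight math in plain Python (not shader code; a sharpness exponent of 1 is assumed, where real shaders often raise the weights to a power to tighten the blend regions):

```python
def triplanar_weights(n):
    # Blend weights from a world-space normal: each axis's weight is
    # proportional to how directly the surface faces that axis.
    ax, ay, az = abs(n[0]), abs(n[1]), abs(n[2])
    total = ax + ay + az
    return (ax / total, ay / total, az / total)

# A face pointing straight up samples only the top-down (xz-plane) projection:
print(triplanar_weights((0.0, 1.0, 0.0)))  # (0.0, 1.0, 0.0)
# A 45-degree surface blends two projections equally:
print(triplanar_weights((1.0, 1.0, 0.0)))  # (0.5, 0.5, 0.0)
```

    Because the projections use world-space coordinates, the result is independent of each mesh's transform, which is why it survives batching.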
     
  20. Nyphur

    Triplanar mapping will work on everything, but it'll produce very different results than world-scaled UVs, and it can have worse performance and ugly blending regions.

    What I ended up doing is editing the batching tool I use (Mesh Combiner Studio) to bake a second set of world-scaled UVs into the UV4 channel as it combines the meshes, then building a script that automatically finds uses of my shader in the batched models and swaps it for a variant that uses UV4 for the detail texture. Now it produces identical results and is compatible with batching.
     
  21. MaT227


    Joined:
    Jul 3, 2012
    Posts:
    628
    It seems this produces an unsigned scale because of the length calculation, but is there a way to get the signed scale? I managed to do it, but it breaks when the mesh is rotated.
     
    FariAnderson likes this.
  22. Feral_Pug


    Joined:
    Sep 29, 2019
    Posts:
    49
    Doesn't rotation get stored in those cells of the matrix as well? I thought these matrices were position, rotation, and scale matrices multiplied together (in some order). Or is that not actually how they work? Wouldn't rotation affect this value, or does the length() take care of that?
     
    Last edited: Mar 17, 2021
  23. bgolus

    Yes: rotation, scale, and translation. Translation sits by itself, but scale and rotation are both represented by the 3x3 part of the matrix. That 3x3 is basically three scaled vectors that give the axis and magnitude of the x, y, and z coordinates of the space you're transforming from, expressed in the space you're transforming to.
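    The "scaled basis vectors" point can be checked numerically. A plain-Python sketch (the matrix values are made up for illustration): transforming an object-space axis by the 3x3 returns the matching column, whose length is that axis's scale.

```python
import math

# Hypothetical object-to-world 3x3: a 90-degree rotation about z
# combined with scale (2, 3, 1).
M = [[0.0, -3.0, 0.0],
     [2.0,  0.0, 0.0],
     [0.0,  0.0, 1.0]]

def transform(m, v):
    # Standard matrix * column-vector multiply.
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

# The object-space x axis lands exactly on the first column of M ...
x_axis = transform(M, [1.0, 0.0, 0.0])
print(x_axis)               # [0.0, 2.0, 0.0]: rotated onto +y
# ... and its length is the x scale, with the rotation contributing nothing.
print(math.hypot(*x_axis))  # 2.0
```
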
     
  24. Feral_Pug

    Ah, okay. That's what I thought, and I think I see why this works. Each of those scaled vectors (the x, y, and z axes) is a unit vector with a length of 1 when the scale is 1. So by taking the length of each one, you're measuring how long it is, which equals its scale. Right?
     
  25. bgolus

  26. FariAnderson


    Joined:
    Jan 20, 2020
    Posts:
    37
    My problem too. How do you find the signed scale?
     
  27. bgolus

    The short version is ... you can't.

    The long version is that you can find out whether the scale has an odd number of negatively scaled axes by taking the dot product between one axis vector and the cross product of the other two axes. But there's no way to tell the difference between a transform matrix with two negatively scaled axes and one with no negatively scaled axes that's been rotated 180 degrees: they produce the same transform matrix. Similarly, you can't tell the difference between one and all three negatively scaled axes, as certain rotations with those scales can produce the same transform matrix. If you really need to know the sign of the scale, you have to get it in C# and pass it to the material manually.
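    That ambiguity is easy to verify numerically. A plain-Python sketch (not Unity code): a scale of (-1, -1, 1) and a 180-degree rotation about z produce the exact same matrix, and the sign of the determinant (equivalent to the dot/cross test above) only flags an odd number of negative axes.

```python
import math

def rot_z(deg):
    # 3x3 rotation about the z axis, angle in degrees.
    t = math.radians(deg)
    c, s = math.cos(t), math.sin(t)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def det3(m):
    # Determinant of a 3x3; same sign as dot(axis_x, cross(axis_y, axis_z)).
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

two_negative = [[-1.0, 0.0, 0.0], [0.0, -1.0, 0.0], [0.0, 0.0, 1.0]]
rotated_180 = rot_z(180.0)

# Element for element, the two matrices are identical (up to float error),
# so no shader math can tell them apart.
same = all(abs(two_negative[r][c] - rotated_180[r][c]) < 1e-9
           for r in range(3) for c in range(3))
print(same)  # True

# A negative determinant only reveals an odd number of flipped axes:
one_negative = [[-1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
print(det3(one_negative) < 0, det3(two_negative) < 0)  # True False
```
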
     
    FariAnderson likes this.