Shader Graph - Getting local sprite UV from sprite sheet

Discussion in '2D' started by mbitzos, Apr 12, 2020.

  1. mbitzos (Joined: Jun 8, 2018 | Posts: 17)
    When making a shader in Shader Graph that is applied to a sprite sheet texture, I often want to do things with the UV. The problem is that the UV coords (0-1) cover the entire sprite sheet rather than just the current sprite being displayed.

    Example:
    Create a shader that simply clips the sprite at the halfway point (only displaying half of it). I would just take the UV, add a Step node, and multiply that by the _MainTex alpha channel.

    Does anyone know if Shader Graph (or Unity 2019 in general) has a built-in function/node for this?
    I know a common method is to calculate this yourself given the sprite rect information, then pass that to the shader via properties. I was hoping I wouldn't have to do this every time I want a shader with local UV logic applied to a sprite sheet texture, since it can get tedious.

    Thanks!
     
  2. Tom-Atom (Joined: Jun 29, 2014 | Posts: 153)
    I had a similar problem when doing this: http://sbcgames.io/sword-fortune-wip-waving-foliage-shader/
    In the end I encoded the needed information into a secondary texture (in my case into the red and green channels).
    You can encode the distance from the bottom in the range 0-1 (0 for the bottom of your sprite, 1 for the top). Then you can sample this texture at the same UV as the main texture to find how high the current fragment is above the bottom.
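A minimal sketch of baking such a distance-from-bottom texture, assuming you can enumerate each sprite's pixel rect in the sheet (the `HeightMapBaker` name and the `Rect[]` input are illustrative, not from the original post):

```csharp
using UnityEngine;
using System.IO;

public static class HeightMapBaker
{
    // Writes a secondary texture the same size as the sprite sheet, with each
    // sprite's normalized distance-from-bottom encoded in the red channel.
    public static void Bake(Texture2D sheet, Rect[] spriteRects, string outputPath)
    {
        Texture2D tex = new Texture2D(sheet.width, sheet.height, TextureFormat.RGBA32, false);

        foreach (Rect r in spriteRects)
        {
            for (int y = 0; y < (int)r.height; y++)
            {
                // 0 at the sprite's bottom row, 1 at its top row
                float h = r.height > 1 ? y / (r.height - 1) : 0f;
                byte v = (byte)Mathf.RoundToInt(h * 255f);
                for (int x = 0; x < (int)r.width; x++)
                    tex.SetPixel((int)r.x + x, (int)r.y + y, new Color32(v, 0, 0, 255));
            }
        }

        File.WriteAllBytes(outputPath, tex.EncodeToPNG());
        Object.DestroyImmediate(tex);
    }
}
```

Pixels outside the sprite rects are left at their default values here; a real baker would probably clear the texture first.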
     
  3. mbitzos (Joined: Jun 8, 2018 | Posts: 17)
    Interesting, so for the secondary texture, do you generate this via a C# script at runtime or is this something that is done manually?
     
  4. Tom-Atom (Joined: Jun 29, 2014 | Posts: 153)
    It is generated by a helper C# script, but not at runtime. With the helper script I create a texture which I import as an asset and use as the secondary texture for the sprite. It is ready to work with atlases/sprite sheets. Unfortunately, atlasing of secondary textures is not supported by Unity yet - they plan it for 2020.2 (https://forum.unity.com/threads/2019-2-0b1-secondary-textures-spriteatlas.676213/)

    To create and save the texture I use something like this (with the example code below I created a texture that encodes sin/cos values for fast reading in the fragment shader - just see how to access pixels and create whatever you need):

    Code (CSharp):
    // Requires: using UnityEngine; using System.IO;
    public void CreateSinCosTexture() {
        int width = 1024;

        Texture2D tex = new Texture2D(width, 1, TextureFormat.RGBA32, false);

        for (int i = 0; i < width; i++) {
            float angle = (float)i / width * 360;
            float rad = angle * Mathf.Deg2Rad;

            float sin = Mathf.Sin(rad);
            float cos = Mathf.Cos(rad);

            // remap from [-1, 1] to [0, 1] so the values fit into color channels
            sin = (sin + 1) / 2;
            cos = (cos + 1) / 2;

            Color32 color = new Color32((byte) (Mathf.RoundToInt(sin * 255) & 0xFF), (byte) (Mathf.RoundToInt(cos * 255) & 0xFF), 0, 255);

            tex.SetPixel(i, 0, color);
        }

        // Encode texture into PNG
        byte[] bytes = tex.EncodeToPNG();
        DestroyImmediate(tex);

        // write to a file
        File.WriteAllBytes("D:" + Path.DirectorySeparatorChar + "SinCos.png", bytes);
    }
     
    bearcoree and mbitzos like this.
  5. Xiromtz (Joined: Feb 1, 2015 | Posts: 65)
    If you know your sprite size and count (i.e. via a script or hardcoded), you should be able to simply calculate the UV coordinates for the corresponding sprites. You can calculate these in the Awake function or something like that, and there won't be much overhead at runtime.

    I did this for a rain animation that was fixed to the camera. I made a Shader Graph with a UI shader, so if you're using a sprite shader it might be different for you.

    For example, if you have 10x10 square sprites of the same size, the first sprite will be x(0-0.1), y(0-0.1), and so on. With differently sized sprites you'll have to hardcode a bit more, but it should be possible.

    Although it seems like a lot of work to get the pipeline going, the secondary texture solution looks nicer, though you'd need to create some tool to generate those secondary textures imo. Otherwise, hardcoding is probably less work.
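The uniform-grid case described above can be sketched directly (a rough illustration; the `GridUV` name is made up):

```csharp
using UnityEngine;

public static class GridUV
{
    // UV rect of tile (col, row) in a uniform cols x rows sheet.
    // Row 0 is assumed to be the bottom row, matching UV space.
    public static Rect TileRect(int col, int row, int cols, int rows)
    {
        float w = 1f / cols;
        float h = 1f / rows;
        return new Rect(col * w, row * h, w, h);
    }
}
```

For a 10x10 sheet, `TileRect(0, 0, 10, 10)` gives the x(0-0.1), y(0-0.1) rect from the example.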
     
    Deleted User and mbitzos like this.
  6. mbitzos (Joined: Jun 8, 2018 | Posts: 17)
    Thanks for posting the code, I understand now how you are doing this!
    Interesting solution - it looks like it can be pretty easily integrated into an automatic sprite importer or however you do it.

    Definitely will look into this!
     
  7. mbitzos (Joined: Jun 8, 2018 | Posts: 17)
    Yeah, I saw that Shader Graph actually has a node called "Flipbook" that I think does what you are saying (splits a sprite sheet into even segments given the dimensions). My main problem with this is that the sprite sheets I use most likely won't be exact tiles (the sprites have different dimensions). But that would be a good solution for tile-based sprites.

    Thanks for the reply!
     
  8. Xiromtz (Joined: Feb 1, 2015 | Posts: 65)
    I'm not 100% sure what your issue is at this point.
    Are you trying to apply algorithms within the shader that assume a UV of 0-1, where using fractions would be a problem? Or is it something different?

    You could, for example, add a script that contains the texture and information about each sprite within it - the min-max UV range of every single sprite. If you input this range to the shader, it can normalize the values to 0-1 and then correctly apply the algorithms.
    Just an idea.
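The normalization itself is just an inverse lerp per axis; a minimal sketch of the math (class and method names are made up, and Shader Graph's Remap node computes the same thing per fragment):

```csharp
using UnityEngine;

public static class UVRemap
{
    // Maps an atlas UV that lies inside [min, max] back to a local 0-1 range.
    public static Vector2 ToLocal(Vector2 atlasUV, Vector2 min, Vector2 max)
    {
        return new Vector2(
            Mathf.InverseLerp(min.x, max.x, atlasUV.x),
            Mathf.InverseLerp(min.y, max.y, atlasUV.y));
    }
}
```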
     
  9. mbitzos (Joined: Jun 8, 2018 | Posts: 17)
    Yeah, pretty much what I am trying to do.

    I've done this solution before with a script that just feeds the shader the current sprite texture information. It certainly isn't hard and it's manageable. (Tom-Atom's solution is also very elegant!)

    I was just wondering if this was already a built-in feature, as it doesn't really make sense to me why this isn't supported in the SpriteRenderer component - I would imagine the sprite renderer already has the necessary information to do this.

    I appreciate the reply!
     
  10. Tom-Atom (Joined: Jun 29, 2014 | Posts: 153)
    Problems I see here are:
    - if you use an atlas, you never know where a sprite will be placed. You would have to either use your own atlas solution or somehow hook into Unity to check when the atlas is rebuilt and then update your sprite metadata,
    - inputting sprite metadata into the shader would probably break batching - if you pass it as uniforms on a per-sprite basis. If you do not want to break the batch, you have to pass it as per-vertex data, but the problem is that SpriteRenderer does not allow you to pass additional data. You can only "hack" it and encode extra per-vertex data into rotation X, Y or position Z, because we are in 2D. You of course have to clear this in the vertex shader. The secondary texture is another "hack" - you can pass per-fragment data with it, and it is fast since there is no extra calculation, only a texture lookup.

    I think that SpriteRenderer has no information about position within the texture other than the UVs on the vertices. The problem is that in the vertex shader each vertex is processed individually, and you do not know whether it is a vertex at the top, bottom or middle. You cannot ask for the sprite's topmost vertex, etc., and the vertex shader does not even need it. So if you wanted to access some sprite "bounds" in the vertex shader, you can either pass them as a uniform - but that breaks the batch - or pass them as additional per-vertex data (with the problems described above); then the bounds are included in each vertex and you can access them in the vertex shader.
    Even if you passed extra vertex data into the vertex shader, there is currently the problem that Shader Graph does not support custom varyings (interpolators for passing information from the vertex into the fragment shader). You can use them if you write the shader in code, though. It makes a big difference whether you calculate something in the vertex shader or in the fragment shader: if your sprite is two triangles, the vertex shader has to process 6 vertices and can pass the interpolated result into the fragment shader. If you do the same calculation in the fragment shader, then for a 100x100 pixel sprite the same calculation can be repeated 10,000 times.
     
  11. Xiromtz (Joined: Feb 1, 2015 | Posts: 65)
    @Tom-Atom I'm guessing you're talking about this component?
    https://docs.unity3d.com/Manual/class-SpriteAtlas.html

    I've never used it before and I always do the atlas packing myself, since I've never had to optimize enough to need it automated. If you're using that component, my solution won't work, you are correct.

    Though I do believe there are programs that do the sprite packing for you and give you back an atlas in the form of a normal, compressed image (i.e. png, jpg, ...), allowing you to still know exactly where and how to access the UVs of single sprites.

    As I said, your solution is nicer but more work to set up. Your points on batches breaking, etc. are probably true. It depends on the problem and how much work one is willing to put in - "premature optimization is the root of all evil".
     
    Tom-Atom likes this.
  12. mbitzos (Joined: Jun 8, 2018 | Posts: 17)
    I guess the reason I thought the sprite renderer should have this information is: how else would it know which sprite to render from an entire sprite sheet? I assume there's some way behind the scenes that the sprite renderer displays only a segment of the sprite sheet (since, like you said, the shader doesn't have access to this). I have no idea how this works btw, so I'm just shooting out hypotheticals.

    Still sucks though, because this seems like something everyone working with sprite sheets and shaders would run into, would they not? I guess everyone just does their own "hacky" fixes to get this to work. Which leads to the question: if everyone is writing their own solutions to get this to work, why not at least explore this as a feature (in the form of a component, engine code, etc.)?

    Thanks for your detailed reply it was informative.
     
  13. Tom-Atom (Joined: Jun 29, 2014 | Posts: 153)
    Each sprite is similar to a 3D mesh, but flat. By similar I mean: it is a mesh that has a position for each vertex in object space and also a UV coordinate into the texture. These are fixed, and I believe it is Unity's job to (re)calculate them when you decide to use an atlas. By the time it reaches the game it is already calculated. (I will be happy if someone with better insight corrects or adjusts this.)

    Anyway, you can read both the model vertex positions and UVs through code. See Sprite: https://docs.unity3d.com/ScriptReference/Sprite.html - there are vertices, uv and triangles. These are passed to the shader. The shader does not need anything other than the UV to find the position in the texture. As vertices in the shader are processed individually, with no possibility of looking at any other vertex, you do not know whether a vertex with UV.y = 0.2456 is at the top, bottom or middle of the sprite - it is just one vertex of one of the triangles that form the whole mesh.
    If you want some information, like the top of the sprite in the atlas, then you have to iterate through Sprite.uv on the C# side and pass this information into the shader as a uniform. Such information is available to all vertices in the vertex shader and to all fragments in the fragment shader. But you have to change it for each sprite, and it breaks the batch ... 1 sprite = 1 draw call.
    Or do some "hack" and pass this information along with the position and UV of each vertex. Then this information is passed for every vertex (so if you have 20 vertices, it is passed 20 times), but each individual vertex, which knows nothing about the other vertices, can use it. So the main problem is that you can't simply add additional per-vertex data with SpriteRenderer (which you can do with MeshRenderer).
     
  14. Xiromtz (Joined: Feb 1, 2015 | Posts: 65)
    Yeah, Unity has these kinds of gaps, especially when it comes to 2D, sadly. The irony is that Unity is said to be the best 2D engine out there. For a new project I might consider making everything 3D with an orthographic camera: using planes with textures instead of sprites, you would get the same look and feel, but maybe more Unity features to work with...
    I've gone through a lot of issues with things that would seem obvious, but they are at least doing frequent engine updates, so maybe there will be a point in time where we won't have to do workarounds for everything.
     
  15. mbitzos (Joined: Jun 8, 2018 | Posts: 17)
    Oh, I understand now why it isn't as easy as I thought for the sprite renderer to just make this work. Thanks, it's always useful to learn more about how sprites work under the hood so I can understand their limitations better.
     
  16. GamerXP (Joined: Mar 22, 2014 | Posts: 74)
    I wonder if we can request a feature for sending the original sprite UVs as a second set of texture coordinates. This would help a lot of shaders that have to work with atlas-packed sprites. I pretty much always hit my head on this when writing shaders.
    A secondary texture with UV lookups will work of course (once they make atlases for secondary textures), but it will have worse performance in many ways.
     
  17. bearcoree (Joined: Mar 8, 2016 | Posts: 72)
    Just ran into this exact problem too.
    I have sprites in an atlas, and a shader needs to know the per-sprite top and bottom UVs.

    Having this information passed by the SpriteRenderer component as TEXCOORD3 or something would save so much work and make everything much cleaner.
     
  18. andiB (Joined: Jan 13, 2013 | Posts: 22)
    I don't understand how _MainTex (a sprite in an atlas) gets rendered correctly, given that it shares the same UV coords as the second texture I want to use in Shader Graph.

    So the information about the cropped UV space must be in the render pipeline somewhere. It would be awesome to get access to that cropped UV space... it's a problem I've been struggling with for days now :/
     
    Reahreic, dr4 and bearcoree like this.
  19. TimBur (Joined: Jan 17, 2013 | Posts: 35)
    Bump. Shader Graph and Sprite Atlases are both great tools. I wish the Unity devs would make some design changes so that these things work together more easily.

    The relevant entry in the official Unity Issue Tracker dismisses the problem, saying that all is working as intended - that this situation is "By Design." I find this a little disappointing. It should be on a to-do list somewhere, even if only a distant wishlist: https://issuetracker.unity3d.com/is...individual-sprite-when-entering-the-play-mode
     
    comealong, iDerp69, dr4 and 3 others like this.
  20. dr4 (Joined: Jan 14, 2015 | Posts: 106)
    Right now we had to remove all our shaders to improve the performance of our game with atlases; now it turns out we need to choose between atlases or shaders, because Unity does not consider that we may want to use them together. Dismissing something this important as "it is by design" is like having a bug in your game and writing it off as "it is a feature". I'm baffled that no one is looking into this at all.
     

    Immu likes this.
  21. Reahreic (Joined: Mar 23, 2011 | Posts: 254)

    This implies that handwritten shaders would work with atlases, and it's just a ShaderGraph thing. Or am I misunderstanding?
     
  22. dr4 (Joined: Jan 14, 2015 | Posts: 106)
    I can't tell, I have zero clue about how to hand-write shaders and have relied heavily on Shader Graph since it came out, and that hit hard when we started porting to Nintendo Switch.
     
  23. Reahreic (Joined: Mar 23, 2011 | Posts: 254)
    HLSL is certainly an acquired taste, that's for sure.
     
  24. Lancival (Joined: Oct 23, 2018 | Posts: 2)
    After struggling with this problem for some time, I found a fairly simple workaround: you can grab the UV coordinates of a sprite in a script, and provide them to a Shader Graph via a MaterialPropertyBlock. In Shader Graph, you can then remap the UV coordinates with the min/max UV values the sprite covers. Here's a sample script to do this:
    Code (CSharp):
    using UnityEngine;

    [RequireComponent(typeof(SpriteRenderer))]
    public class UVFromSprite : MonoBehaviour
    {
        void Awake()
        {
            SpriteRenderer spriteRenderer = GetComponent<SpriteRenderer>();

            // Find the min/max UVs the sprite's vertices cover in the atlas
            float minU = 1;
            float maxU = 0;
            float minV = 1;
            float maxV = 0;
            foreach (Vector2 uv in spriteRenderer.sprite.uv) {
                minU = Mathf.Min(uv.x, minU);
                maxU = Mathf.Max(uv.x, maxU);
                minV = Mathf.Min(uv.y, minV);
                maxV = Mathf.Max(uv.y, maxV);
            }

            MaterialPropertyBlock block = new MaterialPropertyBlock();
            spriteRenderer.GetPropertyBlock(block);
            block.SetVector("U", new Vector2(minU, maxU));
            block.SetVector("V", new Vector2(minV, maxV));
            spriteRenderer.SetPropertyBlock(block);
        }
    }
    For the sake of simplicity, the sample script just calculates the values in Awake(), but it should only take minor modification to have the script precalculate the correct values in the editor and avoid the extra computation during gameplay.
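One possible shape of that editor-time precalculation (a sketch only: the component name and the OnValidate approach are assumptions, not part of the original script):

```csharp
using UnityEngine;

// Caches the UV bounds as serialized fields in the editor via OnValidate,
// so at runtime Awake only has to apply the already-computed values.
[RequireComponent(typeof(SpriteRenderer))]
public class UVFromSpriteCached : MonoBehaviour
{
    [SerializeField] private Vector2 u;  // (minU, maxU)
    [SerializeField] private Vector2 v;  // (minV, maxV)

    void OnValidate()
    {
        Sprite sprite = GetComponent<SpriteRenderer>().sprite;
        if (sprite == null) return;

        u = new Vector2(1f, 0f);
        v = new Vector2(1f, 0f);
        foreach (Vector2 uv in sprite.uv)
        {
            u = new Vector2(Mathf.Min(uv.x, u.x), Mathf.Max(uv.x, u.y));
            v = new Vector2(Mathf.Min(uv.y, v.x), Mathf.Max(uv.y, v.y));
        }
    }

    void Awake()
    {
        SpriteRenderer sr = GetComponent<SpriteRenderer>();
        MaterialPropertyBlock block = new MaterialPropertyBlock();
        sr.GetPropertyBlock(block);
        block.SetVector("U", u);
        block.SetVector("V", v);
        sr.SetPropertyBlock(block);
    }
}
```

Note this only helps if the sprite shown at runtime is the one that was assigned in the editor; animated sprites would still need the runtime recalculation.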
     
    vambier and leohilbert like this.
  25. GamerXP (Joined: Mar 22, 2014 | Posts: 74)
    Hmm, sounds pretty easy to work with. The only issue is... if I remember correctly, adding a MaterialPropertyBlock breaks batching for that renderer, so it's not great for performance.
    But if we go with a similar solution, the SRP Batcher may actually work better for such cases - it batches different materials with slightly different properties, as long as they actually support SRP.
     
    leohilbert likes this.
  26. leohilbert (Joined: Nov 28, 2015 | Posts: 18)
    Yeah, this will break batching, but I don't see a way around it apart from the method mentioned earlier where you generate a second texture to sample the local UV from. However, that requires a lot of engineering effort and adds tons of new textures, depending on how many of your sprite sheets need the effect.
    If used in moderation, I think Lancival's approach of calculating the UV range via script is a cool solution! In case somebody wants to plug this into Shader Graph:
    [attached image: uv_remap_example.png]
    You can use the resulting SpriteUV in place of UV0. Keep in mind that the SampleTexture2D node still needs to use the original UV0!

    When using this with sprite animations, you might run into a problem where your UV is offset slightly for each frame, because the frames are cropped and therefore slightly different in size.
    For this I came up with a solution where you define a fixed width & height for the UV and set it based on the pivot point of the sprite. That way the UV is always the same "size" and always originates at the pivot point:

    Code (CSharp):
    private Vector4 CalcUvRange(Sprite sprite)
    {
        Vector2 textureSize = new(sprite.texture.width, sprite.texture.height);
        Vector2 fixedSize = spriteUvSize * sprite.pixelsPerUnit / textureSize;

        Vector2 spriteUvPos = CalcSpriteUvPos(sprite);
        spriteUvPos += sprite.pivot / textureSize;
        spriteUvPos += spriteUvOriginOffset * fixedSize;

        return new Vector4(
            spriteUvPos.x, spriteUvPos.x + fixedSize.x,
            spriteUvPos.y, spriteUvPos.y + fixedSize.y
        );
    }

    private static Vector2 CalcSpriteUvPos(Sprite sprite)
    {
        Vector2 uvPos = Vector2.one;
        foreach (Vector2 uv in sprite.uv)
        {
            uvPos.x = Mathf.Min(uv.x, uvPos.x);
            uvPos.y = Mathf.Min(uv.y, uvPos.y);
        }

        return uvPos;
    }
    vambier, MaxPirat and comealong like this.
  27. vambier (Joined: Oct 1, 2012 | Posts: 102)
    Thanks for this, I think Lancival's solution is quite elegant and it works perfectly! And thanks for clarifying how to use the resulting values in Shader Graph!
    I do have a question about your code though. First, a small fix for textureSize: it should be new Vector2() :)
    But there are two variables which are not declared: spriteUvSize and spriteUvOriginOffset. Could you add those to your code? I can figure it out myself, but it's also easier for others who stumble upon this thread in the future :)
     
  28. leohilbert (Joined: Nov 28, 2015 | Posts: 18)
    Hi, "new(..)" is called a "target-typed new expression" and has been valid syntax since C# 9. :) The type declaration already provides the Vector2 type, so writing it again for the constructor call is not necessary.

    As for spriteUvSize and spriteUvOriginOffset, they are both Vector2 fields, and where they come from depends heavily on your code structure and how you want to use the snippet. I use this method in a MonoBehaviour and have them defined as SerializeFields like this:
    Code (CSharp):
    [SerializeField] private Vector2 fixedUvSize;
    [SerializeField] private Vector2 spriteUvOriginOffset;
    My snippet was not supposed to be a copy & paste solution, but an approach to make it work in your own code base. You basically need to configure both values so it looks good with your current animation; there is no "correct" way to calculate them, as it depends on multiple frames.

    With that said you can calculate the fixedUvSize & spriteUvOriginOffset for the current Sprite like this:
    Code (CSharp):
    [Button, UsedImplicitly]
    private void SetFixedParamsFromCurrentSprite()
    {
        Sprite spr = spriteRenderer.sprite;
        Vector4 calculatedUVRange = CalcSpriteUvRange(spr);
        Vector2 minUV = new(calculatedUVRange.x, calculatedUVRange.z);
        Vector2 maxUV = new(calculatedUVRange.y, calculatedUVRange.w);

        Vector2 textureSize = new(spr.texture.width, spr.texture.height);
        Vector2 uvSize = maxUV - minUV;
        fixedUvSize = uvSize * textureSize / spr.pixelsPerUnit;

        Vector2 spriteUvPos = CalcSpriteUvPos(spr);
        spriteUvPos += spr.pivot / textureSize;
        spriteUvOriginOffset = (minUV - spriteUvPos) / uvSize;
    }

    private static Vector4 CalcSpriteUvRange(Sprite sprite)
    {
        Vector4 range = new(1, 0, 1, 0);
        foreach (Vector2 uv in sprite.uv)
        {
            if (uv.x < range.x) range.x = uv.x;
            if (uv.x > range.y) range.y = uv.x;
            if (uv.y < range.z) range.z = uv.y;
            if (uv.y > range.w) range.w = uv.y;
        }

        return range;
    }

    private static Vector2 CalcSpriteUvPos(Sprite sprite)
    {
        Vector2 uvPos = Vector2.one;
        foreach (Vector2 uv in sprite.uv)
        {
            if (uv.x < uvPos.x) uvPos.x = uv.x;
            if (uv.y < uvPos.y) uvPos.y = uv.y;
        }

        return uvPos;
    }
    Just keep in mind that this is only the UV for this one sprite, not for every sprite in your animation.
     
    Last edited: Oct 10, 2023