
Question Converting World Space coords to local UVs on UI element (built-in, frag shader)

Discussion in 'Shaders' started by Senshi, Oct 6, 2021.

  1. Senshi

    Senshi

    Joined:
    Oct 3, 2010
    Posts:
    557
    Hi all,

    I'm sure I'm missing something obvious here, but I've been banging my head against this for a while now.

    In my UI shader I am using a global Vector to mask out an area. I do this with the help of
    float d = saturate(distance(IN.worldPosition.xy, _Target.xy) * _MaskSize);
    in my fragment shader.

    However, for further effects I also need to know which local UV coordinates correspond to this world space
    _Target.xy
    position. I.e.: I need to know, locally, where this target is. How would I convert between the two spaces here?

    Thanks in advance!

    EDIT: I got my specific use-case working, but would still love to hear an answer to this out of curiosity :)
     
    Last edited: Oct 6, 2021
  2. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,329
    For something like UI, you need to know the transform from "world" space to "UV" space, which is going to be the UI object's local space with some additional scaling and positional offset to account for the UVs. The solution is to calculate this in C# and pass it to the material. But this only really works for UI elements, which are essentially guaranteed to be flat, uniformly UV'd quads or sprites.

    On more complex geometry you'd need this information per triangle, as the UVs are arbitrary. At that point the "solution" basically requires having the entire mesh's data accessible in some form to search through, and with arbitrary meshes there's not necessarily one solution, since UVs and geometry might overlap.
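    The flat-quad case bgolus describes boils down to a scale-and-offset between the quad's world-space corners and its 0..1 UV range. Here's a minimal sketch of that arithmetic (Python for illustration; the names and the axis-aligned-rect assumption are mine, not from the thread). In Unity you'd compute the equivalent values in C#, e.g. from RectTransform.GetWorldCorners, and pass them to the material with Material.SetVector so the fragment shader can do uv = worldPos.xy * scale + offset:

    ```python
    # World -> UV mapping for a flat, uniformly UV'd quad.
    # Assumes an axis-aligned rect whose corners rect_min / rect_max
    # correspond to UV (0, 0) and UV (1, 1) respectively.

    def world_to_uv(world_xy, rect_min, rect_max):
        """Map a world-space XY point into the quad's 0..1 UV space."""
        u = (world_xy[0] - rect_min[0]) / (rect_max[0] - rect_min[0])
        v = (world_xy[1] - rect_min[1]) / (rect_max[1] - rect_min[1])
        return (u, v)

    # Example: a quad spanning world x in [2, 6], y in [1, 3].
    # Its centre (4, 2) maps to UV (0.5, 0.5).
    print(world_to_uv((4.0, 2.0), (2.0, 1.0), (6.0, 3.0)))  # -> (0.5, 0.5)
    ```

    The same mapping applied to a world-space _Target.xy gives the target's position in the element's local UV space, which is what the question asks for.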

    Basically you're describing the primary problem any 3D model painting program has to solve. Some of them handle it by punting on the problem and baking the mesh data into 3D look-up textures (aka "voxels") that you're actually painting, then reprojecting back onto the mesh. And others solve it by being dog slow and buggy.
     
    Senshi likes this.