
UV coordinates limited 0...1 -- why?

Discussion in 'Scripting' started by syscrusher, Sep 20, 2018.

  1. syscrusher

    syscrusher

    Joined:
    Jul 4, 2015
    Posts:
    1,104
    According to the Unity Manual page, "Anatomy of a Mesh", UV coordinates are limited in range from zero to one (float). Having created many models in Blender, I am curious why this assumption is made in Unity.

    Below is an example of a 3D model (viewed from above, so almost an orthogonal view) showing some polygons in situ on the left, and the corresponding UV map on the right.

    [Attached image: UV-CoordsDemo-2.jpg]

    As you can see, the UV map is deliberately outside the range 0...1 to cause the texture bitmap to be visually smaller, effectively forcing some tiling. This is not uncommon with tileable materials that aren't designed to directly 1:1 map to specific geometry, but rather are generic surfaces such as painted walls, concrete, asphalt pavement, or dirt.

    I am working with two assets from different publishers at the moment; one asset produces UV maps that have coordinates outside the 0...1 range. The other asset throws warnings or errors when it tries to process this geometry, and the author (correctly) cites the Unity manual for why these UVs are invalid.

    As an aside, I consulted the documentation for the Mesh class, as well as the linked page for the "uv" property, and there is no mention of this 0...1 limit.

    I realize the same effect can be achieved by using smaller UV coordinates and then scaling the texture tiling in the Unity material, but that would affect all polygons where that material is used. In the example above, only this one area is to have the intentionally scaled texture.
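
    To make the contrast concrete, here's a rough sketch of the two approaches (hypothetical mesh, material, and numbers, not code from either asset):

    Code (CSharp):
        using UnityEngine;

        public class UvTilingExample : MonoBehaviour
        {
            public MeshFilter target;      // hypothetical mesh to re-tile
            public Material wallMaterial;  // hypothetical shared material

            void Start()
            {
                // Approach 1: scale the tiling on the material. Simple, but it
                // affects every polygon (and every mesh) that uses this material.
                wallMaterial.mainTextureScale = new Vector2(4f, 4f);

                // Approach 2: push the UVs of just this mesh outside 0..1 so the
                // texture repeats only here, leaving the shared material alone.
                // (Requires the texture's wrap mode to be Repeat.)
                Mesh mesh = target.mesh;
                Vector2[] uvs = mesh.uv;
                for (int i = 0; i < uvs.Length; i++)
                    uvs[i] *= 4f;          // 0..1 becomes 0..4, i.e. four repeats
                mesh.uv = uvs;
            }
        }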

    What is the rationale behind this specification for UV coordinates in the Mesh class? Does Unity actually require or assume it, or is the "Anatomy of a Mesh" page simply wrong? I'm looking for clarification on how Unity handles UVs internally, and whether there is a recommended way to do this sort of thing, so I can feed it back to the asset developers. Thanks!
     
  2. halley

    halley

    Joined:
    Aug 26, 2013
    Posts:
    2,444
    Unity chose to use 0-to-1 ranges for UVs so that they would be independent of the pixel dimensions of the texture. When you're working with mipmaps or downsampled texture import settings, it's useful to ignore the pixel dimensions of the texture in your code and assets; Unity simply multiplies the 0-to-1 value by the actual pixel dimensions in use at the time to get the texture pixels required.

    Now, there's really no technical reason you couldn't give UVs outside the 0-to-1 range. Did you set the texture wrapping mode to "clamp" instead of "repeat"? I didn't realize Unity would whine at you for UVs outside the range.
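
    To put rough (made-up) numbers on the first part:

    Code (CSharp):
        using UnityEngine;

        public static class NormalizedUvExample
        {
            public static void Show()
            {
                // A normalized UV is a fraction of whatever resolution is
                // actually in use, so the same coordinate works for the full
                // texture and for any mip level.
                float u = 0.25f;

                int fullWidth = 1024;     // imported texture width
                int mipWidth  = 256;      // width at some smaller mip level

                Debug.Log(u * fullWidth); // 256 -> a quarter of the way across
                Debug.Log(u * mipWidth);  //  64 -> still a quarter of the way across
            }
        }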
     
    syscrusher likes this.
  3. syscrusher

    syscrusher

    Joined:
    Jul 4, 2015
    Posts:
    1,104
    Thanks for the reply; you've given me an idea of what may be going on here with respect to the manual's statement.

    I understand the pixel-independent part of your comment. Even within Blender (I can't speak for other 3D modelers), the UV coordinates are a relative value, so what you say rings true. But, as you note, the fact that they are normalized, with 1.0 spanning the texture bitmap corner to corner, does not imply that the coordinates have to stay within that range.

    Now I wonder if what Unity meant to say here is that they are scaled relative to that range, rather than that they are limited to that range.

    The texture mode is set to repeat in this case.

    Unity itself isn't complaining; it's that Asset A generates a model with UVs outside the 0...1 range, and Asset B complains about it, and we're trying to figure out what the underlying assumptions or constraints of the Unity engine are with respect to these values.

    Thanks again for the comment. We may just be dealing with a phrasing issue in the manual.
     
  4. Owen-Reynolds

    Owen-Reynolds

    Joined:
    Feb 15, 2012
    Posts:
    1,998
    Ignore that page -- you already know more about how UVs work. The stuff there is for complete beginners in 3D modeling. Models in Unity, including UV coords, work the normal way they do everywhere else.

    I use UV coords in Unity outside of 0-1 all the time. Unity tiles them exactly as you'd think. The author of that second package is probably making some assumptions and adding extra checks. Most hand-painted 3D models never tile, never even re-use the same area in two places, and never want to go off or even near an edge -- think of a standard box-unwrap that looks like an unfolded paper box. For those, UVs outside 0-1 mean someone made a mistake. Some pros only make models that way.
     
    syscrusher likes this.
  5. Eric5h5

    Eric5h5

    Volunteer Moderator

    Joined:
    Jul 19, 2006
    Posts:
    32,401
    UV coords aren't limited to 0..1. They're effectively always in that range because of wrapping; i.e., 2.0 is the same as 1.0. However, certain GPUs have (or had) precision limitations where going too far from 0..1 resulted in distortion (old iPhones, for example), so staying close to 0..1 was advisable; I'm not sure how relevant that is these days.
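
    As a back-of-envelope sketch of the precision point (this assumes the old hardware interpolated UVs at roughly half precision, which is the usual explanation, not something the docs spell out):

    Code (CSharp):
        using UnityEngine;

        public static class HalfPrecisionUvSketch
        {
            // Spacing between adjacent half-precision values (10-bit mantissa)
            // near x -- i.e. the smallest UV step the interpolator can represent.
            static float HalfUlp(float x)
            {
                int exponent = Mathf.FloorToInt(Mathf.Log(Mathf.Abs(x), 2f));
                return Mathf.Pow(2f, exponent - 10);
            }

            public static void Report()
            {
                // Near 0..1 the step is ~0.0005, comfortably sub-texel on a
                // 1024-pixel texture.
                Debug.Log(HalfUlp(0.9f));   // ~0.00049

                // Near 1000 the step is 0.5 UV units, i.e. roughly 512 texels
                // on that same texture -- hence the visible distortion.
                Debug.Log(HalfUlp(1000f));  // 0.5
            }
        }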

    --Eric
     
    syscrusher likes this.
  6. syscrusher

    syscrusher

    Joined:
    Jul 4, 2015
    Posts:
    1,104
    Thanks for the reply. The box unwrap analogy makes sense -- I was actually wondering if the hand-painted modeler unwrapping was part of the rationale for that text in the docs, so this corresponds to my thoughts as well.
     
  7. syscrusher

    syscrusher

    Joined:
    Jul 4, 2015
    Posts:
    1,104
    Ahhhh....thanks. I know about the floating point precision issues in large open-world terrain maps, but I didn't realize it was an issue in UV coordinates that get too large. The ones I'm seeing are small numbers, in the "ones and tens" and not "thousands".
     
  8. Owen-Reynolds

    Owen-Reynolds

    Joined:
    Feb 15, 2012
    Posts:
    1,998
    That explanation, which I realize is the standard one, always confused me. Say an edge starts at 0.9 and ends at 1.1. If you're not careful, you imagine travelling backwards 0.8 units through the center to 0.1.

    I find it easier to think of the original texture as tiled, and texture coords always as their assigned values. 1.1 is 1.1, which just happens to be on the second copy of the texture.
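
    A toy CPU-side version of that mental model (the GPU interpolates the UV per fragment and the sampler applies the wrap, so "repeat" is effectively a frac() at sample time):

    Code (CSharp):
        using UnityEngine;

        public static class EdgeWrapSketch
        {
            public static void Walk()
            {
                // Interpolate a UV across an edge from 0.9 to 1.1, the way the
                // rasterizer would per fragment, then wrap at sample time.
                for (int i = 0; i <= 4; i++)
                {
                    float uv = Mathf.Lerp(0.9f, 1.1f, i / 4f);  // 0.9, 0.95, 1.0, 1.05, 1.1
                    float wrapped = Mathf.Repeat(uv, 1f);       // 0.9, 0.95, 0.0, 0.05, 0.1
                    Debug.Log(uv + " samples at " + wrapped);
                }
                // The sample position never travels backwards through the middle
                // of the texture; it just walks forward onto the next copy.
            }
        }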
     
    jeffdyerSRTMarine likes this.
  9. Eric5h5

    Eric5h5

    Volunteer Moderator

    Joined:
    Jul 19, 2006
    Posts:
    32,401
    No, those are totally separate issues. The problem with large UV coordinates is texture distortion on certain old iPhone chips; I don't know if that's the case on modern ones.

    --Eric
     
    syscrusher likes this.
  10. halley

    halley

    Joined:
    Aug 26, 2013
    Posts:
    2,444
    If your "Asset B" which complains about such things is a texture atlassing tool, I could see why it would be a little concerned with such coordinates. In those cases, it will have to do the shrinking-down-and-artificially-tiling the texture to remain compatible, at a loss of texture space and precision if this was not actually an unintentional oversight on the part of "Asset A."
     
  11. syscrusher

    syscrusher

    Joined:
    Jul 4, 2015
    Posts:
    1,104
    Oh, I understand that after your post -- I just meant that I didn't understand it *before*. :)
     
  12. syscrusher

    syscrusher

    Joined:
    Jul 4, 2015
    Posts:
    1,104
    Asset A is Archimatix, and Asset B is the new Bakery GPU lightmapper. It appears Archimatix is not getting good auto-generated UV2s for lightmapping because of (possibly) a bug in Unity's "Unwrapping" utility class. We're investigating. In the meantime, Bakery tries to gracefully fall back to using UV1 (aka "UV") for the lightmaps, which is a sensible workaround, but Bakery doesn't like the coordinates outside 0...1. The author of Bakery was kind enough to alter the code to warn about the UVs but not actually *fail*, which has got my test scene working. My goal with this thread was to figure out how things are *supposed* to work, so I can give more meaningful test reports back to the two developers.
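
    For reference, the editor API in question is UnityEditor.Unwrapping; this is roughly the kind of throwaway script I've been using to regenerate the UV2s and eyeball them while testing (a debugging sketch of my own, not code from either asset):

    Code (CSharp):
        #if UNITY_EDITOR
        using UnityEditor;
        using UnityEngine;

        public static class Uv2DebugSketch
        {
            [MenuItem("Tools/Regenerate And Check UV2")]
            static void RegenerateAndCheck()
            {
                var go = Selection.activeGameObject;
                var mf = go != null ? go.GetComponent<MeshFilter>() : null;
                if (mf == null) { Debug.LogWarning("Select an object with a MeshFilter."); return; }

                Mesh mesh = mf.sharedMesh;

                // Ask Unity to auto-generate the lightmap UVs (UV2).
                Unwrapping.GenerateSecondaryUVSet(mesh);

                // Sanity check: lightmap UVs are expected to stay inside 0..1.
                foreach (Vector2 uv in mesh.uv2)
                {
                    if (uv.x < 0f || uv.x > 1f || uv.y < 0f || uv.y > 1f)
                        Debug.LogWarning("UV2 outside 0..1: " + uv, mf);
                }
            }
        }
        #endif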

    Both of the devs are trying to be helpful, but Unity's documentation for UV coordinates confused everyone involved. Thanks for all the very fast replies!
     
  13. halley

    halley

    Joined:
    Aug 26, 2013
    Posts:
    2,444
    Hey, progress is built on confusion and complaints. :)

    It would make sense for a lightmapper to be concerned about its own UVs, but kinda odd that it would complain about the existing UVs in another map. A lightmapper's UVs should also not allow polys to overlap in UV space, but that's normal and common in visual textures, such as allowing one eye to be painted for both eyes of a character, or many polygons to draw from the same leaf in a tree. Maybe the lightmapper was hoping to recycle the original UV map if it complied with those constraints before adding a new UV map. Glad you worked it out with the developer.
     
  14. syscrusher

    syscrusher

    Joined:
    Jul 4, 2015
    Posts:
    1,104
    Agreed on all points, especially the overlapping in visual UVs. I use that trick all the time! ;)

    The way I understand the lightmapper dev, the coordinate-range validation was originally just an attempt to build good-quality, defensively written code that validates its input data. They saw the mesh article, interpreted it as stating a data requirement, and validated accordingly. Also, by testing with a customized shader, we've learned that the automated UV2 unwrapping from Unity's Unwrapping class produces UV2s that slightly overlap, which is also not cool. We're working to pin that down more accurately for a meaningful bug report to the Unity team.
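
    In case anyone wants to reproduce the overlap symptom, this is the sort of coarse check we've been running while testing (my own rough debugging sketch, not code from either asset; triangles that merely share an edge will both touch the border cells, so only sizeable counts mean anything):

    Code (CSharp):
        using UnityEngine;

        public static class Uv2OverlapCheck
        {
            // Rasterize each UV2 triangle onto a coarse grid and count cells
            // claimed by more than one triangle. Deliberately rough.
            public static int CountContestedCells(Mesh mesh, int grid = 128)
            {
                Vector2[] uv = mesh.uv2;
                int[] tri = mesh.triangles;
                int[] owner = new int[grid * grid];   // 0 = empty, else triangle id + 1
                int contested = 0;

                for (int t = 0; t < tri.Length; t += 3)
                {
                    Vector2 a = uv[tri[t]], b = uv[tri[t + 1]], c = uv[tri[t + 2]];
                    int minX = Mathf.Clamp(Mathf.FloorToInt(Mathf.Min(a.x, b.x, c.x) * grid), 0, grid - 1);
                    int maxX = Mathf.Clamp(Mathf.CeilToInt(Mathf.Max(a.x, b.x, c.x) * grid), 0, grid - 1);
                    int minY = Mathf.Clamp(Mathf.FloorToInt(Mathf.Min(a.y, b.y, c.y) * grid), 0, grid - 1);
                    int maxY = Mathf.Clamp(Mathf.CeilToInt(Mathf.Max(a.y, b.y, c.y) * grid), 0, grid - 1);

                    for (int y = minY; y <= maxY; y++)
                    for (int x = minX; x <= maxX; x++)
                    {
                        Vector2 p = new Vector2((x + 0.5f) / grid, (y + 0.5f) / grid);
                        if (!PointInTriangle(p, a, b, c)) continue;
                        int cell = y * grid + x;
                        if (owner[cell] != 0 && owner[cell] != t + 1) contested++;
                        owner[cell] = t + 1;
                    }
                }
                return contested;
            }

            static float Cross(Vector2 u, Vector2 v) { return u.x * v.y - u.y * v.x; }

            static bool PointInTriangle(Vector2 p, Vector2 a, Vector2 b, Vector2 c)
            {
                float d1 = Cross(p - a, b - a), d2 = Cross(p - b, c - b), d3 = Cross(p - c, a - c);
                bool hasNeg = d1 < 0f || d2 < 0f || d3 < 0f;
                bool hasPos = d1 > 0f || d2 > 0f || d3 > 0f;
                return !(hasNeg && hasPos);
            }
        }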