[Resolved] Texture repeats itself within single border triangles

Discussion in 'Shaders' started by hecubah, Dec 29, 2020.

  1. hecubah (Joined: Jul 30, 2017 | Posts: 5)

    I am creating a visualization tool for some hobby research into planets, and currently I'm struggling with a UV texture problem.

    Problem: I want to map a texture onto a sphere using a custom mesh. After recalculating the UVs for my surface shader, the texture is displayed correctly except at the horizontal border. It turns out the triangles that straddle the border are painted with the whole horizontal range of the texture, creating a 'seam' (see Fig. 1). The texture is just for testing [1].

    fq1.png
    Fig. 1: Seam on the texture border.​

    Background: The mesh is a Delaunay triangulation of Fibonacci spherical samples [2]. Using internal scene data, I first create a texture, which is then UV-mapped onto the surface of the sphere. The mapping simply translates the spherical coordinates (y and z axes switched, because Unity) [3] linearly: [phi: 0..2*pi; theta: 0..pi] -> [x: 0..1; y: 1..0]. The need for a texture arises from the fact that I want to show more detail than the standard uint16 maximum vertex count allows me to render with plain vertex/fragment shading. For various reasons I really need to write a custom surface shader.
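    For context, the sample points come from a golden-angle Fibonacci lattice; a minimal sketch of that sampling from [2] (not my exact code, just the idea) looks like this:
    Code (CSharp):
    // Minimal sketch of a Fibonacci (golden-angle) lattice on the unit sphere, after [2].
    // Note that no sample lands exactly on a pole, thanks to the half-step offset in y.
    using System.Collections.Generic;
    using UnityEngine;

    public static class FibonacciSphere
    {
        public static List<Vector3> Sample(int count)
        {
            var points = new List<Vector3>(count);
            float goldenAngle = Mathf.PI * (3f - Mathf.Sqrt(5f)); // ~2.39996 rad
            for (int i = 0; i < count; i++)
            {
                // y runs from just below +1 to just above -1
                float y = 1f - 2f * (i + 0.5f) / count;
                float radius = Mathf.Sqrt(1f - y * y); // radius of the latitude circle
                float phi = goldenAngle * i;           // longitude advances by the golden angle
                points.Add(new Vector3(radius * Mathf.Cos(phi), y, radius * Mathf.Sin(phi)));
            }
            return points;
        }
    }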

    Details: I tested the UV value ranges and they seem correct: y goes from 0 to 1, as does x. The obvious problem is where x jumps from something like 0.98 straight to 0.05. These vertices are exactly the points where the texture suddenly repeats itself within a single triangle. Also, when I put the same custom shader with the same texture on the built-in Unity 3D sphere, it renders correctly without the seam (see Fig. 2). This leads me to conclude that either I need to configure the shader further or my UV mapping is wrong. The weird part is that when I use the same UV mapping to create a procedural texture, the texture comes out fine (on a plane).

    fq2.png
    Fig.2: No seam on built-in 3D sphere​

    My uv-mapping function:
    Code (CSharp):
    // out parameters are assigned to a GameObject Renderer sharedMesh
    public void TopologyToMeshUV(out Vector3[] vertices_array, out int[] triangles_array, out Vector2[] uvs_array)
    {
        // m_Vertices are Vector3 points in XYZ space, lying on the surface of a sphere
        Vector3[] vector_wip_array = m_Vertices.ToArray();
        Vector2[] uvs_wip_array = new Vector2[vector_wip_array.Length];
        // calculate uv for every point
        for (int i = 0; i < vector_wip_array.Length; i++)
        {
            // horizontal angle - azimuth (longitude)
            float phi_angle_norm = Mathf.Atan2(vector_wip_array[i].z, vector_wip_array[i].x) / (2 * Mathf.PI);
            // vertical angle - polar angle measured from the north pole
            float theta_angle_norm = Mathf.Acos(vector_wip_array[i].y / vector_wip_array[i].magnitude) / Mathf.PI;
            // normalized phase shift after atan2; if omitted, the seam only moves
            if (phi_angle_norm < 0)
            {
                phi_angle_norm += 1;
            }
            // direct mapping
            uvs_wip_array[i].x = phi_angle_norm;
            // reverse mapping, uv.y starts from the bottom of the texture
            uvs_wip_array[i].y = 1 - theta_angle_norm;
        }

        // ... the rest is just for mesh assignment ...
        // copy reference
        vertices_array = vector_wip_array;
        // copy reference
        uvs_array = uvs_wip_array;
        // number of triangles on the sphere
        int triangles_count = m_Triangles.Count;
        // for filling the triangle vertex indices
        triangles_array = new int[3 * triangles_count];
        for (int i = 0; i < triangles_count; i++) // filling the indices
        {
            triangles_array[3 * i] = m_Triangles[i].m_A;     // first vertex index
            triangles_array[3 * i + 1] = m_Triangles[i].m_B; // second vertex index
            triangles_array[3 * i + 2] = m_Triangles[i].m_C; // third vertex index
        }
    }
    My surface shader (pretty much the basic one):
    Code (CSharp):
    Shader "Custom/SphereTextureShader" {
        Properties
        {
            _Color("Color", Color) = (1,1,1,1)
            _MainTex("Albedo (RGB)", 2D) = "white" {}
            _Glossiness("Smoothness", Range(0,1)) = 0.5
            _Metallic("Metallic", Range(0,1)) = 0.0
        }
        SubShader
        {
            Tags { "RenderType" = "Opaque" }
            LOD 200

            CGPROGRAM
            #pragma surface surf Standard fullforwardshadows
            #pragma target 3.0

            sampler2D _MainTex;

            struct Input
            {
                float2 uv_MainTex;
            };

            half _Glossiness;
            half _Metallic;
            fixed4 _Color;

            UNITY_INSTANCING_BUFFER_START(Props)
            UNITY_INSTANCING_BUFFER_END(Props)

            void surf(Input IN, inout SurfaceOutputStandard o)
            {
                fixed4 c = tex2D(_MainTex, IN.uv_MainTex);
                o.Albedo = c.rgb * _Color.rgb;
                o.Metallic = _Metallic;
                o.Smoothness = _Glossiness;
                o.Alpha = c.a * _Color.a;
            }
            ENDCG
        }
        FallBack "Diffuse"
    }
    Sources:

    [1] Maps of the World. SURFERTODAY [online]. 2020. [Accessed 29th December 2020]. Available from: https://www.surfertoday.com/environment/maps-of-the-world

    [2] KEINERT, Benjamin, Matthias INNMANN, Michael SÄNGER and Marc STAMMINGER. Spherical fibonacci mapping. ACM Transactions on Graphics [online]. 2015, 34(6), 1-7 [cit. 2020-12-29]. ISSN 0730-0301. doi:10.1145/2816795.2818131

    [3] Spherical coordinate system. Wikipedia: the free encyclopedia [online]. San Francisco (CA): Wikimedia Foundation, 2001- [cit. 2020-12-29]. Available from: https://en.wikipedia.org/wiki/Spherical_coordinate_system
     
    Last edited: Dec 29, 2020
  2. AlexTorbin (Joined: Dec 12, 2019 | Posts: 48)

    Sorry, I didn't read the whole text, but most likely you need to create a geometry seam. It looks like all the vertices on your sphere are merged, but a single vertex can't have two UVs in the same set. So you need to manually split verts along some edges to create a texture seam, or multiple seams, depending on the texture. upload_2020-12-29_23-13-37.png
     
    hecubah likes this.
  3. bgolus (Joined: Dec 7, 2012 | Posts: 12,243)

    As you already figured out, if you UV wrap a sphere (or really any mesh) with a circular UV, at some point the UV jumps from almost 1.0 back to 0.0 in the space of a single triangle. You also noticed that the default Unity meshes don't have this problem. Indeed, if you recreate your mesh in an external tool and use its built-in UV unwrapping tools, it won't be an issue for that mesh either.

    The reason is that there's a seam in those meshes. There is no point where the UVs go "backwards"; instead there's a line of vertices that are doubled. The UVs for the triangles on one side go from something like 0.95 to 1.0, and on the other side they go from 0.0 to 0.05. These duplicate-vertex seams are the standard way any kind of discontinuity in mesh data is handled.* Hard-edged normals? Per-face colors? Duplicate the vertices.

    * For real time rendering with GPUs. Modelling tools do all sorts of wacky stuff with per-face data or storing multiple sets of data in a single vertex per attached face.
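    To make that concrete, here's a rough C# sketch of the kind of vertex duplication involved (illustrative only, not tested against your generation code; the 0.5 threshold for detecting a triangle that crosses the wrap is just a heuristic):
    Code (csharp):
    // Sketch: split vertices along the UV wrap so triangles that cross the seam
    // get their own duplicates with uv.x shifted by +1 (UVs outside 0..1 simply repeat).
    using System.Collections.Generic;
    using UnityEngine;

    public static class SeamUtility
    {
        public static void SplitSeam(ref Vector3[] vertices, ref Vector2[] uvs, int[] triangles)
        {
            var verts = new List<Vector3>(vertices);
            var coords = new List<Vector2>(uvs);
            var duplicated = new Dictionary<int, int>(); // original index -> duplicated index

            for (int t = 0; t < triangles.Length; t += 3)
            {
                float u0 = coords[triangles[t]].x;
                float u1 = coords[triangles[t + 1]].x;
                float u2 = coords[triangles[t + 2]].x;
                // heuristic: a huge u range within one triangle means it crosses the seam
                if (Mathf.Max(u0, u1, u2) - Mathf.Min(u0, u1, u2) > 0.5f)
                {
                    for (int k = 0; k < 3; k++)
                    {
                        int idx = triangles[t + k];
                        if (coords[idx].x < 0.5f) // low side of the seam: repoint to a duplicate
                        {
                            if (!duplicated.TryGetValue(idx, out int dup))
                            {
                                dup = verts.Count;
                                verts.Add(verts[idx]);
                                coords.Add(coords[idx] + new Vector2(1f, 0f)); // e.g. 0.05 -> 1.05
                                duplicated[idx] = dup;
                            }
                            triangles[t + k] = dup;
                        }
                    }
                }
            }
            vertices = verts.ToArray();
            uvs = coords.ToArray();
        }
    }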

    However, that's not actually required for this specific setup. Since your UVs are easily derived from the mesh position alone, you don't even need vertex UVs at all. You can instead do the same Acos and Atan2 spherical-coordinate math you're doing in the C#, but in the shader itself, to compute the UVs from the vertex positions. Now you don't have to worry about the UV seam at all! Indeed, Unity even ships with an equirectangular skybox shader that uses this same technique!

    But, it's not all puppies and rainbows. Technically the seam still exists! Just now it's between two pixels and you get nasty artifacts like this:

    The usually prescribed solution is to disable mipmaps on the texture, which works okay for skyboxes, but not great for things that aren't skyboxes.

    The reason disabling mipmaps works is because of how the GPU calculates the mip level to use. GPUs use screen space partial derivatives of the UVs, or in layman's terms how much the UV changes from one pixel to the next, to determine which mip level to use. This is done on a 2x2 pixel grid (called a pixel quad) across the screen, so if the UVs change from 0.99 to 0.01 between two pixels in one of those quads, it'll think it needs to use basically the smallest mip map available, since it thinks essentially the entire texture is being displayed in just those two pixels.
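    (If you do go the route of disabling mip maps, that's just a texture import setting; a quick editor-script sketch, with a placeholder asset path, would be:)
    Code (csharp):
    // Editor-only sketch: turn off mip maps on a texture via its importer.
    // "Assets/Textures/Equirect.png" is a placeholder path, not something from this thread.
    using UnityEditor;

    public static class DisableMipsExample
    {
        [MenuItem("Tools/Disable Mips On Equirect Texture")]
        static void Run()
        {
            var importer = (TextureImporter)AssetImporter.GetAtPath("Assets/Textures/Equirect.png");
            importer.mipmapEnabled = false;
            importer.SaveAndReimport();
        }
    }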

    But all is not lost, there's one more crazy hack to fix this.
    http://vcg.isti.cnr.it/~tarini/no-seams/

    The idea is to have two UV sets, both aligned the same, but one using a 0.0 to 1.0 range and the other a -0.5 to 0.5 range, so the seam is in a different spot. Look at the derivatives for each, and use the UV set with the smallest derivatives. For spherical coordinates, you don't need to calculate this twice; instead calculate the spherical coordinate UVs once, then do:
    Code (csharp):
    float2 uv2 = frac(uv1 + 0.5) - 0.5;
    That's assuming the original UVs are already in a 0.0 to 1.0 range. When calculating spherical coordinates using atan2, the x is already in a -0.5 to 0.5 range, so really you just need:
    Code (csharp):
    float2 uv2 = frac(uv1);
    The code to select between the two UV sets is in the above link as an image, but I'll write it out again here in a slightly easier-to-read form:
    Code (csharp):
    // the fwidth function is the sum of the absolute vertical & horizontal derivatives,
    // basically how much the value is changing in all directions
    // done in a single line, but ternary ops are done per component, so this still compares x and y separately
    float2 uv = fwidth(uv1) < fwidth(uv2) ? uv1 : uv2;
    The other piece to this puzzle is getting the un-modified vertex positions in the Surface Shader. For that you'll need a basic custom vertex function to pass the vertex position to the surf function. With all those pieces you get something like this:
    Code (csharp):
    Shader "Custom/EquirectangularSphere"
    {
        Properties
        {
            [HideInInspector] _Color ("Color", Color) = (1,1,1,1)
            [NoScaleOffset] _MainTex ("Equirectangular Albedo (RGB)", 2D) = "white" {}
            _Glossiness ("Smoothness", Range(0,1)) = 0.5
        }
        SubShader
        {
            Tags { "RenderType"="Opaque" "DisableBatching"="True" }
            LOD 200

            CGPROGRAM
            #pragma surface surf Standard fullforwardshadows vertex:vert

            #pragma target 3.0

            sampler2D _MainTex;

            struct Input
            {
                float3 vertex;
            };

            half _Glossiness;
            half _Metallic;
            fixed4 _Color;

            void vert (inout appdata_full v, out Input IN)
            {
                IN.vertex = v.vertex;
            }

            void surf (Input IN, inout SurfaceOutputStandard o)
            {
                // -0.5 to 0.5 range
                float phi = atan2(IN.vertex.z, IN.vertex.x) / (UNITY_PI * 2.0);

                // 0.0 to 1.0 range
                float phi_frac = frac(phi);

                // negate the y because acos(-1.0) = PI, acos(1.0) = 0.0
                float theta = acos(-normalize(IN.vertex.xyz).y) / UNITY_PI;

                // construct the uvs, selecting the phi to use based on the derivatives
                float2 uv = float2(
                    fwidth(phi) < fwidth(phi_frac) ? phi : phi_frac,
                    theta // no special stuff needed for theta
                    );

                fixed4 c = tex2D (_MainTex, uv);
                o.Albedo = c.rgb;
                o.Smoothness = _Glossiness;
            }
            ENDCG
        }
        FallBack "Diffuse"
    }

    If you want to keep the shader as cheap as possible, you could instead calculate both UV sets in C# when generating the mesh. That way you're not doing the acos and atan2 for every pixel. The original paper I linked to above, where the fwidth technique is described, applies it to meshes with two UV sets. But the in-shader setup has the advantage of not having any extra distortion at the poles from UV interpolation.
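    For example, baking both UV sets into the mesh might look something like this (a rough sketch, under the assumption that the shader then reads the second set via uv2_MainTex in its Input struct and does the same fwidth comparison):
    Code (csharp):
    // Sketch: store both wrap variants of the UVs in the mesh so the shader only does the fwidth select.
    // mesh.uv gets the 0..1 wrap (seam at phi = 0), mesh.uv2 the -0.5..0.5 wrap (seam at phi = +/-0.5).
    using UnityEngine;

    public static class DualUVBaker
    {
        public static void Bake(Mesh mesh, Vector3[] positions)
        {
            var uv1 = new Vector2[positions.Length];
            var uv2 = new Vector2[positions.Length];
            for (int i = 0; i < positions.Length; i++)
            {
                Vector3 p = positions[i].normalized;
                float phi = Mathf.Atan2(p.z, p.x) / (2f * Mathf.PI); // -0.5 .. 0.5
                float theta = Mathf.Acos(-p.y) / Mathf.PI;           // 0 at the south pole, 1 at the north
                uv1[i] = new Vector2(Mathf.Repeat(phi, 1f), theta);
                uv2[i] = new Vector2(phi, theta);
            }
            mesh.uv = uv1;
            mesh.uv2 = uv2;
        }
    }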

    Or you can change your mesh generation code to add a seam. Or you can convert your textures from equirectangular textures to cube maps and make this even cheaper.
     
    angrypenguin, lilacsky824 and hecubah like this.
  4. hecubah (Joined: Jul 30, 2017 | Posts: 5)

    Thank you for your answer. I looked into it; frankly, because I have so many vertices in the mesh and it is all generated automatically, I wasn't certain whether some of the vertices were doubled in the UVs. Unfortunately, all the checks in place passed, and the bad connection is apparent even when downscaling to 50 samples, where the border vertices are roughly 30° apart.
    fq3.png
     
  5. hecubah (Joined: Jul 30, 2017 | Posts: 5)

    Not only is this an awesome answer, I am also grateful for the exhaustive explanation. I'll do some follow-up reading on the inserted vertex shader, since it's the only part I'm still not completely clear on. This actually taught me some tricks I could use to eliminate some if-then gymnastics in compute shaders. Thank you again and good UTC night!

    fq4.png fq5.png
     
  6. bgolus (Joined: Dec 7, 2012 | Posts: 12,243)

    This "inserted vertex" isn't something done in the shader; it's done when constructing the mesh itself. You have to increase the size of the vertex position & UV arrays and add duplicate vertices along the UV seam. The triangles on either side of the seam do not share vertices!

    Here's an example of a low poly geosphere UV mapped in 3ds Max:
    upload_2020-12-29_15-10-0.png
    The green line shows the seam on the mesh where, once imported into Unity, the vertices on that green line are all duplicated. Actually, the poles of the sphere have 5 vertices each, since each triangle has a unique UV position at that point! You can also see how the UVs themselves are actually outside of the 0.0 to 1.0 range, with some going < 0.0 and one > 1.0. This is so the sampled position of the texture lines up, since values outside of the 0.0 to 1.0 range just repeat. So, for example, that one vertex to the right, outside the 1.0 range, has a value of 1.05. The "matching" UV position on the left side is at 0.05. This gives the illusion of a continuous UV even though it isn't actually continuous.

    However, as mentioned, the UV-based approach has unavoidable distortion, due to the limited number of UV positions even in a higher-resolution mesh, that the per-pixel approach avoids.
    upload_2020-12-29_15-18-17.png
    Notice especially how the UVs pinch at the top, because the actual mesh UVs form a saw tooth, with a large amount of the texture in between being missed entirely. There's still stretching in the per-pixel approach, due to the spherical projection itself, but at least you'll actually get all of the texture that exists at the poles, and you won't have the wiggly "vertical" lines along the sides.
     
  7. bgolus (Joined: Dec 7, 2012 | Posts: 12,243)

    All that said, I'd still recommend looking into using cubemaps instead.
     
  8. hecubah (Joined: Jul 30, 2017 | Posts: 5)

    The duplicated vertices look exactly as if you were constructing a paper model. It makes sense, though, even if it looks like a virtue of necessity. From what I've read about cubemaps (I have yet to experiment with them to actually see), they really are an option, especially if you only need to show a textured object. For this particular project a plain rectangular texture is needed as the map, so I'll save cubemaps for my virtual lab scenes. My previous question was about the vert function in the surface shader you proposed; I wondered why I needed it until it became clear there was no other apparent way to get the vertex position (courtesy of my amateur knowledge of surface shaders).

    I would argue that, given proper conditions, equirectangular (or some other conformal) UV mapping is more straightforward and problem-free. Agreed, a single vertex on a pole means trouble, and practically every sphere I've seen in various 3D studios has these. The Fibonacci sampling in my project literally goes around this problem, as no vertex sits at theta = 0 or theta = pi (barring float precision at maybe 50+ million vertices). Rendering a simple texture (Fig. 1) shows exactly what one would expect (Fig. 2). The inevitable loss of detail could be sufficiently compensated using e.g. Mercator inflation.

    20201230a.png
    Fig. 1: Diagonal line with a constant width, equirectangular projection

    20201230b.png
    Fig. 2: Render on a south pole.​

    Using the shader you suggested, the area around the South Pole (Figures 3 and 4) looks pretty much perfect.

    20201230d.png
    Fig. 3: Texture around South Pole (texture source [4]) - red spot denotes the pole

    20201230j.png
    Fig. 4: Antarctica texture part, outline created using a constructed satellite map [5];
    there is additional distortion of the texture because of the camera FOV and mesh geometry

    The problem with my mesh is that at the scale of 50 vertices (about as many as the icosphere you have shown), the sphere is a bit asymmetrical (Fig. 5). With 100+ vertices, this problem is negligible.

    20201230k.png Fig. 5: Textured sphere, image used from [6]​

    My argument, then, would be that equirectangular projection textures offer the best results for the effort, given the following conditions:
    • No vertex at the concurrence points, such as poles,
    • reasonably detailed texture around the concurrence points,
    • shaders deal with the uv seam in the way you suggested.

    Sources:

    [4] Equirectangular projection. Wikipedia: the free encyclopedia [online]. San Francisco (CA): Wikimedia Foundation, 2001- [cit. 2020-12-30]. Available from: https://en.wikipedia.org/wiki/Equirectangular_projection

    [5] Map of Antarctica and the Southern Ocean. In: Geology.com [online]. 2008- [cit. 2020-12-30]. Available from: https://geology.com/world/antarctica-satellite-image.shtml

    [6] Layout Uv Mapping. In: Boosterviral [online]. 2020 [cit. 2020-12-30]. Available from: https://boosterviral.weebly.com/home/layout-uv-mapping

    Editorial: Corrected two badly phrased statements.
     
    Last edited: Dec 30, 2020
  9. bgolus (Joined: Dec 7, 2012 | Posts: 12,243)

    The other benefit of equirectangular maps is that they're much more "portable", since they're just a single rectangular texture and a common projection format that a lot of applications can output or ingest.

    The main issue with equirectangular textures is how much texture space is used for areas around the poles vs. around the equator. The middle 1/3rd of the texture covers about half of a sphere's surface area (a spherical zone's area is 2*pi*R*h, and the band between ±30° latitude spans half the sphere's total height), which leaves the other 2/3rds of the texture for the remaining half of the surface. So the texel density piles up around the poles.

    Cube maps benefit from being able to have much more consistent texel density. The middle "1/3rd" of the equirectangular texture is essentially unchanged in terms of texel density when converting to a cube map (though it has a slightly different distortion); it's mainly the poles that are remapped so that they more correctly only use 1/6th of the total texture area each. Unity can automatically convert imported equirectangular maps to cube maps, but you do ideally want to use a much higher resolution equirectangular map than the output cube map to prevent too much loss of detail around the equator, and you will lose some detail on the poles even at a more "1:1" texel resolution (cube map resolution defines the per-face dimensions, so you ideally want the width of the equirectangular texture to be at least 6x the cube map resolution, preferably closer to 8x).
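    If anyone wants to do that conversion procedurally rather than through the importer, here's a rough sketch (point/bilinear sampled for brevity; it assumes the source texture is marked readable, and the per-face orientation below may need flipping to exactly match Unity's cubemap conventions):
    Code (csharp):
    // Rough sketch: build a Cubemap from an equirectangular Texture2D by sampling
    // the equirect map along each face direction.
    using UnityEngine;

    public static class EquirectToCubemap
    {
        public static Cubemap Convert(Texture2D equirect, int faceSize)
        {
            var cube = new Cubemap(faceSize, TextureFormat.RGBA32, false);
            for (int f = 0; f < 6; f++)
            {
                var face = (CubemapFace)f;
                for (int y = 0; y < faceSize; y++)
                for (int x = 0; x < faceSize; x++)
                {
                    // face pixel -> direction on the unit sphere
                    float u = 2f * (x + 0.5f) / faceSize - 1f;
                    float v = 2f * (y + 0.5f) / faceSize - 1f;
                    Vector3 dir = FaceDirection(face, u, v).normalized;

                    // direction -> equirectangular UV (same mapping as the shaders above)
                    float phi = Mathf.Atan2(dir.z, dir.x) / (2f * Mathf.PI);
                    float theta = Mathf.Acos(-dir.y) / Mathf.PI;
                    Color c = equirect.GetPixelBilinear(Mathf.Repeat(phi, 1f), theta);
                    cube.SetPixel(face, x, y, c);
                }
            }
            cube.Apply();
            return cube;
        }

        // approximate face-to-direction mapping; orientation may need adjusting per face
        static Vector3 FaceDirection(CubemapFace face, float u, float v)
        {
            switch (face)
            {
                case CubemapFace.PositiveX: return new Vector3( 1f, -v, -u);
                case CubemapFace.NegativeX: return new Vector3(-1f, -v,  u);
                case CubemapFace.PositiveY: return new Vector3( u,  1f,  v);
                case CubemapFace.NegativeY: return new Vector3( u, -1f, -v);
                case CubemapFace.PositiveZ: return new Vector3( u, -v,  1f);
                default:                    return new Vector3(-u, -v, -1f);
            }
        }
    }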
     
  10. bgolus (Joined: Dec 7, 2012 | Posts: 12,243)

    Also, yep. This is a problem with really any low poly sphere. If you really do just want an exact sphere and don't intend to do any kind of surface deformation, you could skip the mesh generation entirely and just raytrace a sphere on a quad.
     
  11. bgolus (Joined: Dec 7, 2012 | Posts: 12,243)

    One of these spheres is a default Unity sphere mesh using the equirectangular shader above (with specular disabled); the other two are camera-facing quads, one using the same equirectangular texture as the mesh, the other a cubemap.
    upload_2020-12-30_21-7-35.png
    The middle one is the mesh. The quad on the right is using a cube map.
    upload_2020-12-30_21-9-56.png

    And they can intersect with other objects just fine.
    upload_2020-12-30_21-13-20.png

    The one limitation is they don't cast or receive shadows. They could, but that gets a lot harder. I'm also not using a Surface Shader because you can't modify the fragment output depth which is how I'm making it intersect properly.
    Code (CSharp):
    Shader "Custom/Quad Raytraced Sphere"
    {
        Properties
        {
            [NoScaleOffset] _MainTex ("Equirectangular Texture", 2D) = "white" {}
            [NoScaleOffset] _CubeTex ("Cubemap Texture", Cube) = "white" {}
            [KeywordEnum(Equirectangular, Cubemap)] _Mapping ("UV Mapping", Float) = 0.0
        }
        SubShader
        {
            Tags { "RenderType"="Opaque" }
            LOD 100

            Pass
            {
                Tags { "LightMode"="ForwardBase" }

                Cull Off

                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag

                #pragma multi_compile_fwdbase nolightmap nodirlightmap nodynlightmap novertexlight
                #pragma shader_feature _ _MAPPING_CUBEMAP

                #include "UnityCG.cginc"

                sampler2D _MainTex;
                samplerCUBE _CubeTex;

                half3 _LightColor0;

                struct appdata
                {
                    float4 vertex : POSITION;
                };

                struct v2f
                {
                    float4 pos : SV_POSITION;
                    float3 ray : TEXCOORD0;
                };

                // https://www.iquilezles.org/www/articles/spherefunctions/spherefunctions.htm
                float sphIntersect( float3 ro, float3 rd, float4 sph )
                {
                    float3 oc = ro - sph.xyz;
                    float b = dot( oc, rd );
                    float c = dot( oc, oc ) - sph.w*sph.w;
                    float h = b*b - c;
                    if( h<0.0 ) return -1.0;
                    h = sqrt( h );
                    return -b - h;
                }

                v2f vert (appdata v)
                {
                    v2f o;

                    // viewer position, equivalent to _WorldSpaceCameraPos.xyz
                    float3 worldSpaceViewerPos = unity_MatrixInvV._m03_m13_m23;

                    // calculate a camera facing rotation matrix
                    float3 worldSpaceObjectPos = unity_ObjectToWorld._m03_m13_m23;
                    float3 viewOffset = worldSpaceViewerPos - worldSpaceObjectPos;
                    float3 forward = normalize(viewOffset);
                    float3 right = normalize(cross(forward, float3(0,1,0)));
                    float3 up = cross(right, forward);
                    float3x3 rotMat = float3x3(right, up, forward);

                    // get the max object scale
                    float3 scale = float3(
                        length(unity_ObjectToWorld._m00_m10_m20),
                        length(unity_ObjectToWorld._m01_m11_m21),
                        length(unity_ObjectToWorld._m02_m12_m22)
                    );
                    float maxScale = max(abs(scale.x), max(abs(scale.y), abs(scale.z)));

                    // use the max scale to figure out how big the quad needs to be to cover the entire sphere
                    // we're using a hardcoded object space radius of 0.5 in the fragment shader
                    float maxRadius = maxScale * 0.5;

                    // find the radius of a cone that contains the sphere with the point at the camera and the base at the pivot of the sphere
                    // this means the quad is always scaled to perfectly cover only the area the sphere is visible within
                    float sinAngle = maxRadius / length(viewOffset);
                    float cosAngle = sqrt(1.0 - sinAngle * sinAngle);
                    float tanAngle = sinAngle / cosAngle;
                    float quadScale = tanAngle * length(viewOffset) * 2.0;

                    // calculate world space position for the camera facing quad
                    float3 worldPos = mul(v.vertex.xyz * quadScale, rotMat) + worldSpaceObjectPos;

                    // calculate and output object space view ray direction for interpolation
                    float3 worldSpaceRayDir = worldPos - worldSpaceViewerPos;
                    o.ray = mul(unity_WorldToObject, float4(worldSpaceRayDir, 0.0));

                    // we calculate the ray origin in the fragment because it's just the camera position

                    o.pos = UnityWorldToClipPos(worldPos);

                    return o;
                }

                fixed4 frag (v2f i, out float depth : SV_Depth) : SV_Target
                {
                    // viewer position, equivalent to _WorldSpaceCameraPos.xyz
                    float3 worldSpaceViewerPos = unity_MatrixInvV._m03_m13_m23;

                    // ray origin
                    float3 objectSpaceCameraPos = mul(unity_WorldToObject, float4(worldSpaceViewerPos, 1.0)).xyz;

                    // normalize ray vector
                    i.ray = normalize(i.ray);

                    // ray sphere intersection
                    float rayHit = sphIntersect(objectSpaceCameraPos, i.ray, float4(0,0,0,0.5));

                    // above function returns -1 if there's no intersection
                    clip(rayHit);

                    // cheap way to reduce mip map artifacts on edge
                    rayHit = rayHit < 0.0 ? dot(i.ray, -objectSpaceCameraPos) : rayHit;

                    // calculate object space position from ray, front hit ray length, and ray origin
                    float3 surfacePos = i.ray * rayHit + objectSpaceCameraPos;

                    // object space surface normal
                    float3 normal = normalize(surfacePos);

                #if defined(_MAPPING_CUBEMAP)
                    // cubemap uvw
                    float3 uvw = surfacePos;
                    // swizzle & invert cubemap UVW so it matches equirectangular
                    uvw.xz = -uvw.zx;

                    // sample cube map
                    fixed4 col = texCUBE(_CubeTex, uvw);
                #else
                    // -0.5 to 0.5 range
                    float phi = atan2(normal.z, normal.x) / (UNITY_PI * 2.0);

                    // 0.0 to 1.0 range
                    float phi_frac = frac(phi);

                    // negate the y because acos(-1.0) = PI, acos(1.0) = 0.0
                    float theta = acos(-normal.y) / UNITY_PI;

                    // construct the uvs, selecting the phi to use based on the derivatives
                    float2 uv = float2(
                        fwidth(phi) < fwidth(phi_frac) ? phi : phi_frac,
                        theta
                        );

                    // sample the equirectangular texture
                    fixed4 col = tex2D (_MainTex, uv);
                #endif

                    // basic lighting
                    half3 worldNormal = UnityObjectToWorldNormal(normal);
                    half ndotl = saturate(dot(worldNormal, _WorldSpaceLightPos0.xyz));
                    half3 ambient = ShadeSH9(float4(worldNormal, 1));
                    col.rgb *= _LightColor0 * ndotl + ambient;

                    // output modified depth
                    float4 clipPos = UnityObjectToClipPos(float4(surfacePos, 1.0));
                    depth = clipPos.z / clipPos.w;

                    return col;
                }
                ENDCG
            }
        }
    }

    One last thing I noticed, another issue with equirectangular mapping: at the poles, because the UVs collapse to an infinitely small point, the mip map issue remains. If you look carefully, there's a small dot on the pole that isn't there on the cubemap-based (leftmost) sphere.
    upload_2020-12-30_22-45-40.png
     
  12. hecubah (Joined: Jul 30, 2017 | Posts: 5)

    Happy New Year! I must admit I had to read up and experiment with the cases you have shown here, especially mip maps and creating my own cubemaps. My argument fails exactly at the poles: the spots you mentioned show up even without mipmapping, so to get a clean textured surface, cubemaps really seem the best way. I was afraid that the effort to create cubemaps procedurally wouldn't actually be worth it, but it is surprisingly easy. I also did some calculations of the ratio of texture information to surface area for both the equirectangular and cubemap projections, and for a larger number of textured spheres, equirectangular textures use too much memory (relatively).

    If it would be of interest to anybody, I could add my derivations to this thread, as I could not find any compact document on the subject.
     
    bgolus likes this.