
Trying to understand why Mathf.PerlinNoise causes artefacts when the seed offset is too large

Discussion in 'Scripting' started by chmodseven, Apr 23, 2019.

  1. chmodseven

    chmodseven

    Joined:
    Jul 20, 2012
    Posts:
    120
    Hi, I've been generating terrain heightmaps using Mathf.PerlinNoise and have added a seed offset, which seems to work fine up to a point. Once the offset goes beyond 65535, I start seeing ridging artefacts in the generated terrain, and if it gets sufficiently large then the entire terrain just flattens out.

    Here's a terrain with seed 65535, which still looks fine:

    [screenshot: terrain generated with seed 65535]

    And here's the next seed over, at 65536, showing the ridges:

    [screenshot: terrain generated with seed 65536, with ridging artefacts]

    Here's the code being used. It assumes a square heightmap of the resolution passed in; the seed is hardcoded for now to reproduce the test case, and the float array is just fed into a regular terrain heightmap elsewhere. (Note: the texel scalar is there so that if the same-sized region is generated at a different resolution, it still looks the same, only more or less detailed. I don't believe it's implicated in the max-seed-size issue, though; the problem seems to be the seed offset pushing the overall x/z values beyond the range of a 16-bit value.)

    Code (CSharp):
    public float[,] GenerateHeightmap(int heightmapResolution)
    {
        int seedOffset = 65536;
        float heightmapTexelScalar = 1f / (heightmapResolution - 1);
        float[,] heights = new float[heightmapResolution, heightmapResolution];

        for (int x = 0; x < heightmapResolution; x++)
        {
            for (int z = 0; z < heightmapResolution; z++)
            {
                heights[x, z] = Mathf.PerlinNoise(seedOffset + x * heightmapTexelScalar, seedOffset + z * heightmapTexelScalar);
            }
        }

        return heights;
    }
    Any help understanding what's going on here, or pointers to any Mathf.PerlinNoise limits that I haven't been able to find documented anywhere, would be appreciated!
     
  2. kdgalla

    kdgalla

    Joined:
    Mar 15, 2013
    Posts:
    4,639
    I suppose it's no coincidence that 65535 is the maximum value you can store in an unsigned 16-bit int. Could it be some sort of overflow problem? What is your heightmap resolution?
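
    One quick way to test the overflow idea (just a sketch, and pure guesswork about the internals): compare samples taken near the origin with the same samples shifted by 65536. If the inputs really wrapped around like a ushort, both sets should come back identical.

    Code (CSharp):
    // Quick test of the 16-bit wrap idea (speculation, not documented behaviour):
    for (int x = 0; x < 4; x++)
    {
        float t = x / 256f;
        float nearOrigin = Mathf.PerlinNoise(t, t);
        float pastLimit = Mathf.PerlinNoise(65536f + t, 65536f + t);
        // If a 16-bit wrap were the whole story, nearOrigin and pastLimit would match.
        Debug.Log($"t = {t}: origin = {nearOrigin:F4}, offset = {pastLimit:F4}");
    }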
     
  3. lordofduct

    lordofduct

    Joined:
    Oct 3, 2011
    Posts:
    8,539
    I don't know off the top of my head; I don't know Unity's specific implementation or anything.

    But 65536 is 2^16. That's a 2-byte value, which makes me bet it has something to do with it. There is probably some sort of significant-value cut-off going on that's causing the issue.

    Looking at general implementations out there, like the one on the wiki:
    https://en.wikipedia.org/wiki/Perlin_noise

    They all seem to use 2D lookup tables. If you had a packed 32-bit table, that would be a 16-bit by 16-bit 2D grid, which makes me think that might be where the 65536 is coming from.
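
    Roughly what that sort of packing would look like, if that were what's going on internally (pure speculation on my part, since Unity's implementation isn't public):

    Code (CSharp):
    // Illustration of the packed-table guess above - not Unity's actual code.
    // Two 16-bit coordinates packed into one 32-bit index can only hold
    // 0..65535 per axis; anything at or beyond 65536 no longer fits.
    static uint PackCoords(uint x, uint y)
    {
        return (x & 0xFFFF) | ((y & 0xFFFF) << 16);
    }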
     
  4. chmodseven

    chmodseven

    Joined:
    Jul 20, 2012
    Posts:
    120
    In the examples posted above, I had 257 as the heightmap resolution for testing.

    However, I've seen the same issue occur with smaller seeds when using a larger resolution. For example, if the resolution is 4097 then the limit seems to be seed 4095, with the glitch starting at seed 4096. So there seems to be a correlation of about 3 bytes (24 bits) total if you multiply these out as res * seed (e.g. 4096 * 4096 is the same as 256 * 65536). I don't know if that means anything internally or not.
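
    For what it's worth, 256 * 65536 is 2^24, and a 32-bit float only has 24 bits of mantissa, so my current guess is that the sample coordinates themselves are losing precision before they ever reach PerlinNoise (just a guess, I haven't seen Unity's implementation). Here's a quick sketch to check that, without generating any terrain:

    Code (CSharp):
    // Sketch testing the float-precision guess: are consecutive sample
    // coordinates even distinct floats at this offset?
    int seedOffset = 65536;          // try 65535 as well for comparison
    float texelScalar = 1f / 256f;   // same step as a 257-resolution heightmap
    for (int x = 0; x < 5; x++)
    {
        float coord = seedOffset + x * texelScalar;
        // With offset 65535 every x prints a different coordinate; with 65536,
        // adjacent x values collapse onto the same float, which would explain the
        // ridging, and a completely flat terrain once the offset gets huge.
        Debug.Log($"x = {x}, coord = {coord:R}");
    }

    If that is what's happening, keeping the offset small (or folding the seed down into a modest range) should avoid it.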