
SEGI (Fully Dynamic Global Illumination)

Discussion in 'Assets and Asset Store' started by sonicether, Jun 10, 2016.

  1. LennartJohansen

    LennartJohansen

    Joined:
    Dec 1, 2014
    Posts:
    2,394
    For the added effect, that sounds good. 3 ms should not be a problem for most non-VR titles.
     
    RB_lashman likes this.
  2. gian-reto-alig

    gian-reto-alig

    Joined:
    Apr 30, 2013
    Posts:
    756
    Some questions before I throw my money in your direction:

    - Performance vs. Enlighten? Better, worse? By how much? (I guess not needing SSAO helps here)
    - Indirect shadows are mentioned. What about direct shadows from the lights it currently supports (one directional light, as far as I understand it)?
    - Does it play nicely with assets like Time of Day? Could I adapt Time of Day to SEGI (Time of Day uses a reflection probe and updates the skybox colors)?
    - Any ETA for support of more lights besides a single directional light?
    - What about support for big objects (like large terrains)? Currently a huge pain in the A** with Enlighten. How does SEGI handle that? Does it need to pre-voxelize the whole visible area, or is it just a set area around the camera?


    Looking forward to being able to kick that trainwreck called Enlighten into the trashbin :)
     
  3. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Well, as far as reading the thread goes:
    - Enlighten is faster, but also not real-time (i.e. it precomputes light paths on static geometry) and takes a long time to bake. Enlighten is real-time and therefore handles dynamic objects, but it's costly on a per-frame basis.
    - Technically it supports infinite lights, and emissives are taken into account; the single-directional-light setup was a demo made by a user as a test.
    - Large spaces are not yet handled; it needs cascaded mipmaps, something Unity doesn't currently have but is coming (or Unity just got it and SEGI hasn't made use of it yet).
     
  4. DivergenceOnline

    DivergenceOnline

    Joined:
    Apr 19, 2015
    Posts:
    244
    Hmmmm, OK, so the reflections I see are not SSR, then, but (my guess) come from having "reflections" turned on for SEGI, which just breaks the whole system for me, unfortunately.
     
  5. scheichs

    scheichs

    Joined:
    Sep 7, 2013
    Posts:
    77
    SEGI needs no prebake (no precalculated data on disk) and works excellently with dynamic geometry, AND you get area lights and REAL reflections (lower quality).
    The drawback is that it is more costly than Enlighten at runtime (but Enlighten only works on static geometry), and the region receiving GI is not THAT large at the moment... see the roadmap.
     
    blueivy and RB_lashman like this.
  6. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    I MADE A MISTAKE: I said Enlighten twice when I meant SEGI the second time. Thanks for the correction lol :p
     
    RB_lashman likes this.
  7. sonicether

    sonicether

    Joined:
    Jan 12, 2013
    Posts:
    265
    Yeah, that's definitely a possibility! Check out the public variable updateGI in SEGI.cs. When this is false, voxelization does not occur. So, using your own scripts, you can decide when voxelization is performed or not.

    1. Enlighten is really, really fast once it's been precomputed, so SEGI is definitely slower than Enlighten at runtime.
    2. I'm not sure what you mean, but directional light injection is done with proper shadows, if that's what you're wondering.
    3. Check out the public variable skyColor in SEGI.cs. You can set this from other scripts if you have some other way of determining what the skylight color should be based on the current time of day in the scene (see the sketch after this list).
    4. The core algorithm still needs a lot of work, so it'll probably be a while of optimizing and improving what's already there before I consider adding new features like indirect lighting from point and spot lights.
    5. Like others have said, scene scale is still quite limited. There is a volume inside which everything is voxelized for GI, and by default it follows the camera that SEGI is attached to. You can adjust the size of the volume, but don't expect to be able to crank it up and still retain close-proximity detail in GI. Voxel cascades should help this dramatically, and my initial work on it seems promising.
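
    To make the updateGI and skyColor hooks above concrete, here is a minimal bridge-script sketch. Only those two public SEGI fields come from this post; the gradient, interval, and class name are hypothetical stand-ins for your own time-of-day logic.
    Code (CSharp):
    using UnityEngine;

    // Minimal sketch: drives SEGI from another script. Only SEGI.updateGI and
    // SEGI.skyColor are taken from the post above; everything else is a placeholder.
    public class SEGITimeOfDayBridge : MonoBehaviour
    {
        public SEGI segi;                        // the SEGI component on your camera
        public Gradient skyColorOverDay;         // hypothetical: sampled by normalized time of day
        [Range(0f, 1f)] public float timeOfDay;  // 0 = midnight, 0.5 = noon (set by your own system)
        public float voxelizeInterval = 0.5f;    // allow voxelization at most this often (seconds)

        float timer;

        void Update()
        {
            // Feed SEGI the sky color for the current time of day.
            segi.skyColor = skyColorOverDay.Evaluate(timeOfDay);

            // Let voxelization run for one frame every voxelizeInterval seconds.
            timer += Time.deltaTime;
            segi.updateGI = timer >= voxelizeInterval;
            if (segi.updateGI)
                timer = 0f;
        }
    }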
     
  8. DivergenceOnline

    DivergenceOnline

    Joined:
    Apr 19, 2015
    Posts:
    244
    So what do you plan to do about SEGI voxelizing all grass and trees?
     
  9. buttmatrix

    buttmatrix

    Joined:
    Mar 23, 2015
    Posts:
    609
    Definitely an interesting asset. How does one address the distance culling issues seen in the images below?
     

    Attached Files:

  10. eskovas

    eskovas

    Joined:
    Dec 2, 2009
    Posts:
    1,373
    That looks like Unity's normal shadow distance and not SEGI. At least that's what it looks like. SEGI doesn't do direct shadows, AFAIK.
     
    buttmatrix, Vadya_Rus and RB_lashman like this.
  11. LennartJohansen

    LennartJohansen

    Joined:
    Dec 1, 2014
    Posts:
    2,394
    How is this implemented? Is it a full-screen effect added to the camera, or do I have to switch to custom shaders on my meshes?
     
  12. lazygunn

    lazygunn

    Joined:
    Jul 24, 2011
    Posts:
    2,749
    Full-screen effect. I've tried a good variety of shader assets with it and it works fine. If you are making a stylized kind of game, super arty and stuff, this can run perfectly well, even in VR. I was messing around with the PolyWorld demos last night just to see how this went, and the results were very good after some tweaking, keeping above 60 fps on a 970 in VR anyway.
     
    buttmatrix, elbows and RB_lashman like this.
  13. lazygunn

    lazygunn

    Joined:
    Jul 24, 2011
    Posts:
    2,749
    Also, seriously, yes, it's reasonable to consider using proxies for the GI. The ProBuilder video gave a great sense of how you might use this effect, even early on, since in ProBuilder you have absolute control of your GI complexity and can mess around with it without leaving the editor. Keep your GI geometry very low-poly and you'll be good, I think. Unless you really are wanting to see the reflection of a leaf against a hand in that fancy game.
     
    RB_lashman and blueivy like this.
  14. LennartJohansen

    LennartJohansen

    Joined:
    Dec 1, 2014
    Posts:
    2,394
    Will buy a copy and do some tests
     
    RB_lashman likes this.
  15. DivergenceOnline

    DivergenceOnline

    Joined:
    Apr 19, 2015
    Posts:
    244
    The probuilder video was great to show that SEGI works for game scenes 10m in size and made out of primitives, just like every other video pushed out for it over the last year and a half.
     
    macdude2 likes this.
  16. makeshiftwings

    makeshiftwings

    Joined:
    May 28, 2011
    Posts:
    3,350
    If my game itself already has a voxel system is there any way to use that existing data rather than have SEGI revoxelize all the meshes? I'd guess this isn't something that can be easily plug and play but maybe if I dig into the source code? I have buildable chunks of the game where I already convert a bunch of voxel data into a chunk mesh on chunk rebuilds; it would be cool if I could just pass that data to SEGI somehow rather than having it do the same sort of thing immediately afterwards.
     
    RB_lashman likes this.
  17. DivergenceOnline

    DivergenceOnline

    Joined:
    Apr 19, 2015
    Posts:
    244
    SEGI isn't a "voxel system" and it doesn't "voxelize meshes".
    Unless SE wants to correct me on this, I think you've misunderstood the product.
     
  18. makeshiftwings

    makeshiftwings

    Joined:
    May 28, 2011
    Posts:
    3,350
    You're complaining about how it voxelizes the tree meshes like three posts up. :p

    As I understand it, it voxelizes the mesh data in the bounding volume to determine what to light; I'm asking if I can just give it that voxel data myself rather than doing voxel -> mesh -> voxel every frame.
     
    RB_lashman likes this.
  19. DivergenceOnline

    DivergenceOnline

    Joined:
    Apr 19, 2015
    Posts:
    244
    I see what you mean.
    I didn't mean "turns things into voxels". It creates its own voxels using game scene geometry as a guide. Imagine filling a tub with golf balls shaped like cones. The golf balls are SEGI voxels. When you put a hockey-stick into the tub, SEGI doesn't turn the hockey stick into voxels, it creates its voxels around the exterior of it in the game scene. Whereas in "voxel systems" for Unity, the terrain or objects themselves are "made out of voxels".

    You're talking about two things: providing the data to voxelize yourself, and preventing it from voxelizing every frame.
    The first part is already what they're doing now: providing proxy data for voxelizing. There's an area of the guide with that info. The second part is preventing it from doing this every frame. This is something he's already doing, so it's probably not worth your time unless you're super desperate for it right now. Trust me, I've already raised 99% of the legal limit of hell about it; he knows it's super important.

    If you're instead talking about taking your existing voxel data and converting it into SEGI voxel data to prevent SEGI from doing the lifting, man, that sounds like a massive amount of work that I really don't think is worth it for you. This is why I said SEGI wasn't a "voxel system", because that's the same as saying both our games are "pixel systems". Can you convert one type of pixel into another? Probably? But there are several "voxel systems" available already, obviously including whichever one you use, and their "voxel data" is vastly different even between themselves. Thus, unless he says something to the contrary, it's reasonable to assume they're completely alien formats to one another, and by the time you got halfway there, SEGI performance would be under control and usable as-is.
     
  20. makeshiftwings

    makeshiftwings

    Joined:
    May 28, 2011
    Posts:
    3,350
    That makes sense. I guess it depends on how complicated the SEGI voxel system is, and I haven't really looked at it. If it's just a 3D array with a bit for "solid" or "empty", then I could generate that easily from my existing voxel data instead of having SEGI calculate it based on raycasts or whatever it's doing. But if the SEGI voxels contain color or texture data then it's probably too much work to port.
     
    RB_lashman likes this.
  21. chiapet1021

    chiapet1021

    Joined:
    Jun 5, 2013
    Posts:
    605
    Directional voxel data to prevent light leaking (which is pretty obvious in some portions of the current demos) is on SE's roadmap. I don't claim to know what that means exactly (although I have a couple ideas), but I'm assuming that will be something that makes it even less likely that SEGI can play well with an existing voxel system.

    Even if it were possible, I would think (and hope) it would come after most/all of the current roadmap items. Accepting another system's voxel data seems very niche, whereas there's a lot of baseline functionality not yet implemented that is applicable to a much wider customer base.
     
    blueivy and RB_lashman like this.
  22. makeshiftwings

    makeshiftwings

    Joined:
    May 28, 2011
    Posts:
    3,350
    Yeah, I don't expect or want him to develop some sort of plug and play voxel API or anything. I was just wondering if I could get it working on my own by modifying the source code. I haven't really looked at what the voxel data is... I was imagining that it was just whether or not a space was "empty" and thus propagating light. But if it's doing something more complicated like calculating the light at each voxel and storing it, then yeah, it's probably pointless to mess with it. ;)
     
    RB_lashman likes this.
  23. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
    Maybe we need an additional voxel API to let users extend SEGI as they see fit?
     
    RB_lashman likes this.
  24. chiapet1021

    chiapet1021

    Joined:
    Jun 5, 2013
    Posts:
    605
    Let's let Cody clear out that long list of to-do's first, please.
     
    blueivy and RB_lashman like this.
  25. nipoco

    nipoco

    Joined:
    Sep 1, 2011
    Posts:
    2,008
    Any chance we'll see a demo version with a watermark? So we can try it out with our own projects.

    I tried the standalone demos and they always run above 60 fps on my GTX 970. But I'm wondering how well this works with moving characters? I assume you need quite a high-resolution voxel grid to get characters properly lit?
     
  26. DivergenceOnline

    DivergenceOnline

    Joined:
    Apr 19, 2015
    Posts:
    244
    It's always better to read the posts in the thread before you pose a question, in case it's already answered.
    Your question is exactly what's been discussed and debated this whole time, more or less.
     
  27. nipoco

    nipoco

    Joined:
    Sep 1, 2011
    Posts:
    2,008
    Oh really? I'm sorry. Unfortunately, I don't have much time to read everything here, since I'm quite busy with work.
    I couldn't find much valuable information about skinned character support, nor an official word from @sonicether about a possibly watermarked demo version.
    Maybe you can point me to these posts.
     
    Acissathar likes this.
  28. neoshaman

    neoshaman

    Joined:
    Feb 11, 2011
    Posts:
    6,493
     
    nipoco and RB_lashman like this.
  29. sonicether

    sonicether

    Joined:
    Jan 12, 2013
    Posts:
    265
    Your shadow distance is too low, increase the shadow distance in the project quality settings.

    You can use any shader that works with the deferred pipeline (the Standard shader for instance). And yes, it's an image effect added to the camera that uses the deferred gbuffers to add the GI.

    We would have to find an efficient way of building a 3D render texture from data on the CPU that's faster than rasterization. There's a lot to be worked on with SEGI before I can look into this further.
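
    To illustrate the kind of CPU-to-GPU path being discussed (purely a sketch, not SEGI's internal voxel format or API), the brute-force version would be filling a Texture3D from a CPU-side array and uploading it whenever the data changes, which is the step that tends to be slower than letting the GPU rasterize the scene:
    Code (CSharp):
    using UnityEngine;

    // Rough sketch only: the brute-force way to push CPU voxel data into a 3D texture.
    // This is NOT SEGI's internal format or API; it just illustrates the CPU -> GPU
    // upload path mentioned above. Names and resolution are hypothetical.
    public class CpuVoxelUpload : MonoBehaviour
    {
        public int resolution = 64;     // hypothetical voxel grid resolution
        Texture3D voxelTexture;
        Color32[] voxelColors;          // flat array indexed as x + y*res + z*res*res

        void Start()
        {
            voxelTexture = new Texture3D(resolution, resolution, resolution, TextureFormat.RGBA32, false);
            voxelColors = new Color32[resolution * resolution * resolution];
        }

        // Call this whenever your game's voxel data changes.
        public void UploadVoxels()
        {
            // Fill voxelColors from your own voxel data here
            // (e.g. alpha = occupancy, rgb = albedo), then upload the whole volume:
            voxelTexture.SetPixels32(voxelColors);
            voxelTexture.Apply(false);
        }
    }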

    I would advise against having characters voxelized (you can choose what's voxelized and what's not with a culling mask in SEGI's settings), though that doesn't mean that they won't receive GI, it just means that they won't contribute to indirect lighting. In the future, I'd like to look into a separate approximate solution for character indirect shadows, but for now something like SSAO might be enough to ground the characters in the environment.
     
    nipoco, RB_lashman and eskovas like this.
  30. nipoco

    nipoco

    Joined:
    Sep 1, 2011
    Posts:
    2,008
    @neoshaman

    Yeah, thanks! I already saw that. Unfortunately, it doesn't give me much information on how the GI contributes to moving characters. Looks a bit too dark.

    Thanks a bunch! Exactly the info I was looking for.
    It would be really great if you could get a separate GI approximation for characters working.
    I know most people here are interested in VR and large terrain support. Personally, I'm more interested in characters with self-indirect lighting.
    But having a character receiving GI like Unity's light probes is fine enough for now.
     
    Last edited: Jul 4, 2016
  31. DivergenceOnline

    DivergenceOnline

    Joined:
    Apr 19, 2015
    Posts:
    244
    SE makes SESSAO, which has color bleed, and that's probably exactly what you're looking for, because it looks almost like GI and you can apply it to your characters. There's an ongoing bug with it, though, if you use detail objects attached to your terrain, so if you can't use SESSAO, the HBAO one works just as well at approximating the same result.
     
  32. sonicether

    sonicether

    Joined:
    Jan 12, 2013
    Posts:
    265
    Once SEGI has screen-space tracing, you'll be able to use that. The great thing is that it'll be completely contextual based on the indirect lighting instead of a fake general darkening of corners.
     
    hopeful and RB_lashman like this.
  33. mgear

    mgear

    Joined:
    Aug 3, 2010
    Posts:
    9,411
    Just a random thought, not knowing how this magic actually works, but could SEGI be used for baking eventually?
    (Since it's so fast, could you just define areas that the player can enter, let a 360° camera fly around them, and bake the data in at ultra quality?)
     
  34. sonicether

    sonicether

    Joined:
    Jan 12, 2013
    Posts:
    265
    Well, I was waiting to reply to questions until the forum rollback was done, and I suppose it's a good thing I did. Though, it's a shame those posts were lost. If you have any questions that I didn't get a chance to answer, let me know.

    @mgear Baking GI is in the development roadmap.
     
    N00MKRAD and mgear like this.
  35. Mauri

    Mauri

    Joined:
    Dec 9, 2010
    Posts:
    2,664
    Since the forums have been reverted and data has been lost (as warned beforehand), here's scheichs's helper script once again:

    Code (CSharp):
    using UnityEngine;
    using System.Collections;

    public class SEGIHelper : MonoBehaviour {
        public float updateIntervall = 1;
        float nextUpdate = 1;
        public SEGI segi;
        public bool warmUp = true;

        // Update is called once per frame
        void Update () {
            if (segi.updateGI && !warmUp)
                segi.updateGI = false;
            nextUpdate -= Time.deltaTime;
            if (nextUpdate < 0)
            {
                segi.updateGI = true;
                nextUpdate = updateIntervall;
                warmUp = false;
            }
        }

        void OnDisable()
        {
            segi.updateGI = true;
        }
    }
    (which I originally posted on Pastebin)


    Edit: There's an updated version, posted by @VertexSoup, here.
     
    Last edited: Jul 13, 2016
  36. eskovas

    eskovas

    Joined:
    Dec 2, 2009
    Posts:
    1,373
    Re-posting here two things I posted on the old forums :)
    First, a couple of images showing SEGI in action, and second, a simple technique for generating a proxy terrain and proxy trees from your terrain that can be used with SEGI. (It should not be used as-is; you should improve it before using it if you do.)

    --

    Here are a couple of images using SEGI with wet surfaces in a small section of the new map of my game :D
    Awesome quality and lightning-fast results (in the editor and on my PC, at least), and best of all, no waiting several minutes for the pre-computation to finish.

    Effects used here: SEGI, SCION, and SMAA. Nothing more.

    SEGI4.png
    SEGI.jpg

    ---

    Here's the script that generates the proxy terrain and proxy trees that can be used with SEGI for faster performance. (Improve it for your needs before using it.)

    This took me a couple of hours to implement, so it's not perfect, and the resulting mesh needs to have fewer than 65k vertices (Unity's per-mesh vertex limit), or it won't work. Implementing multiple meshes per terrain is not hard, but I can't spend much more time on this.

    I calculated the color per texture by averaging all the pixel values of each texture, but somehow the result is a little weird, so I added a per-layer intensity for that. You'll have to fix that. Since Unity doesn't ship a simple vertex-color shader, I include a very simple one here. It doesn't support a few things, so you need to add them yourself. And if you want textures instead of vertex colors, it's not that hard to implement. This was just simpler to do, to demonstrate that it works :)

    So, here is an image: the left side is the normal terrain and the right side is the proxy terrain and trees generated on startup. The proxy terrain and proxy trees are separate objects and can be used for the GI. On the left side, the trees belong to the terrain (using the terrain tree system) and don't need to be used for the GI.
    proxyTerrain.jpg

    The setup is just to attach the script to the terrain; in its Start function it will generate the proxy mesh. If you want, you could also turn it into an Editor script if that's more convenient. Here are the settings:
    proxyTerrain_Settings.jpg

    In the settings, you set the Distance Per Vertex, which is the distance between vertices of the proxy mesh; then the material for the proxy terrain, which uses the vertex-color shader; then the intensity per texture layer; and finally the proxy tree prefabs.

    Here is the TerrainProxy script:
    Code (CSharp):
    using UnityEngine;
    using System.Collections;
    using System.Collections.Generic;

    public class TerrainProxy : MonoBehaviour
    {
        public float DistancePerVertex = 1;

        public Material terrainProxyMaterial;

        public float[] layerIntensity;
        public GameObject[] treesPrefabs;

        private int GILayer = 1; //LayerMask.GetMask("GI");

        private Terrain terrain;
        private TerrainData terrainData;

        private Transform proxyTerrainParent;
        private Transform proxyTreesParent;

        private GameObject proxyTerrain;
        private GameObject proxyTrees;

        void Start()
        {
            terrain = GetComponent<Terrain>();
            terrainData = terrain.terrainData;

            GenerateTerrainProxy();
            GenerateTreesProxy();
        }

        private void GenerateTerrainProxy()
        {
            proxyTerrainParent = new GameObject("Terrain Proxy").transform;
            proxyTerrainParent.position = transform.position;

            Vector3 terrainSize = terrainData.size;

            //Calculate the number of increments and vertices on both X and Z axes.
            int numberVerticesX = (int)(terrainSize.x / DistancePerVertex) + 1;
            int numberVerticesZ = (int)(terrainSize.z / DistancePerVertex) + 1;

            int totalNumberVertices = numberVerticesX * numberVerticesZ;

            //Needed variables to construct the mesh
            List<Vector3> vertices = new List<Vector3>(totalNumberVertices);
            List<int> triangles = new List<int>((totalNumberVertices + 2) * 3);
            List<Color> colors = new List<Color>(totalNumberVertices);

            //Calculate the average color of each splat texture
            SplatPrototype[] textures = terrainData.splatPrototypes;
            Color[] textureCenterColor = new Color[textures.Length];
            for (int i = 0; i < textures.Length; i++)
            {
                Texture2D tex = textures[i].texture;

                Color color = Color.black;
                Color[] allColors = tex.GetPixels(); // Get everything at once for faster perf

                for (int c = 0; c < allColors.Length; c++)
                {
                    color += allColors[c];
                }

                color /= allColors.Length;

                textureCenterColor[i] = color * layerIntensity[i];
            }

            //Construct vertices
            for (float x = 0; x < terrainSize.x + DistancePerVertex; x += DistancePerVertex)
            {
                for (float z = 0; z < terrainSize.z + DistancePerVertex; z += DistancePerVertex)
                {
                    float xClamp = Mathf.Clamp(x, 0, terrainSize.x);
                    float zClamp = Mathf.Clamp(z, 0, terrainSize.z);

                    int heightX = (int)((xClamp / terrainSize.x) * terrainData.heightmapWidth);
                    int heightZ = (int)((zClamp / terrainSize.z) * terrainData.heightmapHeight);

                    float height = terrainData.GetHeight(heightX, heightZ);

                    Vector3 vertex = new Vector3(xClamp, height, zClamp);

                    //Get the blended color at this point from the alphamaps
                    int splatX = (int)((xClamp / terrainSize.x) * terrainData.alphamapWidth);
                    int splatZ = (int)((zClamp / terrainSize.z) * terrainData.alphamapHeight);

                    if (splatX >= terrainData.alphamapWidth) splatX -= 1;
                    if (splatZ >= terrainData.alphamapHeight) splatZ -= 1;

                    float[,,] splatWeights = terrainData.GetAlphamaps(splatX, splatZ, 1, 1);

                    Color color = Color.black;
                    for (int i = 0; i < textures.Length; i++)
                    {
                        color += splatWeights[0, 0, i] * textureCenterColor[i];
                    }

                    vertices.Add(vertex);
                    colors.Add(color);
                }
            }

            //Construct triangles
            int vLength = vertices.Count - numberVerticesZ;

            for (int x = 0; x < vLength; x += numberVerticesZ)
            {
                for (int z = 0; z < numberVerticesZ - 1; z++)
                {
                    //Left tri
                    triangles.Add(x + z);
                    triangles.Add(x + z + 1);
                    triangles.Add(x + z + numberVerticesZ);

                    //Right tri
                    triangles.Add(x + z + 1);
                    triangles.Add(x + z + numberVerticesZ + 1);
                    triangles.Add(x + z + numberVerticesZ);
                }
            }

            //Generate the actual geometry
            proxyTerrain = new GameObject("Terrain Proxy");
            proxyTerrain.transform.parent = proxyTerrainParent;
            proxyTerrain.layer = GILayer;

            Mesh mesh = new Mesh();
            mesh.SetVertices(vertices);
            mesh.SetTriangles(triangles, 0);
            mesh.SetColors(colors);
            mesh.Optimize();

            MeshFilter meshFilter = proxyTerrain.AddComponent<MeshFilter>();
            meshFilter.mesh = mesh;

            MeshRenderer renderer = proxyTerrain.AddComponent<MeshRenderer>();
            renderer.material = terrainProxyMaterial;
        }

        private void GenerateTreesProxy()
        {
            proxyTreesParent = new GameObject("Terrain Trees Proxy").transform;
            proxyTreesParent.position = transform.position;

            TreeInstance[] treeInstances = terrainData.treeInstances;

            for (int i = 0; i < treeInstances.Length; i++)
            {
                TreeInstance tree = treeInstances[i];

                GameObject treePrefab = treesPrefabs[tree.prototypeIndex];

                Transform treeProxy = GameObject.Instantiate(treePrefab).transform;

                treeProxy.gameObject.layer = GILayer;

                treeProxy.parent = proxyTreesParent;
                treeProxy.position = transform.position + Vector3.Scale(tree.position, terrainData.size);
                //TreeInstance.rotation is in radians, so convert to degrees for euler angles
                treeProxy.localEulerAngles = new Vector3(0, tree.rotation * Mathf.Rad2Deg, 0);
                //treeProxy.localScale = new Vector3(tree.widthScale, tree.heightScale * terrainData.size.y, tree.widthScale);
            }
        }
    }
    Here is the shader used:
    Code (CSharp):
    Shader "Custom/Terrain Proxy Vertex Shader"
    {
        SubShader
        {
            Pass
            {
                ColorMaterial AmbientAndDiffuse
                Lighting On
            }
        }
        Fallback "VertexLit", 1
    }
    Notes:
    - To be able to read the information on the textures, you need to set the textures to readable in their settings.
    - This script doesn't modify or delete the original Terrain or any tree. The script generates a completely new gameObject with the proxy mesh and instantiates the proxy trees at the correct positions. Then you can set the terrain to be excluded from the GI, and use the proxy objects for the GI voxelization.
    - Another solution for the colors: instead of calculating the average color of each texture and scaling it by an intensity, you could just expose an array of colors in the inspector and use those. Something that could be improved here :)

    If you have any questions, go ahead and ask. Just remember this is a SEGI thread, and I made this to show how to generate a proxy terrain and proxy trees from your terrain.
    It's not perfect, so you need to modify it to your needs and improve it.
     
  37. scheichs

    scheichs

    Joined:
    Sep 7, 2013
    Posts:
    77
    Repost:
    Made a small SEGI test with a script (see above) that enables SEGI's updateGI property for 1 frame every second. This way one can simulate a simple realtime GI bake (called "Static GI" in the demo). Scene 1 has around 350k triangles in the camera frustum, Scene 2 around 50k triangles.
    You can toggle the script functionality on/off to see the difference. On my GTX 970 at 1080p, speed goes up from around 30 to 110 FPS.
    In the demo, only one light is used, for the light sword. All other lights are emissive materials.

    Reposting SEGI test video


    And here's the app to try it out on your system (~ 73MB)
    https://www.dropbox.com/s/ekn0p4laeghetsp/SEGI.zip?dl=0

    This was just a quick test with an asset from the Asset Store (from which I removed the pre-baked lightmaps, around 500 MB in size) and added SEGI and PRISM.
     
    Last edited: Jul 11, 2016
  38. eskovas

    eskovas

    Joined:
    Dec 2, 2009
    Posts:
    1,373
    Is it at all possible to have the voxelization spread out over multiple frames instead of done every frame?

    I don't know how it is implemented behind the scenes, so sorry if I say something that makes no sense :)

    I believe SEGI does its own Camera.Render to render the whole scene at once.
    Would it be feasible, for example, to spread the voxelization over 5 frames? Before that Camera.Render each frame, it would change the near and far clipping planes of the camera to render a specific slice of the scene (see the sketch below).
    For example, the 1st frame does min-20, the 2nd does 20-40, the 3rd does 40-100, and so on. Another tactic would be increments of far/5 per frame, cycling every 5 frames.
    Then the voxelization algorithm would merge in the newly generated voxels.

    Just a thought I had. If I didn't explain it well, or it doesn't make sense for SEGI, let me know :D
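
    To illustrate the slicing idea in isolation (not hooked into SEGI's internals, and only a rough sketch of the suggestion above), cycling a plain Unity Camera's near/far clip planes so each frame covers a different depth band would look roughly like this:
    Code (CSharp):
    using UnityEngine;

    // Rough sketch of the depth-slicing idea above, on a plain Camera (not tied to
    // SEGI's internals). Each frame covers one slice of 0..maxDistance; after
    // sliceCount frames the whole range has been rendered once.
    public class SlicedClipPlanes : MonoBehaviour
    {
        public Camera cam;                // hypothetical: the camera used for voxelization
        public float maxDistance = 100f;  // far end of the region to cover
        public int sliceCount = 5;

        int currentSlice;

        void Update()
        {
            float sliceSize = maxDistance / sliceCount;

            // Restrict rendering to the band belonging to the current slice.
            cam.nearClipPlane = Mathf.Max(0.01f, currentSlice * sliceSize);
            cam.farClipPlane = (currentSlice + 1) * sliceSize;

            currentSlice = (currentSlice + 1) % sliceCount;
        }
    }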
     
    sirleto and RB_lashman like this.
  39. makeshiftwings

    makeshiftwings

    Joined:
    May 28, 2011
    Posts:
    3,350
    Should SEGI be used with SSAO and SSRR? I know the global illumination already gives occlusion and reflections, but do people recommend adding SSAO and/or SSRR as well? I can't decide whether it looks better with them on or not.
     
    RB_lashman likes this.
  40. eskovas

    eskovas

    Joined:
    Dec 2, 2009
    Posts:
    1,373
    Well, it kind of depends on what you want. The reflections SEGI does are of voxels, but if you are OK with reflections that aren't pixel-perfect, then SEGI will work very well. You can see my images above for reference; they don't use SSAO or SSRR. Just remember not to have very highly reflective surfaces, or you'll see the voxels.

    SSAO isn't really needed for the environment, since SEGI already produces a similar effect, but it can be good for grounding characters and other things like small objects. It kind of depends on what you want or are willing to go without.

    The only way to know is to test it. Everything comes down to the needs of each game.

    Check Sonic's new video demonstrating the capabilities of SEGI. He shows things like the reflections and so on:
     
  41. nipoco

    nipoco

    Joined:
    Sep 1, 2011
    Posts:
    2,008
    Fantastic video.
    The fact that you also get sky occlusion and true emissive materials (better than area lights in a lot of circumstances) should be emphasized more. Those are really great features too.
     
    RB_lashman and blueivy like this.
  42. LennartJohansen

    LennartJohansen

    Joined:
    Dec 1, 2014
    Posts:
    2,394
    I did a test video export from an apartment visualization project. Only SEGI is doing the lighting; no Unity lightmaps.
    Added screen-space reflections.

    I miss spot and point lights, but those are on the roadmap.

     
    RB_lashman likes this.
  43. SteveB

    SteveB

    Joined:
    Jan 17, 2009
    Posts:
    1,451
    Love that ProBuilder demo video. So simple, so GI yo... :D
     
    RB_lashman likes this.
  44. LennartJohansen

    LennartJohansen

    Joined:
    Dec 1, 2014
    Posts:
    2,394
    Does baking GI allow me to use these maps as normal baked lightmaps in Unity? That would give a huge quality increase for procedural levels, and for VR it would render with "no extra" cost...

    Lennart
     
    RB_lashman likes this.
  45. local306

    local306

    Joined:
    Feb 28, 2016
    Posts:
    155
    Firstly, this asset is incredible! Money well spent indeed :)

    I had some fun testing out a couple of things as seen below (SE Bloom and SEGI were used):





    I do have a question, though. I'm working with UFPS on my current project. It uses two cameras: one renders the scene, and the other renders the gun (clears depth and draws on top). Does anyone know of a way to apply SEGI to the main camera and then SE Bloom to the second camera when both cameras are set to deferred?

    I came across this older post, which supposedly had a solution to this: http://forum.unity3d.com/threads/de...-posts-ssao-bugged-ufps-onrenderimage.198632/. Unfortunately, I haven't had any luck using the script they mention throughout their thread. Basically, the second camera seems to be clearing all the effects of the first camera (the geometry is still there).

    I'm really hoping to keep the two-camera setup, since I prefer having different FOVs per camera. When I set the gun camera to forward rendering, the main camera's SEGI effects show up, but I cannot get SE Bloom to render on the weapon camera.

    Thanks in advance for the help ;)
     
    RB_lashman likes this.
  46. SteveB

    SteveB

    Joined:
    Jan 17, 2009
    Posts:
    1,451
    BTW @sonicether, would it be reasonable to have just skylighting for the game-world past the extents of the voxel volume? Clearly there's no need for bounce lighting, but it would be nice to have consistent ambient lighting. What currently happens is that while objects indeed take on the color of the sky, it's slightly darker than when lit inside the SEGI volume.

    This perhaps could be faked, but I'm concerned with conflicts with SEGI, so having it all-in-one (e.g. Scion) would be ideal.

    Then again, perhaps voxel cascades could take care of this? :D

    Thanks Sonic!

    -Steven
     
    RB_lashman likes this.
  47. buttmatrix

    buttmatrix

    Joined:
    Mar 23, 2015
    Posts:
    609
    From my experiments with SEGI so far: it definitely took some tuning, and I found that more than just Unity's ambient lighting needs to be disabled in the lighting settings to get proper illumination. Performance isn't ideal, but it definitely looks good. I should note that this scene uses very high quality settings, contains *only* emissive materials, and has a post-processing stack including Amplify Bloom, SMAA, and color grading.
    Screenshot (210) - Copy2.png
     
  48. chiapet1021

    chiapet1021

    Joined:
    Jun 5, 2013
    Posts:
    605
    Beautiful, @buttmatrix. Mind if I ask what GPU you were running for that screenshot? Also, can you confirm you didn't have any AO post in your stack? The indirect shadows look really nice.
     
    RB_lashman likes this.
  49. buttmatrix

    buttmatrix

    Joined:
    Mar 23, 2015
    Posts:
    609
    Thanks! I am running a factory 970, i5 processor. No problem, I've posted images of my SEGI and lighting settings plus my effects stack (see below).

    I will admit that the connection between SEGI and SESSAO did not occur to me until after working with SEGI. Without SEGI, you can exaggerate SESSAO a lot and get a very similar image with much better performance. As you indicated, SEGI is really packing in the occlusion.
     

    Attached Files:

    • U1.PNG
      U1.PNG
      File size:
      507.5 KB
      Views:
      1,173
    • U2.PNG
      U2.PNG
      File size:
      308.3 KB
      Views:
      1,242
    • U3.PNG
      U3.PNG
      File size:
      286.8 KB
      Views:
      1,164
    chiapet1021, SteveB and RB_lashman like this.
  50. IronDuke

    IronDuke

    Joined:
    May 13, 2014
    Posts:
    132
    This guy raises a good point. @sonicether How easy will it be to get SEGI working when you use multiple cameras? My space game uses that to get high-res shadows in the cockpit while still seeing things at a distance. Would one be able to simply apply it to the top camera, and everything then gets the GI?

    --IronDuke