[RELEASED] GPU Instancer

Discussion in 'Assets and Asset Store' started by LouskRad, May 3, 2018.

  1. itsnottme

    itsnottme

    Joined:
    Mar 8, 2017
    Posts:
    129
    Hey, I just bought the asset and I am getting an error "CameraType does not contain a definition for Preview" in GPUInstancerPreviewDrawer line 85.
    I am using Unity 2019.1
     
  2. GurhanH

    GurhanH

    Joined:
    Apr 24, 2017
    Posts:
    539
    Hi there,

    The shader is included in the package under GPUInstancer/Shaders/Internal-DepthNormalsTexture_GPUI.shader.

    If you are using an older version of GPUI, please update from the Asset Store Window and import the updated package to your project.

    GPU instancing will help improve performance regardless of the rendering path, so you should decide on the path according to your target platforms and graphics. With GPUI, the important part is to use it for distinctly repeating prefabs (prefabs with high instance counts). Please read the Best Practices for further information on this.
     
  3. GurhanH

    GurhanH

    Joined:
    Apr 24, 2017
    Posts:
    539
    Hi there,

    It looks like you have a class called CameraType in your project that conflicts with UnityEngine.CameraType.

    The best solution would be to rename this class or move it into its own namespace, so that it does not conflict with UnityEngine.

    Another option would be to change the GPUInstancerPreviewDrawer code so that CameraType.Preview is referenced as UnityEngine.CameraType.Preview, as in the sketch below.
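
    For illustration, a minimal sketch of that fully qualified reference (the surrounding helper method is hypothetical, not the actual GPUInstancerPreviewDrawer code):

    Code (CSharp):
    using UnityEngine;

    public static class PreviewCameraCheckExample
    {
        // Hypothetical helper: fully qualifying the enum ensures the UnityEngine
        // type is used instead of a project-local class that is also named CameraType.
        public static bool IsPreviewCamera(Camera cam)
        {
            return cam.cameraType == UnityEngine.CameraType.Preview;
        }
    }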

    That said, I recommend the first solution, because shadowing UnityEngine classes might also cause problems in other parts of your project or in other assets.
     
    itsnottme likes this.
  4. itsnottme

    itsnottme

    Joined:
    Mar 8, 2017
    Posts:
    129
    Thank you for the quick response, that fixed the issue. I am getting another error, though, when using the scene prefab importer.
     

    Attached Files:

    • 2.PNG
  5. GurhanH

    GurhanH

    Joined:
    Apr 24, 2017
    Posts:
    539
    Thank you for reporting the issue. We are aware of this problem with the prefab importer, and an update will be available in the store in the next few days. You can contact us at support@gurbu.com with your invoice to obtain the update package before the Asset Store release.
     
  6. SickaGames1

    SickaGames1

    Joined:
    Jan 15, 2018
    Posts:
    1,270
    What is the difference between VS Pro and GPU Instancer from a speed standpoint?
     
  7. LouskRad

    LouskRad

    Joined:
    Feb 18, 2014
    Posts:
    904
    Hi there,
    and thank you for your interest in GPUI.

    In terms of performance, we only have comparison data with Unity's default instancing system which you can check out from here: https://gurbu.com/performance
     
  8. J_k_Wilding

    J_k_Wilding

    Joined:
    Oct 8, 2017
    Posts:
    1
    Greetings, I have a quick question before I buy. I'm currently using Amplify Shaders + Impostors in my render pipeline. Does GPU Instancer have any conflicts with their assets, and can they be used together?

    thanks
    -J
     
  9. PawelGruntowski

    PawelGruntowski

    Joined:
    Dec 6, 2013
    Posts:
    7
    Hi,
    is there a way to have occlusion culling in VR with separate cameras for the right and left eye?
     
    Last edited: May 8, 2019
  10. LouskRad

    LouskRad

    Joined:
    Feb 18, 2014
    Posts:
    904
    Hi there,
    and thank you for your interest in GPUI.

    GPUI does not have any known conflicts with either Amplify asset. In fact, the included foliage shader was written using ASE. GPUI will auto-create a copy of the original ASE shader with the required setup and keep track of shader changes, but if you want to manually edit the original shader to make it compatible with GPUI, you can also see this wiki document.

    As for Amplify Impostors, you can use it as a Custom Billboard in your prototype's Billboard Generator, or you can simply add the impostor as an LOD level to your prefab. For more information on using GPUI with Amplify Impostors, you can take a look at this wiki document.
     
  11. LouskRad

    LouskRad

    Joined:
    Feb 18, 2014
    Posts:
    904
    Hi there,

    Out of the box, GPUI works with a single camera using "both eyes" for rendering in VR (this is the case for both single pass and multi pass rendering). The occlusion culling feature is supported for these setups, and works for both the right and the left eye.
     
  12. MadAboutPandas3

    MadAboutPandas3

    Joined:
    Jul 3, 2017
    Posts:
    29
    Hi,

    Thanks for the answer. I tried your demo scenes. The standard shader is supported; we use it in other projects. It looks like the hardware is not capable. :-(

    Kind Regards,
    Chris
     
  13. PawelGruntowski

    PawelGruntowski

    Joined:
    Dec 6, 2013
    Posts:
    7
    Hey,
    I have a strange problem with GPU Instancer when I use it in Unity 2017.4.26f1. When the camera is located in the lit zone, GPU Instancer shows some strange half-transparent trees in bright colors at the camera position (attached: GPUI bug 1.PNG, GPUI bug no light.PNG).
     
  14. iddqd

    iddqd

    Joined:
    Apr 14, 2012
    Posts:
    501
    Hi there

    I'm a happy GPUI customer with some info that might be interesting to you as devs:

    I'm using Map Magic Infinite Terrain and was getting annoying hiccups every time a new terrain was generated (100ms). I compared to the Map Magic Island Demo and figured out that my issue was the grass. I was using 5 different 2048px grass textures; when I downsized them to 512px, the hiccups were no longer noticeable.

    This lag seems to be present whether I'm using GPUI or not, so I tried the following:
    1) use 2048 Grass in MM
    2) run the GPUI importer to import the grass
    3) then in the MM Graph, I set all Grass textures to "none" but left all connections there

    The result: The lag is gone and GPUI still generates infinite grass for all newly generated terrains.

    So I'm guessing that when using GPUI, MM should no longer set the grass textures on each terrain.

    As far as I know, you have a one-way MM integration, so this might require a change in MM. It would be great if you could take a look at this for a future update, since I think it will be useful for many MM users.

    Until then, thanks for this great tool.
     
    LouskRad likes this.
  15. LouskRad

    LouskRad

    Joined:
    Feb 18, 2014
    Posts:
    904

    Hi there,
    it looks like the shader you are using for the tree that appears on the camera does not have the required GPU instancing setup. If you are using a custom shader on that tree, you can take a look at this wiki document to see how you can set up that shader for use with GPUI.

    What you would particularly need is the UNITY_VERTEX_INPUT_INSTANCE_ID declaration in the vertex input struct and the UNITY_SETUP_INSTANCE_ID(v); call in the vertex/fragment method (for all passes).
     
  16. LouskRad

    LouskRad

    Joined:
    Feb 18, 2014
    Posts:
    904
    Hi there,
    and thank you for the feedback. The integration is indeed one-way and designed not to be intrusive to the Map Magic generator. We will take a look at this for future reference.
     
  17. reggie_sgs

    reggie_sgs

    Joined:
    Jun 29, 2014
    Posts:
    276
    I'm having an issue in one scene that has a large number of spot lights and no directional light. It seems the spot lights are a problem for GPU Instancer as the frame rate drops to about half of what it was before we added instancing. Is there any reason we should see a performance degradation with spot lights? If I disable the spot lights and just add a directional light, the performance is about 20% faster than without any instancing so it appears the scene will benefit from instancing but it just can't handle spotlights.

    Also, 2 other bugs I've found are that nested prefabs are not handled correctly, and that the prefab importer creates a new manager instead of adding to the existing manager in the scene. The first issue cropped up when I noticed some geometry wasn't rendering in one particular scene. It was a fence section, and I discovered that a group of fence pieces had been parented to another fence piece (I guess to make it easier to move everything) and the prefab importer didn't recognize everything correctly. Unparenting the segments fixed the issue.
     
  18. iddqd

    iddqd

    Joined:
    Apr 14, 2012
    Posts:
    501
    By the way, this was with Unity 2018.3
     
  19. sebas77

    sebas77

    Joined:
    Nov 4, 2011
    Posts:
    1,643
    Hello again!

    two new questions:

    1) Would it make sense if I said that GPUI is slower than LWRP/UECS at rendering shadows on an Intel HD 4000?
    2) are opaque objects sorted from front to back to optimize the z tests?

    edit: the first MVP of our game using GPUInstancer https://store.steampowered.com/app/1078000/RobocraftX/
     
    Last edited: May 10, 2019
  20. Master_Indie

    Master_Indie

    Joined:
    Feb 17, 2017
    Posts:
    10
    Hi, thanks for the asset. I'm having a problem, though: I can't get LODs to work.
    When I turn GPU Instancer off, LODs work just fine. But when I turn GPUI on, every instance gets set to LOD0 no matter the distance.

    Edit: I'm using Unity 2018.3.12f1 by the way.
     
  21. Whitebrim

    Whitebrim

    Joined:
    Jul 11, 2017
    Posts:
    17
    Hello, I just tested occlusion culling with the GPU Instancer Prefab Manager on runtime-added objects, and it isn't working. Is there any option to use occlusion culling on dynamic objects (generated after the scene is loaded)? I'm making a game similar to Minecraft.
     
  22. Whitebrim

    Whitebrim

    Joined:
    Jul 11, 2017
    Posts:
    17
  23. Rewaken

    Rewaken

    Joined:
    Mar 24, 2015
    Posts:
    128
    Hello, I am on 2019.1.0f2. When I added GPUI, the editor became terribly slow. After profiling, the editor's application tick completion was taking almost 230 ms every 2-3 seconds. Also, I am building for Android with LWRP.
     
  24. LouskRad

    LouskRad

    Joined:
    Feb 18, 2014
    Posts:
    904
    Hi there,

    The slowdown with multiple spotlights should not be related to GPU instancing. That is, in forward rendering, each spot light amounts to an additional shader pass, causing a slowdown; but this slowdown would apply whether GPU Instancer is on or off, and the instanced scene should still be faster.

    However, in cases where there is a shader error, for example, this could affect GPUI more, since it would suffer more from errors on the GPU.

    If this is not the case and you can recreate the problem in a sample project, we can investigate the issue. You can send the sample to support@gurbu.com

    As for the prefab importer, this is the intended behavior: it is designed to add a new manager to the scene. The importer checks the original prefab, but not the overrides in the scene. It should detect nested prefab instances correctly if they are the same as the original prefab, but GameObject-specific variations would not show. Is it possible that the fence object lost its prefab reference while you were adding it to the building?
     
  25. LouskRad

    LouskRad

    Joined:
    Feb 18, 2014
    Posts:
    904
    Hi,

    1) It would make sense: since the Intel HD 4000 is not a dedicated GPU, executing CPU operations on the GPU does not help much when the GPU and the CPU share the same resources.
    2) GPU instanced objects are not sorted at all, since they are treated as identical during the draw call.

    Congratulations on the release of RobocraftX!
     
  26. LouskRad

    LouskRad

    Joined:
    Feb 18, 2014
    Posts:
    904
    Hi there,
    Are you using LOD Groups? We can help if you can send us the details of your LOD setup (renderers, transforms, child transforms, etc.).
     
  27. LouskRad

    LouskRad

    Joined:
    Feb 18, 2014
    Posts:
    904
    Hi there,

    GPUI's occlusion culling works only on the instances of objects that you define as prototypes on a manager, and it works dynamically on runtime-added instances as well. Also, the Hi-Z occlusion solution works by using the depth buffer and is limited by the precision of the depth texture. You can take a look at this wiki article for more information on how it works.

    For information on how you can add/remove instances at runtime, you can take a look at this document.
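
    As a rough sketch of registering runtime-spawned instances with a Prefab Manager (following the pattern used in the demo scripts; the spawner class, its fields and the placement logic here are only illustrative):

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using GPUInstancer;

    public class RuntimeSpawnerExample : MonoBehaviour
    {
        public GPUInstancerPrefabManager prefabManager;
        public GPUInstancerPrefab prefab; // prefab that is defined as a prototype on the manager
        public int count = 100;

        void Start()
        {
            // Spawn the instances (illustrative placement).
            List<GPUInstancerPrefab> instances = new List<GPUInstancerPrefab>();
            for (int i = 0; i < count; i++)
            {
                instances.Add(Instantiate(prefab, Random.insideUnitSphere * 50f, Quaternion.identity));
            }

            // Register the generated instances to the manager and initialize the manager,
            // so the new instances are rendered (and culled) by GPUI.
            if (prefabManager != null && prefabManager.isActiveAndEnabled)
            {
                GPUInstancerAPI.RegisterPrefabInstanceList(prefabManager, instances);
                GPUInstancerAPI.InitializeGPUInstancer(prefabManager);
            }
        }
    }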
     
  28. Rewaken

    Rewaken

    Joined:
    Mar 24, 2015
    Posts:
    128
    No, I am not using LOD Groups. All I did was import GPUI from the store, and then the editor became extremely slow.
     
  29. LouskRad

    LouskRad

    Joined:
    Feb 18, 2014
    Posts:
    904
    We could not recreate this problem in our tests on 2019.1.0f2. Can you send us a sample project that shows the problem, or describe how to recreate it?
     
  30. sebas77

    sebas77

    Joined:
    Nov 4, 2011
    Posts:
    1,643
    Thank you. But about point 2, doesn't this mean that there could be massive fill rate waste? Or is that taken care of by the GPU occlusion?

    I have profiled the game even more. TBH I don't see much difference between the UECS pipeline and the GPUI pipeline for the shadowing atm; however, with some specific meshes, GPUI makes the GPU spike on the Intel HD 4000. Would you like a project to investigate?
     
    Last edited: May 13, 2019
  31. Whitebrim

    Whitebrim

    Joined:
    Jul 11, 2017
    Posts:
    17
    As you can see in the attached screenshot (Снимок.PNG), Frustum Culling is working perfectly, but Occlusion Culling is not.
     
  32. Whitebrim

    Whitebrim

    Joined:
    Jul 11, 2017
    Posts:
    17
    (attached screenshot of the manager settings: upload_2019-5-13_21-14-24.png)

    Asset version: 1.10
     
  33. lorddanger

    lorddanger

    Joined:
    Aug 8, 2015
    Posts:
    103
    Hello,
    I was testing GPU Instancer on a test project.
    To start, I created a 500x500 terrain and added a bunch of trees.

    I had about 2000 batches. Then I installed GPUI and got it set up, and with GPUI enabled I am getting 4200 batches.
    Disabling the GPUI Tree Manager component in the hierarchy gets the batches back to 2000.

    Is there something I am missing?
     
  34. zoltanBorbas

    zoltanBorbas

    Joined:
    Nov 12, 2016
    Posts:
    83
    Hi There!

    I am attempting to modify a custom shader to get color variations as described in the wiki. Can you please tell me which shader I should try to modify: the original, or the one postfixed with GPUI? I am using the color variations scene to test my attempt. For starters, I created a new sphere with a new material using the original custom shader, then registered that prefab with the GPUI Prefab Manager and switched out the prefab of the color variations script. The odd thing is, when I right-clicked on the instanced prefab's material to show the shader, it was still using the original shader even though the folder has the GPUI-postfixed shader file.

    So, to not waste my time, could you confirm which shader file I should edit?

    Thanks very much!

    UPDATE:

    Never mind, it looks like modifying the original shader does it just fine, and now I have faction colors :)
     
    Last edited: May 14, 2019
    LouskRad likes this.
  35. zoltanBorbas

    zoltanBorbas

    Joined:
    Nov 12, 2016
    Posts:
    83
    Hi There,

    In my game I disable the renderers of enemy units when they are out of sensor range (it is a space RTS). I was wondering what your recommended way is of doing something similar with instanced prefabs?
     
  36. Bamfax

    Bamfax

    Joined:
    Jun 12, 2016
    Posts:
    52
    On building with v1.1.1 (2018.3.0f2) I am getting the error "Assets\GPUInstancer\Scripts\Core\Static\GPUInstancerUtility.cs(886,13): error CS0103: The name 'AssetDatabase' does not exist in the current context".
    I am instantiating a GPUInstancerPrefabManager at runtime. At startup, the scene comes up with no GPUInstancer objects (GPUInstancerPrefabManager or similar), so I am wondering what it tries to get from the AssetDatabase on scene start. The only things present at that time are not-yet-instanced prefabs which have GPUInstancerPrefab on them. When playing in the editor, everything is fine.

    Would you have a fix or hint where to look? Or should I upgrade first?

    Thanks again.
     
    Last edited: May 14, 2019
  37. LouskRad

    LouskRad

    Joined:
    Feb 18, 2014
    Posts:
    904
    GPUI takes over the rendering pipeline up to the point where draw calls happen and mesh information is submitted to the shaders. The rest of the drawing (e.g. how the fragment programs are executed or pixels are discarded) is handled by Unity and the graphics API. That is, GPUI does not do anything in terms of how pixel-level drawing is done.

    Whether UECS or GPUI would be the ideal choice is quite project specific. However, as I mentioned in my previous reply, the spikes on the Intel HD 4000 are expected since it is not a dedicated GPU; bottlenecks will happen between the shared CPU and GPU resources.
     
  38. LouskRad

    LouskRad

    Joined:
    Feb 18, 2014
    Posts:
    904
    Hi there,

    First off, your screenshot shows that the Occlusion Culling feature is enabled in the global manager settings, but disabled for your prototype.

    Other than that, as with most occlusion culling techniques, GPUI's Hi-Z solution is designed to balance speed and accuracy while never culling actually visible objects. What this means is that, to keep the visibility calculations fast, some objects that you would expect to be culled might not be culled (depending on the scene depth information and the depth texture precision).
     
  39. LouskRad

    LouskRad

    Joined:
    Feb 18, 2014
    Posts:
    904
    Hi there.

    Something seems to be going wrong there. Are you getting any errors in the console? Can you give us some info about the trees you are using on the terrain? How many tree prototypes are there, do they have LODs, etc.?
     
  40. LouskRad

    LouskRad

    Joined:
    Feb 18, 2014
    Posts:
    904
    Hi Zoltan,

    GPUI disables Mesh Renderers while working, so (if your sensor range is a diameter) an idea would be to use your sensor range as the max distance for culling your prototypes.
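
    As a rough sketch of that idea (the field names here are assumptions, namely a prototypeList on the manager and a maxDistance culling setting on the prototype, so check them against your GPUI version):

    Code (CSharp):
    using UnityEngine;
    using GPUInstancer;

    public class SensorRangeCullingExample : MonoBehaviour
    {
        public GPUInstancerPrefabManager prefabManager;
        public float sensorRange = 250f; // game-specific sensor range used as the culling distance

        void Awake()
        {
            // Assumed fields: prototypeList on the manager, maxDistance on the prototype.
            // Setting this before initialization culls instances farther away than the sensor range.
            foreach (GPUInstancerPrototype prototype in prefabManager.prototypeList)
            {
                prototype.maxDistance = sensorRange;
            }
        }
    }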
     
  41. LouskRad

    LouskRad

    Joined:
    Feb 18, 2014
    Posts:
    904
    Hi,

    we have fixed this bug in the v1.1.2 update, so upgrading GPUI to the latest version should fix it.
     
  42. lorddanger

    lorddanger

    Joined:
    Aug 8, 2015
    Posts:
    103
    Hi,
    Thanks for the response.
    These are SpeedTrees from the Desktop tree pack.
    In the scene I am using about 5-6 types of trees and 3-4 types of bushes (from the Meadow Environment pack).
    As for how many are added to the terrain's tree list or prototypes, it's 14 (8 types of trees and 6 types of bushes).
     
  43. zoltanBorbas

    zoltanBorbas

    Joined:
    Nov 12, 2016
    Posts:
    83
    Hi LouskRad, thanks for the idea. However, that would not work in a real-time strategy game setting: the visibility of an enemy unit does not depend on its distance from the camera, but on its distance from any friendly unit. The camera is an RTS camera with free movement over the entire scene. I need a way to tell an instance to turn invisible when it is out of sensor range of all the friendly units under player control.

    Is there any API call I could use to cull an instance?
     
  44. sebas77

    sebas77

    Joined:
    Nov 4, 2011
    Posts:
    1,643
    The spikes are weird though because they happen with very specific meshes.

    You didn't answer my question about the fill rate though, which is unrelated to the GPU model. If you render everything with only a draw call, how do you manage the overdraw? How does your hierarchical GPU occlusion culling fit into the pipeline?
     
    Last edited: May 15, 2019
  45. Rewaken

    Rewaken

    Joined:
    Mar 24, 2015
    Posts:
    128
    Hello, I tested again by creating a new project. I imported all the assets I was using before and noticed the same issue, so to debug it I removed the assets one by one. The slowdown in the editor was caused by having Kripto's Realistic Effects v4 and GPUI together; I need to remove either one to fix the issue. It seems weird, because the two assets are not related in any way.
     
  46. shamsfk

    shamsfk

    Joined:
    Nov 21, 2014
    Posts:
    307
    Hello! I have a problem with Tree Creator trees not being affected by the wind when using GPUI.
    I'm on 2019.1, using GPUI with trees from https://assetstore.unity.com/packages/3d/vegetation/trees/trees-variety-72855
    They use Tree Creator with standard TC textures; a WindZone is present and works outside of GPUI. Grass works fine and is affected by the wind, yet the trees are not. Please help, what could I be doing wrong here?
     
    Last edited: May 15, 2019
  47. zoltanBorbas

    zoltanBorbas

    Joined:
    Nov 12, 2016
    Posts:
    83
    Hi LouskRad!

    I have updated GPUI from 0.9.8 to the new version, and now the custom shader with color variations no longer works. I register my buffers just as I did yesterday with v0.9.8:

    public GPUInstancerPrefabManager prefabManager;
    public List<GPUInstancerPrefab> prefabs;
    public string emissionColorBufferName = "emisionColorBuffer";
    public string fresnellColorBufferName = "fresnelColorBuffer";
    public string desolveBufferName = "desolveBuffer";
    public string glowSizeBufferName = "glowSizeBuffer";
    public string glowTilingBufferName = "glowTilingBuffer";
    public string useGlowOpacityBufferName = "useGlowOpacityBuffer";

    private void Start()
    {
        if (prefabManager != null && prefabManager.isActiveAndEnabled)
        {
            for (int i = 0; i < prefabs.Count; i++)
            {
                GPUInstancerAPI.DefinePrototypeVariationBuffer<Vector4>(prefabManager, prefabs[i].prefabPrototype, emissionColorBufferName);
                GPUInstancerAPI.DefinePrototypeVariationBuffer<Vector4>(prefabManager, prefabs[i].prefabPrototype, fresnellColorBufferName);

                GPUInstancerAPI.DefinePrototypeVariationBuffer<float>(prefabManager, prefabs[i].prefabPrototype, desolveBufferName);
                GPUInstancerAPI.DefinePrototypeVariationBuffer<float>(prefabManager, prefabs[i].prefabPrototype, glowSizeBufferName);
                GPUInstancerAPI.DefinePrototypeVariationBuffer<float>(prefabManager, prefabs[i].prefabPrototype, glowTilingBufferName);
                GPUInstancerAPI.DefinePrototypeVariationBuffer<float>(prefabManager, prefabs[i].prefabPrototype, useGlowOpacityBufferName);
            }
        }
    }

    And I set colors as before, right after the prefabs have been instantiated:

    GPUInstancerPrefab GPUIprefabInstance;

    public void SetColor(Player _plyr)
    {
        FindAllRenderers();
        for (int i = 0; i < colorRenderers.Count; i++)
        {
            GPUIprefabInstance = this.colorRenderers[i].gameObject.GetComponent<GPUInstancerPrefab>();

            if (this.GPUIprefabInstance != null)
            {
                GPUIprefabInstance.AddVariation(SceneTransitionManager.Instance.emissionColorBufferName, (Vector4)PlayersManager.Instance.GetPlayerColor(_plyr).ColorEmission);
                GPUIprefabInstance.AddVariation(SceneTransitionManager.Instance.fresnellColorBufferName, (Vector4)PlayersManager.Instance.GetPlayerColor(_plyr).ColorFresnel);
                GPUIprefabInstance.AddVariation(SceneTransitionManager.Instance.desolveBufferName, 0.6f);
                GPUIprefabInstance.AddVariation(SceneTransitionManager.Instance.glowSizeBufferName, this.minGlowSize);
                GPUIprefabInstance.AddVariation(SceneTransitionManager.Instance.glowTilingBufferName, this.glowTiling);
                GPUIprefabInstance.AddVariation(SceneTransitionManager.Instance.useGlowOpacityBufferName, 0.0f);
            }
        }
    }

    And here is the shader code, which also has not changed since it worked with v0.9.8:

    #if SHADER_API_D3D11
        #ifdef UNITY_PROCEDURAL_INSTANCING_ENABLED
            StructuredBuffer<float4> emisionColorBuffer;
            StructuredBuffer<float4> fresnelColorBuffer;
            StructuredBuffer<float> desolveBuffer;
            StructuredBuffer<float> glowSizeBuffer;
            StructuredBuffer<float> glowTilingBuffer;
            StructuredBuffer<float> useGlowOpacityBuffer;
        #endif
    #endif


    void surf( Input i , inout SurfaceOutput o )
    {
        float4 emisionCol = _EmissionColor;
        float4 fresnelCol = _FresnelColor;
        float desolve = _MaskAppearProgress;
        float glowSize = _GlowSize;
        float glowTiling = _GlowTiling;
        float useGlowOpacity = _UseGlowOpacity;

        #if SHADER_API_D3D11
            #ifdef UNITY_PROCEDURAL_INSTANCING_ENABLED
                uint index = gpuiTransformationMatrix[unity_InstanceID];
                emisionCol = emisionColorBuffer[index];
                fresnelCol = fresnelColorBuffer[index];
                desolve = desolveBuffer[index];
                glowSize = glowSizeBuffer[index];
                glowTiling = glowTilingBuffer[index];
                useGlowOpacity = useGlowOpacityBuffer[index];
            #endif
        #endif

        // The rest of the shader code is not mine to share freely, but the local variables
        // are plugged into the right places; since this worked perfectly with v0.9.8, I assume it is correct.
    }

    I cannot figure out what has changed. After hours of trying all kinds of things, like starting the shader from scratch, manually calling the above SetColor subroutine, and stepping through the code, it just doesn't want to work.

    I learned one thing, though: the shader does not skip over the part where it reads the buffers; however, whatever it reads is not the value set in the SetColor subroutine. It is as if the buffer values were 0: the color is black, the glow size is 0, etc. Stepping through the SetColor subroutine showed that all the right values are set (see the attached debugger screenshot).

    I tested the color variations demo, which works just fine; I just do not get why my custom shader would not work as before. I so regret upgrading!

    UPDATE:

    Auto Add/Remove Instances no longer initializes the buffer, which in the case of the color variations demo was done manually, and that is why the demo worked. In v0.9.8 the whole thing was done automatically, and that is why I had not copied the following code from the variation demo script into my solution:

    // Register the generated instances to the manager and initialize the manager.
    if (prefabManager != null && prefabManager.isActiveAndEnabled)
    {
        GPUInstancerAPI.RegisterPrefabInstanceList(prefabManager, goList);
        GPUInstancerAPI.InitializeGPUInstancer(prefabManager);
    }
    Could you please help me with restoring the automated buffer initialization feature? I would prefer not to have to manually register and deregister instances.

    Besides being lazy, I keep getting the following error if any prototype in the manager has a 0 instance count when manually calling RegisterPrefabInstanceList/InitializeGPUInstancer after spawning a prefab instance at runtime:

    ArgumentException: Attempting to create a zero length compute buffer
    Parameter name: count
    UnityEngine.ComputeBuffer..ctor (System.Int32 count, System.Int32 stride, UnityEngine.ComputeBufferType type, System.Int32 stackDepth) (at C:/buildslave/unity/build/artifacts/generated/bindings_old/common/Core/ComputeShaderBindings.gen.cs:78)
    UnityEngine.ComputeBuffer..ctor (System.Int32 count, System.Int32 stride) (at C:/buildslave/unity/build/artifacts/generated/bindings_old/common/Core/ComputeShaderBindings.gen.cs:64)
    GPUInstancer.PrefabVariationData`1[T].InitializeBufferAndArray (System.Int32 count, System.Boolean setDefaults) (at Assets/GPUInstancer/Scripts/GPUInstancerPrefabManager.cs:1175)

    All the prefabs use the same material, and DefinePrototypeVariationBuffer() gets called for all of them at the start of the game as shown above, but not all of them get an instance spawned before calling RegisterPrefabInstanceList/InitializeGPUInstancer manually.

    I was able to circumvent the error by adding an early-return condition, but I am sure this is not the right way to go about it.

    public void InitializeBufferAndArray(int count, bool setDefaults = true)
    {
        if (count == 0)
            return;

        dataArray = new T[count];
        if (setDefaults)
        {
            for (int i = 0; i < count; i++)
            {
                dataArray[i] = defaultValue;
            }
        }
        if (variationBuffer != null)
            variationBuffer.Release();
        variationBuffer = new ComputeBuffer(count, System.Runtime.InteropServices.Marshal.SizeOf(typeof(T)));
    }

    Thanks in advance!
     
    Last edited: May 15, 2019
  48. zoltanBorbas

    zoltanBorbas

    Joined:
    Nov 12, 2016
    Posts:
    83
    Hi LouskRad!

    I do apologize for the many questions I have, but this project is very important to me! I have 2 questions, and to make them easier to understand I have made a short clip to showcase them. I have annotated it with the questions and my observations; do forgive my English, it is not my first language. :(

    https://1drv.ms/v/s!AqDKYfErvs5phNZWS3PTaBPLTW725g

    Please let me know if you need more information to be able to help!!!

    Thanks in advance!
     
  49. Bamfax

    Bamfax

    Joined:
    Jun 12, 2016
    Posts:
    52
    thanks again, that did it.

    One further thing I ran into afterwards: while it runs fine in the editor, in a standalone build GPUInstancer.GPUInstancerAPI.InitializeGPUInstancer throws an ArgumentNullException:

    Code (CSharp):
    ArgumentNullException: Value cannot be null.
    Parameter name: shader
      at (wrapper managed-to-native) UnityEngine.Material.CreateWithShader(UnityEngine.Material,UnityEngine.Shader)
      at UnityEngine.Material..ctor (UnityEngine.Shader shader) [0x00007] in <9b3b6573bfb64614aa7ee01c0905dc79>:0
      at GPUInstancer.GPUInstancerShaderBindings.GetInstancedMaterial (UnityEngine.Material originalMaterial, System.String extensionCode) [0x00077] in \Assets\GPUInstancer\Scripts\Core\DataModel\GPUInstancerShaderBindings.cs:130
      at GPUInstancer.GPUInstancerRuntimeData.CreateRenderersFromMeshRenderers (System.Int32 lod, GPUInstancer.GPUInstancerPrototype prototype) [0x0013a] in \Assets\GPUInstancer\Scripts\Core\DataModel\GPUInstancerRuntimeData.cs:372
      at GPUInstancer.GPUInstancerRuntimeData.CreateRenderersFromGameObject (GPUInstancer.GPUInstancerPrototype prototype) [0x000aa] in \Assets\GPUInstancer\Scripts\Core\DataModel\GPUInstancerRuntimeData.cs:246
      at GPUInstancer.GPUInstancerPrefabManager.InitializeRuntimeDataForPrefabPrototype (GPUInstancer.GPUInstancerPrefabPrototype p, System.Int32 additionalBufferSize) [0x0001d] in \Assets\GPUInstancer\Scripts\GPUInstancerPrefabManager.cs:237
      at GPUInstancer.GPUInstancerPrefabManager.InitializeRuntimeDataRegisteredPrefabs (System.Int32 additionalBufferSize) [0x0005c] in \Assets\GPUInstancer\Scripts\GPUInstancerPrefabManager.cs:227
      at GPUInstancer.GPUInstancerPrefabManager.InitializeRuntimeDataAndBuffers (System.Boolean forceNew) [0x001a6] in \Assets\GPUInstancer\Scripts\GPUInstancerPrefabManager.cs:183
      at GPUInstancer.GPUInstancerAPI.InitializeGPUInstancer (GPUInstancer.GPUInstancerManager manager, System.Boolean forceNew) [0x00001] in \Assets\GPUInstancer\Scripts\API\GPUInstancerAPI.cs:23
      ...
    Stepping through it, this function (at line 98) seems to return null, even though _standardUnityShaders and shaderName seem to be filled correctly:
    Code (CSharp):
    GetInstancedShader(string shaderName, string extensionCode = null)
    ...
    if ...
       return Shader.Find(_standardUnityShadersGPUI[_standardUnityShaders.IndexOf(shaderName)]);
    The shader used is the Standard (Specular setup) shader.

    Edit:
    Shader.Find("GPUInstancer/Standard (Specular setup)") returns null in standalone.
    In Editor it works fine, "(UnityEngine.Shader)GPUInstancer/Standard (Specular setup)" is returned as expected.
     
    Last edited: May 15, 2019
  50. LouskRad

    LouskRad

    Joined:
    Feb 18, 2014
    Posts:
    904
    Hi there,

    You can calculate how many batches GPUI uses for your prototypes by this formula:

    (Total number of submeshes in LOD renderers) * (shadow cascade count + 1)

    So for example a tree in the Meadow Environment pack has 4 LOD levels. 3 of them have 2 submeshes, and 1 of them has 1 submesh - which amounts to a total of 7 submeshes. If you have 4 shadow cascades in your quality settings, this prototype would amount to (7 * (4 + 1)) = 35 batches. This number would be the same for any number of instances.

    If you have 14 prototypes like this, GPUI should be rendering your trees in 35 * 14 = 490 batches. This is of course only for the prototypes rendered by GPUI. If you have screen space effects, reflections, etc. in your scene, these might also increase the batches.

    If you try your terrain in an empty scene, without any effects, the batch counts you see should be around these numbers (with the addition of the skybox, terrain, etc.). If you still see unexpected batch counts, please check whether you have errors in the console.
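
    For reference, the same estimate in code form (a minimal sketch of the formula above; the submesh counts and the cascade count are values you would read from your own prototypes and quality settings):

    Code (CSharp):
    // Estimated GPUI batches for one prototype:
    // (total number of submeshes in LOD renderers) * (shadow cascade count + 1)
    static int EstimateBatchesPerPrototype(int[] submeshCountsPerLOD, int shadowCascadeCount)
    {
        int totalSubmeshes = 0;
        foreach (int submeshCount in submeshCountsPerLOD)
            totalSubmeshes += submeshCount;
        return totalSubmeshes * (shadowCascadeCount + 1);
    }

    // Example from above: LOD submesh counts {2, 2, 2, 1} with 4 shadow cascades
    // -> 7 * (4 + 1) = 35 batches per prototype, and 14 such prototypes -> 490 batches.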