Discussion in 'Assets and Asset Store' started by LouskRad, May 3, 2018.
GPUI is meant to increase the FPS. You should be able to fix that by enabling GPUI
Hello, I'm using the NoGameObject way; it works well, but it does this (in GPUInstancerAPI.InitializeWithMatrix4x4Array):
// release previously initialized buffers
// initialize buffers
- my use case is new procedural terrain chunks getting added and removed (think minecraft)
- so currently, I just rebuild the whole Matrix4x4 array (one per prototype) whenever I want to add an additional set of matrices. Is there a way (an API) to do a partial update of the Matrix4x4 array? Or does that not make sense, and should I discard and rebuild it all like I currently do?
- I've found a way to remove the ones I don't need anymore: _gpuInstancerManager.RemoveInstancesInsideBounds(bounds, 0);
Is it efficient? Is there a better way?
I just installed v0.9.7 with Unity 2018.2.14f1 and I still only get about 90 FPS, as you can see here: https://ibb.co/NsKRFM2 - here is another screenshot of my GPUI settings: https://ibb.co/MCq4fdw - any idea?
You can use the UpdateVisibilityBufferWithMatrix4x4Array method to do a partial update. You cannot change array sizes while using this, but if you zero out the matrices for the parts of the array you no longer need, the corresponding instances will effectively be culled. If you initialize with a large enough array and leave the unused indexes as zero, you can update without re-initialization.
RemoveInstancesInsideBounds is designed to be efficient and does the operation directly on the GPU. However, if what you need to do is update the instance data, it would be more effective to use the update-array strategy.
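To illustrate the update-array strategy, here is a minimal sketch for the Minecraft-style chunk case. The exact UpdateVisibilityBufferWithMatrix4x4Array signature may differ in your GPUI version, and the manager/prototype references and method names here are illustrative:

```csharp
using UnityEngine;
using GPUInstancer;

public class ChunkInstanceUpdater : MonoBehaviour
{
    public GPUInstancerPrefabManager manager;       // illustrative reference
    public GPUInstancerPrefabPrototype prototype;   // illustrative reference

    // Pre-allocated with headroom; unused slots stay Matrix4x4.zero (effectively culled).
    private Matrix4x4[] _matrices = new Matrix4x4[10000];

    public void AddChunkInstances(Vector3[] positions, int startIndex)
    {
        for (int i = 0; i < positions.Length; i++)
            _matrices[startIndex + i] = Matrix4x4.TRS(positions[i], Quaternion.identity, Vector3.one);

        // Partial update: same array size, only the contents changed.
        GPUInstancerAPI.UpdateVisibilityBufferWithMatrix4x4Array(manager, prototype, _matrices);
    }

    public void RemoveChunkInstances(int startIndex, int count)
    {
        for (int i = 0; i < count; i++)
            _matrices[startIndex + i] = Matrix4x4.zero; // zeroed matrix => instance is culled

        GPUInstancerAPI.UpdateVisibilityBufferWithMatrix4x4Array(manager, prototype, _matrices);
    }
}
```

The key point is that the array is allocated once with room to grow, so re-initialization is never needed while chunks come and go.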
btw I also have this error in the console:
NullReferenceException: Object reference not set to an instance of an object
GPUInstancer.GPUInstancerManager.Awake () (at Assets/GPUInstancer/Scripts/Core/Contract/GPUInstancerManager.cs:96)
GPUInstancer.GPUInstancerPrefabManager.Awake () (at Assets/GPUInstancer/Scripts/GPUInstancerPrefabManager.cs:28)
V-Sync may be limiting your scene. Can you please check
Edit -> Project Settings -> Quality
and make sure that V-Sync is turned off:
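You can also disable V-Sync from a script while profiling; this uses the standard Unity QualitySettings API:

```csharp
using UnityEngine;

public class DisableVSync : MonoBehaviour
{
    void Awake()
    {
        QualitySettings.vSyncCount = 0;   // 0 = don't wait for vertical sync
        Application.targetFrameRate = -1; // -1 = uncapped frame rate
    }
}
```

Keep in mind this only removes the FPS cap for measurement; it doesn't change what GPUI renders.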
If you're getting this error, it means GPUI is not running. Did you move any file? You seem to be getting the error because GPUI can't find:
If these don't solve the issue, please mail us at firstname.lastname@example.org with a screenshot of the folder structure of GPUI, a Profiler screenshot while you are running the scene, and any other warnings/errors you might be getting.
The 0.9.7 version update is live on the Asset Store. In this update, we have made various bug fixes and introduced a number of new features:
First off, we have added a new demo scene that is intended to showcase how you can use custom Compute Shaders to extend the GPUI core system:
This is an advanced feature, and you can use the scripts in this scene as a guideline to implement your own Compute Shaders.
Next, you can now define custom layers that you wish to use for the texture detail prototypes in the Detail Manager. The managers will still use the layers that are defined for the prefabs, that is unchanged.
On this point, you can also use layers to increase performance by excluding them from unnecessary components. For example, if you are using projectors for water caustics (as AQUAS does), they can hurt your performance if you do not ignore the layer that GPUI uses. For more information on this, please take a look at:
Moving on, we have also added a new option to the managers to force GPUI to Render with the Selected Camera only. Using this option will increase performance in your scenes if you have additional cameras that you do not wish to use for scene rendering - including those for weather effects, post processing, or water shaders, etc.
Another new option is that you can now use frustum and occlusion culling for shadows. This will improve shadow performance, but depending on your scenes' camera angles, it can also result in slightly more inconsistent shadows. This will be a better option, for example, for top-down camera angles.
Here is the full change log:
New: Added Render With Selected Camera option to limit rendering to one camera
New: Added Detail Texture Layer option to Detail Manager
New: Added support for the internal "Standard (Roughness setup)" shader
New: Added option to use frustum and occlusion culling for shadows
New: Added UpdateVariation API method to update variation buffer at runtime
New: Added a new demo scene that showcases the use of custom compute shaders to extend GPUI
Changed: Managers now show the Unity Archive link to download built-in shaders, if necessary
Changed: Improved the CPU times when handling transform changes
Fixed: Billboards not using the original prefab's Layer
Fixed: SpeedTrees without BillboardRenderer not showing billboard distance options
Fixed: Standard shader error for Unity 2017.3 versions
Fixed: Billboard renderer causing a minor artifact in forward rendering and single-pass VR when there is a CommandBuffer using the depth texture
Fixed: A bug that would prevent builds from Unity 2018.3 beta versions
Sounds good. Do you have any idea for a workaround meanwhile?
I just tested GPUI within my project (went through your getting started guide) https://www.gurbu.com/gettingstarted (added a p
Thanks for the hint. I was able to increase the FPS in the asteroid scene by turning V-Sync off; however, I haven't been so lucky integrating GPUI into my project. I am using this asset: https://assetstore.unity.com/packages/tools/modeling/runtime-level-design-52325 which allows me to place/edit prefabs at runtime. I assigned my prefabs as prototypes in the GPUI Manager and activated frustum culling. Unfortunately, if I position the scene window next to the game window, I don't notice that frustum culling is active (in my scene window all objects are constantly showing) - is it possible that GPUI isn't compatible with RLD, or am I missing something here?
Thanks for your help so far
Thank you for your answer.
What seems weird is that the draw order seems different at each frame. Even if the camera is not moving, the grass is blinking. Is there a way to make it consistent between frames?
Are there plans for an LWRP version of the foliage shader?
Any suggestion on how to implement the OIT?
Any advice on how to workaround the issue?
Thank you for your help!
A suggestion: could we get a specialized Detail Manager for custom (mesh-based) terrains?
I'm pretty sure you've got some tricks and optimizations for the vegetation/grass use case that would apply to custom mesh terrains!
Something unrelated:
I think I've found a very efficient trick that may benefit others: store the ID of the prototype in the matrix m33.
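For reference, a minimal sketch of the m33 trick. This assumes your own instance-building code and a custom shader that reads the value back; GPUI itself does not define this convention, and the helper names are illustrative:

```csharp
using UnityEngine;

public static class PrototypeIdMatrix
{
    // For an affine TRS matrix, m33 is always 1 and otherwise unused,
    // so it can carry a per-instance payload such as a prototype ID.
    public static Matrix4x4 WithPrototypeId(Matrix4x4 trs, int prototypeId)
    {
        trs.m33 = prototypeId; // read back in the shader from the instance matrix
        return trs;
    }

    public static int GetPrototypeId(Matrix4x4 m)
    {
        return (int)m.m33;
    }
}
```

Note the caveat: anything that relies on m33 being 1 (e.g. treating the matrix as a standard transform on the CPU side) would need the value reset first.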
@LouskRad I noticed that in this update, you can specify to only render with a certain camera. Can you also specify to render with certain cameras (plural) but not others for things like split screen? And can you add and remove cameras dynamically at run time?
@LouskRad Getting this error, any idea?
NullReferenceException: Object reference not set to an instance of an object
GPUInstancer.GPUInstancerRuntimeData.<GenerateLODsFromLODGroup>m__0 (UnityEngine.Renderer r) (at Assets/GPUInstancer/Scripts/Core/DataModel/GPUInstancerRuntimeData.cs:243)
System.Collections.Generic.List`1[UnityEngine.Renderer].AddEnumerable (IEnumerable`1 enumerable) (at /Users/builduser/buildslave/mono/build/mcs/class/corlib/System.Collections.Generic/List.cs:128)
System.Collections.Generic.List`1[UnityEngine.Renderer]..ctor (IEnumerable`1 collection) (at /Users/builduser/buildslave/mono/build/mcs/class/corlib/System.Collections.Generic/List.cs:65)
System.Linq.Enumerable.ToList[Renderer] (IEnumerable`1 source)
GPUInstancer.GPUInstancerRuntimeData.GenerateLODsFromLODGroup (UnityEngine.LODGroup lodGroup, GPUInstancer.GPUInstancerShaderBindings shaderBindings, Boolean createShadowMPB) (at Assets/GPUInstancer/Scripts/Core/DataModel/GPUInstancerRuntimeData.cs:243)
GPUInstancer.GPUInstancerRuntimeData.CreateRenderersFromGameObject (GPUInstancer.GPUInstancerPrototype prototype, GPUInstancer.GPUInstancerShaderBindings shaderBindings) (at Assets/GPUInstancer/Scripts/Core/DataModel/GPUInstancerRuntimeData.cs:220)
GPUInstancer.GPUInstancerPrefabManager.InitializeRuntimeDataForPrefabPrototype (GPUInstancer.GPUInstancerPrefabPrototype p, Int32 additionalBufferSize) (at Assets/GPUInstancer/Scripts/GPUInstancerPrefabManager.cs:359)
GPUInstancer.GPUInstancerPrefabManager.InitializeRuntimeDataRegisteredPrefabs (Int32 additionalBufferSize) (at Assets/GPUInstancer/Scripts/GPUInstancerPrefabManager.cs:349)
GPUInstancer.GPUInstancerPrefabManager.InitializeRuntimeDataAndBuffers (Boolean forceNew) (at Assets/GPUInstancer/Scripts/GPUInstancerPrefabManager.cs:311)
GPUInstancer.GPUInstancerManager.OnEnable () (at Assets/GPUInstancer/Scripts/Core/Contract/GPUInstancerManager.cs:142)
Well, if you are using pinned terrains, you can manually add managers on them without using the MapMagic Integration interface.
We haven't tested GPUI with that asset, but if it distributes prefab instances, it should work with GPUI.
We don't plan to introduce a LWRP version of the foliage shader in the near future, but you can use any LWRP grass shader by using custom material in the manager.
Transparent, tessellated or geometry shaders would not work; but they also wouldn't be recommended for grass for performance reasons.
We are working on improvements for multiple camera scenarios. We'll keep you updated.
Looks like you get the error from a prototype with an LOD Group on it. Do you have an LOD renderer on this without a mesh renderer, mesh filter or mesh?
I'm having a tough time deciding on whether I should buy this or not. It looks amazing for terrains with grass and I congratulate you on making such an efficient tool, but will this asset help me boost my project's fps if I use vegetation studio to spawn my grass/trees/objects on a mesh terrain?
Also, do you have any planned release date for your impostors addition to this asset?
Another simple idea to prevent culling mesh renderer > shadow casters. Not sure if that would work for your system.
Think about a basic building containing perhaps 2 materials and baked into 3 meshes.
This building has e.g. 300 prefabs attached that make the building look HD.
Now we can be assured that if the building is in the frustum, all the other 300 HD prefabs are as well (overdraw ignored).
So there is no need to test the 300 prefabs for culling.
What I have in mind is: create a scene GO containing the main building, then drop all prefabs into it as children and define them as cull = based on parent, with cast shadows true. Would that be suitable?
Hi there, and thanks.
GPUI already includes a billboard system, and you can use custom impostors with this system as well. If you mean Amplify Impostors, you can currently use them with GPUI as well. You can take a look at this post to see how:
GPUI and Vegetation Studio will not work for the same objects. For more information, you can take a look at this post:
GPUI does all the culling operations on the GPU, and the tests are based on the instance matrices rather than their GO hierarchies. A GO-based test would not be ideal as a generic solution - implementing this in such a way that it works on the GPU would have its own issues that would cause overhead.
The new demo is great, but as shown in the screenshot above, is there any way to give each instance a working collider?
Hi, I'm using the Unity Standard Specular shader for my corals, but found the cutout effect is completely gone in a built game:
They all look normal in the editor.
I'm using Unity 2018.2.16f1.
This issue is only found in builds, not in the editor.
Any workarounds or fixes we can use?
Edit: Also, the normal map and detail map are wrong compared to the real version:
That demo is intended to show how you can implement your own Compute Shaders with GPUI. The example compute shader in the demo only creates boids behavior for the instances and moves them in this manner. The movement is created directly in the Compute Shader. For the instances to detect collisions, you would need to implement that in the movement logic of the Compute Shader that you will be using.
Also, that example scene does not instantiate any GameObjects and uses GPUI with Matrix4x4s; you can't add Colliders to these instances as they are in the demo.
You can solve this issue by changing the material on your prefab to use "GPUInstancer/Standard (Specular setup)" instead of the "Standard (Specular setup)".
The issue there is related to the way Unity handles shader variants. GPUI uses its own version of the Standard Specular shader, and switches to this shader at play mode. This shader is in the /Resources folder, so it enters the build. However, since you are using the cutout variant of the original shader, and Unity does not see the GPUI version of the variant on the prefab, it ignores this variant for the GPUI version during build time.
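The usual fix is simply selecting the shader in the material inspector as described above; if you prefer to do it programmatically, a sketch using the standard Shader.Find API with the shader name mentioned above (the material field is illustrative):

```csharp
using UnityEngine;

public class UseGPUIShaderVariant : MonoBehaviour
{
    public Material coralMaterial; // the material on your prefab (illustrative)

    void Awake()
    {
        // Use the GPUI copy of the shader so the cutout variant is included in builds.
        // GPUI's shaders live under Resources, so Shader.Find can locate them in a build.
        Shader gpuiShader = Shader.Find("GPUInstancer/Standard (Specular setup)");
        if (gpuiShader != null)
            coralMaterial.shader = gpuiShader;
    }
}
```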
Thanks for the solution!
And about UBER compatibility: I found that the models using the UBER shader also have the same issue.
Is there a GPUI version of the UBER shaders, or are they not supported right now?
Oh, I like this version a lot more than the real version. The real version looks too overdone.
We haven't tested GPUI with the UBER shader asset. However, if you see that UBER works with GPUI before the build, then you can use the same solution and use the converted shader on the prefab (the one that starts with GPUInstancer/) instead of the original one.
GPUI auto-converts and creates a copy of the original shaders. These shaders always start with "GPUInstancer/" followed by the original shader name.
Great, I'll try that! Thank you very much!
Seems we need a new detail texture for the basalt rock
Hello, after adding the GPUI Prefab Manager to my scene, the batch count increased to 1625 from 300 or so. I noticed that GPUI disables GPU instancing on custom shaders which are used by scene prefabs, mainly 3DForge building prefabs and SpeedTrees etc.
Edit: After further testing I notice that though GPUI decreases the tris count, it increases draw calls.
Edit 2: Okay, so my scene setup is that I divided the terrain into chunks, and each terrain has trees and grass on it. So GPUI generates detail instancing and tree instancing for every terrain; due to that, draw calls increased to 1600, as every instance of the Tree Manager generates approx. 125 draw calls.
Edit 3: Sorry, it was my fault; as mentioned above, draw calls increased due to the number of terrains and the Detail Managers they generate. Also, when I am using 3DForge assets with the GPUI Prefab Manager, it doubles their draw calls but reduces the tris count etc.
Edit 4: Okay, about 3DForge content: as every prefab contains multiple submeshes, I think GPUI generates 1 draw call for each, due to which draw calls are getting increased.
Just reporting some friction I had using the latest GPU Instancer with the latest LWRP (4.3.0, on Unity 2018.3.0b12); I hope these will be fixed in a future version.
I followed the getting started guide and the generated replacement shader has these errors:
Comparing the include rules between GPU Instancer's shader and the custom shader code from Shader Graph, I can see the include paths are incorrect, so I fixed them manually.
BUT the mesh and shadow are still missing, even though they are shown with "Instancing active with ID: XXX" etc. in play mode. I can't see any trace of these objects in the Frame Debugger, even though they are present in the scene hierarchy.
I then tried all the included demos and I can't get any of them to render in play mode: some of them did cast shadows, but none of them rendered mesh and material.
Until I realized GPU Instancer might have trouble figuring out which is the main camera (even though I have only 1 camera in the scene): I am not sure whether this has anything to do with LWRP using a custom camera editor, but I fixed this issue by selecting the camera manually.
So color me a bit confused as a new user; I hope this helps others.
Also a quick question: I have never used DrawMeshInstancedIndirect directly - is it normal for all objects rendered by it to "disappear" when you pause in play mode? It appears to be the case with GPU Instancer on Unity 2018.3.0b12; I can "step" to the next frame and they appear again...
(EDIT: the meshes also disappear from the game view and Frame Debugger when the scene view is visible...)
Thanks @LouskRad!
Yes, we found some prefab errors! But we are still getting tons and tons of batches... maybe we need a different approach to the small garden we are making...
An increasing batch count usually means that there are prototypes with very few instances. The most common mistake is to render everything with GPUI even when some prefabs have very low instance counts. GPUI works best with prefabs that have high instance counts.
We are in the process of preparing a Best Practices documentation to explain how to get the most out of GPUI.
But the main part to keep in mind is to have only the prefabs with high instance counts rendering with GPUI, and not to create multiple prototypes for prefabs that have the same mesh/materials.
Latest LWRP/HDRP version we have tested was 3.0.0. We will make the necessary changes for 4.3.0 in the next update.
GPUI currently does not make rendering calls while the editor is paused. The editor's behaviour changes with different Unity versions, so you might see different results when the editor is paused.
We are working on a system that can continue rendering in pause mode, and also can work with Scene View camera.
Hi, I've got some serious problems getting started with Unity 2017.2.0f3: no asteroids and very low FPS after import.
But first I got an error in the SpeedTree part [Cannot implicitly convert type `UnityEngine.LOD' to `LOD'],
and after leaving that aside, I got the error in the picture below. Any ideas how to fix it? Thank you in advance.
To fix the problem in the screenshot, you can make the following fix:
In GPUIStandardInclude.cginc, change line 51 from
#if UNITY_VERSION < 201714 || UNITY_VERSION == 201730 || UNITY_VERSION == 201731
to
#if UNITY_VERSION < 201714 || UNITY_VERSION == 201730 || UNITY_VERSION == 201731 || UNITY_VERSION == 201720 || UNITY_VERSION == 201721 || UNITY_VERSION == 201722
Looks like the Unity developers couldn't decide whether they should use half or float; they kept changing it in each Unity version.
We will add this fix to the next update.
About the LOD error: you probably have another script that overrides Unity's LOD class. It would be better to put it under a namespace so that it doesn't conflict with the default LOD class. Like, for example:
namespace MyNamespace
{
    public class LOD { ... }
}
Thx; the scene view render isn't a deal breaker for me, but it's strange that as long as the scene view is visible, the game view render also stops working. I guess there is some underlying renderer sharing I am not aware of. (EDIT: I double-checked and it seems this wasn't the case; I was using the Frame Debugger, and GPUI behaves strangely in the game view when toggling it on/off.)
And I realized the auto-select camera issue is that somehow my camera object was not tagged with MainCamera.
It would be great if there were an easy way to switch a material's shader from Standard to LWRP, as Unity's own shader upgrader can't "upgrade" them to LWRP.
And if it's not too much to ask, perhaps you could move the GPU Instancer menu item under the Tools menu too.
Would this work alright with something like World Streamer?
What kind of modifications would need to be made to have this work with streaming loads across large worlds with multiple additive async scenes/terrains/objects?
I managed to reproduce my issue again, it doesn't always happen, so it's driving me a bit nuts.
Simply switching tabs between the Profiler and scene view while in play mode triggers this issue; I believe the visibility of the scene view window is causing this, but I have no idea why.
(Using Unity 2018.3.0b12 and LWRP 4.3.0 with the latest GPUI; if I close the scene view and re-open it again, the issue is gone. Crazy.)
Thank you, it works!
I have another question. I have a really huge scene. Is there a way to increase the max view distance of the billboards?
Thank you in advance
Sounds practical. We will do this in the next update.
We have not tested GPUI with World Streamer ourselves, but we have received feedback from some users who use GPUI with World Streamer.
GPUI supports adding, removing and updating the transforms of instances at runtime, so if you use a streaming asset that simply instantiates and destroys/disables GameObjects while streaming your game world, there is a good chance they can work together with GPUI.
GPUI also works with additive scenes, so loading a scene at runtime effectively means that its prefab instances are instantiated.
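As a sketch of that runtime add/remove flow - the GPUInstancerAPI.AddPrefabInstance/RemovePrefabInstance calls below reflect the API as I understand it and may differ in your GPUI version, and the streamer hook names are purely illustrative:

```csharp
using UnityEngine;
using GPUInstancer;

public class StreamedChunkRegistrar : MonoBehaviour
{
    public GPUInstancerPrefabManager prefabManager; // illustrative reference

    // Call this when the streaming system instantiates a chunk's prefab instance.
    public void OnChunkObjectLoaded(GPUInstancerPrefab instance)
    {
        GPUInstancerAPI.AddPrefabInstance(prefabManager, instance);
    }

    // Call this before the streaming system destroys/disables the instance.
    public void OnChunkObjectUnloaded(GPUInstancerPrefab instance)
    {
        GPUInstancerAPI.RemovePrefabInstance(prefabManager, instance);
    }
}
```

The idea is simply to mirror the streamer's load/unload events into GPUI registrations, so the instancing buffers always match what is streamed in.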
Looks like a Unity beta glitch, but otherwise I've got nothing as well
Well, the max billboard distance is tied to the Max Tree Distance property of the manager if you are using the Tree Manager - or, if you are using the Prefab Manager, it is tied to the Max Distance property of the prototype. The Tree Manager limits you to 2500, whereas the Prefab Manager limits you to 10000.
If you need more visibility distance, you can edit the limitations of the corresponding managers from the following lines in GPUInstancerEditorConstants.cs:
[line: 271] public static readonly float MAX_TREE_DISTANCE = 2500;
[line: 273] public static readonly float MAX_PREFAB_DISTANCE = 10000;
Using this with SteamVR and ~2500 trees in my scene. Getting phenomenal performance increases, except when Amplify Occlusion (an ambient occlusion solution) is enabled. Any idea why, or what we can do to get both AO and GPUI working together?
EDIT: Narrowed it down to Temporal Filtering, which is needed to make AO look good in both eyes.
Is there a minimum number of instances to put in GPUI - like starting at 10, 20, 50, 100...?
Thank you!! That's what I wanted - looks very nice now.