
Is it better to split VFX graphs into multiple instances?

Discussion in 'Visual Effect Graph' started by Qriva, Jan 26, 2021.

  1. Qriva

    Qriva

    Joined:
    Jun 30, 2019
    Posts:
    1,296
    I cannot find a clear answer to my question anywhere.
    There are 10 different independent effects triggered by their own 10 events (OnPlay1, OnPlay2...). Is there any reason (like a performance difference) to put them all into one large graph instead of 10 graphs on different game objects?
     
  2. Qriva

    Qriva

    Joined:
    Jun 30, 2019
    Posts:
    1,296
    @VladVNeykov Sorry to interrupt you, but today we had a debate about VFX organisation in our project, this question came up again, and it would be very helpful to know the answer.

    We want to make many swords in our game (60+) and each of them has its own attack combination, like slash down, slash left, stab and other moves. Each move requires some VFX, but because the effects differ it is hard to find a common interface for all of them. Currently we use a switch node to swap the mesh, rotate the particle or anything else.

    However, this pipeline is not efficient, so we want to make graphs with parameters for each move (stab VFX, slash VFX...) and reuse them, but there are two approaches we can take:
    1. Create a graph asset for each move, then add each desired VFX as a separate Visual Effect component, so if a sword has 5 attacks, there would be 5 separate game objects with visual effects.
    2. Create subgraphs for the moves; each weapon gets its own graph containing the desired subgraphs, and they are triggered by different events. If a sword has 5 attacks, there would be 5 subgraphs inside the SwordXYZ graph.
    My question is: is there a significant difference in performance, memory, rendering, or anything else that could cause trouble in the future?
     
  3. koirat

    koirat

    Joined:
    Jul 7, 2012
    Posts:
    2,068
    I don't know about performance, but what I do know is that there is a problem with VFX sorting:

    If you have multiple effects in one graph, you can modify the Output Render Order.
    When using multiple GameObjects, sorting will be based on each object's distance from the camera.
     
    kizaru00 likes this.
  4. Qriva

    Qriva

    Joined:
    Jun 30, 2019
    Posts:
    1,296
    Yes, this might cause some problems, but in my specific case it might not.
    I see here that there is some overhead for having multiple instances of the same graph, so it looks like it is probably better to have one graph with subgraphs inside. The only thing I don't like that much is that there must be a separate graph for almost every weapon, but it's still better than what we have now.
     
  5. VladVNeykov

    VladVNeykov

    Unity Technologies

    Joined:
    Sep 16, 2016
    Posts:
    550
    Hi @Qriva , yup, there's currently an overhead per VFX component, so the second option sounds more performant (of course, benchmarking it for your case would give the best answer).

    You mentioned you are using switch operators for meshes (I assume Output Meshes); does that work for you? I think the limit is 32 switch cases, so you probably have to branch between two switch operators for all the 60+ swords?

    Do you have other specific parts of your effects (trails, sparks, etc.) which are particularly hard to find a common interface for?

    Also it sounds like you have different swords and unique VFX for different animations. How different are the effects for two swords for the same animation?
     
  6. Qriva

    Qriva

    Joined:
    Jun 30, 2019
    Posts:
    1,296
    In a single graph there are around 3-6 switch nodes for different properties like mesh, color, lifetime... and there are separate graphs for the swords. Actually, good to know there is a hard limit; is it set per graph or per system?

    About the 'common interface' - I think this is not the VFX Graph's fault, this is logistics. For example:
    the same slash animation is used by two different weapons, but we might want to use different VFX, and one of them might require two colors and one texture, while the other needs only one color, a noise texture and a noise scale.
    If I wanted to make one (or several) large master graphs, then these params would need to be set from the start (not dynamically); if I wanted to make the first slash pink and the second slash blue, then I would have to know in C# that such parameters exist, but because these params might differ it becomes overcomplicated or messy.

    But even if there were a way to do this, I would still need multiple outputs, because the mesh is used by all of them, so currently if a previous slash particle is still alive and the mesh is swapped, then both of them are rendered in the same way.
    Additionally, there is no way to disable an output, so if I wanted to use a single event then I would need to make a switch controlling the amount of particles emitted by each system.

    Anyway, thank you for the answer. I guess the second option is the way to do it, but in case it helps you in some way, I will attach an image of one of our first prototype graphs.
     

    Attached Files:

  7. VladVNeykov

    VladVNeykov

    Unity Technologies

    Joined:
    Sep 16, 2016
    Posts:
    550
    No, just per switch operator.


    Thanks! Apologies if you've already thought of this, but are you activating the effect through an event via script? If so, lots of the things you are setting can be set directly via script to avoid having to do the many switches in your graph, something like this (untested code :p):
    Code (CSharp):
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.VFX;

    public class VFXEventAttributeExample : MonoBehaviour
    {
        VisualEffect visualEffect;
        VFXEventAttribute eventAttribute;
        private int weaponID;

        static readonly int eventID = Shader.PropertyToID("SwordAttack");
        static readonly int positionID = Shader.PropertyToID("position");
        static readonly int colorID = Shader.PropertyToID("color");

        void Start()
        {
            visualEffect = GetComponent<VisualEffect>();
            eventAttribute = visualEffect.CreateVFXEventAttribute();
        }

        void StartTrail()
        {
            // Set attributes for each weapon
            switch (weaponID)
            {
                case 0:
                    eventAttribute.SetVector3(positionID, new Vector3(0f, 0f, 0f));
                    eventAttribute.SetVector3(colorID, new Vector3(1f, 0f, 0f));
                    break;

                case 1:
                    eventAttribute.SetVector3(positionID, new Vector3(0f, 1f, 0f));
                    eventAttribute.SetVector3(colorID, new Vector3(0.5f, 1f, 0.5f));
                    break;
            }

            // Sends the event with the attributes over to be inherited
            visualEffect.SendEvent(eventID, eventAttribute);
        }
    }
    Then in your VFX you inherit them without the need for using Switches:


    Kinda, but if you just need to toggle between a few outputs and there is not a humongous amount of particles, you can do something like this:

    (turning off particles per output, only 1 output will show at a time)

    If your output(s) are outputting lots of vertices, you can make this more efficient by enabling Compute Culling in the inspector when selecting the desired output:


    I noticed in your screenshot you are setting Age manually to 0 in Initialize; it's probably left over from prototyping, but I don't think it's necessary.


    Anywho, not sure if any of this would be useful, but hopefully some food for thought :)
     

    Attached Files:

    shibi2017, laurentlavigne and Qriva like this.
  8. Qriva

    Qriva

    Joined:
    Jun 30, 2019
    Posts:
    1,296
    Yes, when we made this graph (I think a few months ago) we didn't know this was possible, but you might see the actual problem here: there must be some place where we store data like angle or lifetime. It can be stored either in the graph or in a C# script, but because the event is limited to common attributes, it made more sense to keep the whole logic inside the graph.

    This is a nice trick! Kind of dumb that I didn't think of this - previously I would use Set Size or something.
    I need to update koirat's thread.
     
    Last edited: Apr 18, 2021
  9. VladVNeykov

    VladVNeykov

    Unity Technologies

    Joined:
    Sep 16, 2016
    Posts:
    550
    If you need it to be asset-driven and easily editable, you can maybe store it in a scriptable object:
    Code (CSharp):
    using UnityEngine;

    [CreateAssetMenu(fileName = "Data", menuName = "ScriptableObjects/SwordData", order = 1)]
    public class SwordData : ScriptableObject
    {
        public string swordName;
        public Vector3 position;
        public Vector3 color;
    }
    This will allow you to have data assets in your project folder which will be easy to add/tweak directly in the inspector. And then in your C#:
    Code (CSharp):
    public class VFXEventAttributeExample : MonoBehaviour
    {
        VisualEffect visualEffect;
        VFXEventAttribute eventAttribute;
        static readonly int eventID = Shader.PropertyToID("SwordAttack");
        static readonly int positionID = Shader.PropertyToID("position");
        static readonly int colorID = Shader.PropertyToID("color");
        public SwordData[] allSwordData; // <<<<< Link the scriptable objects here

    ....

        void StartTrail()
        {
            // Set attributes for each weapon
            switch (weaponID)
            {
                case 0:
                    eventAttribute.SetVector3(positionID, allSwordData[0].position); // <<<<< And use their data here
                    eventAttribute.SetVector3(colorID, allSwordData[0].color);
                    break;
                case 1:
                    eventAttribute.SetVector3(positionID, allSwordData[1].position);
                    eventAttribute.SetVector3(colorID, allSwordData[1].color);
                    break;
            }

            // Sends the event with the attributes over to be inherited
            visualEffect.SendEvent(eventID, eventAttribute);
        }
    }
    I think you should be able to send custom attributes as well. The UX is a bit unpolished, but I think you can do something like this:

    and then inherit the value:

    And then just use it whenever in your graph:

    *note that the location needs to be "current" ("source" means inherited from something - a Spawner, GPU Event, etc.; "current" is just "whatever the value is now")

    Then you only need to send this custom attribute over with the rest of your attributes:
    Code (CSharp):
                case 0:
                    eventAttribute.SetVector3(positionID, allSwordData[0].position); // <<<<< And use their data here
                    eventAttribute.SetVector3(colorID, allSwordData[0].color);
                    eventAttribute.SetVector3("CustomSwordAttribute", allSwordData[0].someCustomValue); // << You can use a string, or do Shader.PropertyToID like with the other attributes and use an int for the attribute name
                    break;
    Even if that doesn't work, you can just send data and store it somewhere else that you are not using, like texIndex, velocity, targetPosition, etc.
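    A rough sketch of that fallback, for illustration (untested; the stashed values and the SwordData fields used here are made up, and the graph side would read them back with Get/Inherit Attribute):
    Code (CSharp):
    // Inside StartTrail(), before SendEvent: borrow standard attributes this effect
    // doesn't otherwise use to carry extra per-weapon data into the graph.
    eventAttribute.SetFloat(Shader.PropertyToID("texIndex"), allSwordData[0].glowIntensity);        // hypothetical field
    eventAttribute.SetVector3(Shader.PropertyToID("targetPosition"), allSwordData[0].slashOffset);  // hypothetical field
    visualEffect.SendEvent(eventID, eventAttribute);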
     

    Attached Files:

  10. Qriva

    Qriva

    Joined:
    Jun 30, 2019
    Posts:
    1,296
    I have never used custom attributes, but I think this cannot solve the problem. Sure, I can send very basic things like direction inside the event, but other things are too specific, I guess.
    It is true that I can store some values in scriptable objects, but the question is why it would be better to do it this way.
    I feel obliged now to explain the approach I have in mind :D

    There are two swords with 3 animations (attacks) in a combo. I would make two separate graph assets like this:
    (These are not real graphs, I made them just now for the sake of example.)
    Sword1.png Sword2.png


    They are made from subgraphs, each representing a single VFX like a slash or stab, and there might be different visual variants of them, like in this example: Simple and Amazing slash. (Let's assume the first one is very basic and the other one has sparkles and a distortion effect. Btw, they are not trails.) Also, there could be a single event and some attack index used in a switch to change the emission of each system, but that does not matter now.

    I could make a scriptable object and store Size or Tilt there, because they should be common to all graphs, but other params are very unique, and I would need an SO with literally all possible combinations to cover them; furthermore, I would need to expose all of them as properties in the graph.
    So I think it is better to create a graph, set all the desired params in the subgraphs, and send only very common properties like character rotation via the event. For sure this will not produce the best possible performance, but it will be maintainable.
    Unless I missed something obvious, I don't know a better way to do this.
     
  11. VladVNeykov

    VladVNeykov

    Unity Technologies

    Joined:
    Sep 16, 2016
    Posts:
    550
    Thanks for the example, @Qriva !
    I see, I guess if there's not much overlap between the different effects, it would be hard to identify and reuse any modular elements (unless you can group them into a few different categories; for example, OnAttack1 and OnAttack3 in your example could reuse SimpleSlash with the same properties). But yes, I don't think you are missing anything obvious, and using the VFX as subgraphs seems like a good workflow.

    The only other question would be how many of the 60+ effects will be playing at the same time and whether the performance gain of having fewer Visual Effect Components will outweigh the complications of setting them all in the same graph as subgraphs.
     
  12. Qriva

    Qriva

    Joined:
    Jun 30, 2019
    Posts:
    1,296
    There will be a few in the scene, but I think only one will be playing at a time. These swords are mostly for the player, and luckily there is only one player :)
     
  13. Voronoi

    Voronoi

    Joined:
    Jul 2, 2012
    Posts:
    584
    This is an interesting discussion to follow. It seems like the basic problem is how to give a player numerous weapon types, each with a unique VFX. In Shuriken this would be trivial: just make a new system for each effect. I definitely sense the overhead of a VFX Graph just from using it in the Editor, so I would be wary of including dozens of graphs in my project.

    Looking at your graphs, my thought would be to try to design a single graph that is customizable into each effect, and use Scriptable Objects, which you can name, to hold the unique settings required. I think it's possible to simplify the variables based on the examples you posted. For example, a gradient instead of 2 colors: just sample parts of the gradient as needed, and if it's one color, make it a one-color gradient. Maybe the same with textures: use a blank texture for an effect that doesn't need one.
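    Just to illustrate the idea, a rough sketch of what I mean (untested; the exposed property names like "ColorOverLife" are only examples and would have to match whatever you expose in the graph):
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.VFX;

    [CreateAssetMenu(menuName = "ScriptableObjects/EffectSettings")]
    public class EffectSettings : ScriptableObject
    {
        public Gradient colorOverLife;   // a one-color gradient if the effect only needs a single color
        public Texture2D detailTexture;  // a blank texture if the effect doesn't need one
        public float size = 1f;
    }

    public class EffectApplier : MonoBehaviour
    {
        public VisualEffect visualEffect;
        public EffectSettings settings;

        // Exposed properties assumed to exist on the single shared graph
        static readonly int gradientID = Shader.PropertyToID("ColorOverLife");
        static readonly int textureID = Shader.PropertyToID("DetailTexture");
        static readonly int sizeID = Shader.PropertyToID("Size");

        void Start()
        {
            visualEffect.SetGradient(gradientID, settings.colorOverLife);
            visualEffect.SetTexture(textureID, settings.detailTexture);
            visualEffect.SetFloat(sizeID, settings.size);
        }
    }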
     
  14. Qriva

    Qriva

    Joined:
    Jun 30, 2019
    Posts:
    1,296
    I mean, it would definitely be possible to build an SO system for the properties, but I can see potential for human mistakes and bugs, and in this setup it would not be nice to use, I guess.

    I need to test something first, but if the only overhead is caused by the lack of "batching" between graphs, then it might actually make sense to split the graphs into separate Visual Effect components in my case.
     
  15. laurentlavigne

    laurentlavigne

    Joined:
    Aug 16, 2012
    Posts:
    6,225
    Jumping onboard: I'm trying to batch a bunch of smoke effects to drop the VFX Graph overhead, but I can't get more than one event at a time in 2020.1.17 / URP 8.31.
    Here is my setup:
    upload_2021-5-11_18-55-16.png
    Code (CSharp):
    using System;
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;

    public class VFXEventEmitter : MonoBehaviour
    {
        public int groupID;
        public float size = .5f, lifetime = .5f;
        public static Dictionary<int, List<VFXEventEmitter>> emitters = new Dictionary<int, List<VFXEventEmitter>>();

        void OnEnable()
        {
            if (emitters.ContainsKey(groupID) == false)
                emitters.Add(groupID, new List<VFXEventEmitter>());
            emitters[groupID].Add(this);
        }

        void OnDisable() { emitters[groupID].Remove(this); }
    }
    Code (CSharp):
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.VFX;

    public class VFXEventController : MonoBehaviour
    {
        VisualEffect visualEffect;
        VFXEventAttribute eventAttribute;
        public int groupID;
        static readonly int eventID = Shader.PropertyToID("smoke");
        static readonly int positionID = Shader.PropertyToID("position");
        static readonly int lifetime = Shader.PropertyToID("lifetime");

        void Start()
        {
            visualEffect = GetComponent<VisualEffect>();
            eventAttribute = visualEffect.CreateVFXEventAttribute();
        }

        void Update()
        {
            foreach (var e in VFXEventEmitter.emitters[groupID])
            {
                eventAttribute.SetVector3(positionID, e.transform.position);
                eventAttribute.SetFloat(lifetime, e.lifetime);
                visualEffect.SendEvent(eventID, eventAttribute);
            }
        }
    }
    Result: only one VFX event emitter emits, always the last one in the emitter list.
    Expected: all VFX event emitters emit.
     
  16. Qriva

    Qriva

    Joined:
    Jun 30, 2019
    Posts:
    1,296
    This is not a bug, it's a feature ;)

    A single VFX can process only one event call per frame; that's one of the reasons why I made this thread.
    I think someone said it will be changed in the future, but I have no clue about the current state or whether there is still a plan to change this behaviour.
     
    laurentlavigne likes this.
  17. laurentlavigne

    laurentlavigne

    Joined:
    Aug 16, 2012
    Posts:
    6,225
    Vlad said 2021 got that fixed.
    Did you end up staggering the particle emission? Not great when the game drops below 60Hz and suddenly a bunch of emitters stop emitting.
     
    Qriva likes this.
  18. Qriva

    Qriva

    Joined:
    Jun 30, 2019
    Posts:
    1,296
    Oh, wait. It's fixed in 2021? Do I understand that correctly? Link or it didn't happen :eek:
    Anyway, I am not sure I understand what you mean. Btw, why don't you upgrade to 2020.3?
     
  19. VladVNeykov

    VladVNeykov

    Unity Technologies

    Joined:
    Sep 16, 2016
    Posts:
    550
    It will be in 2021.2; the PR is ready and awaiting merge.

    Here's the PR in question (I think this repo is public and you should be able to view it, but I might be wrong, in which case you'll just have to trust me :D). To avoid confusion: again, this is slated to go into 2021.2.
     
    Qriva likes this.
  20. laurentlavigne

    laurentlavigne

    Joined:
    Aug 16, 2012
    Posts:
    6,225
  21. Qriva

    Qriva

    Joined:
    Jun 30, 2019
    Posts:
    1,296
    Ah yes... 2020 is horrible when it comes to reload. I mean, 2020.3.4 at least fixed stupid things like the HUGE lag when opening a shader dropdown, but yeah...

    About the smoke - assuming I need to spawn particles in different places, in versions lower than 2021 the only way would be the technique described in the previous link, but I haven't tried it yet. The other way would be a pseudo-pool of VFX graphs: objects would request a single burst of smoke from some controller holding numerous visual effects, and it would keep track of which component was already used during the current frame (see the sketch below).
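    Roughly what I mean by the pseudo-pool, as an untested sketch (the event name "OnBurst" and the setup are just assumptions; it only guarantees that each Visual Effect gets at most one event per frame):
    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.VFX;

    public class SmokeBurstController : MonoBehaviour
    {
        public List<VisualEffect> pool;   // several instances of the same smoke graph
        static readonly int eventID = Shader.PropertyToID("OnBurst");
        static readonly int positionID = Shader.PropertyToID("position");

        int nextIndex;
        int lastFrame = -1;
        int usedThisFrame;

        // Returns false if every pooled component has already received an event this frame
        public bool RequestBurst(Vector3 position)
        {
            if (pool.Count == 0) return false;
            if (Time.frameCount != lastFrame) { lastFrame = Time.frameCount; usedThisFrame = 0; }
            if (usedThisFrame >= pool.Count) return false;

            var vfx = pool[nextIndex];
            nextIndex = (nextIndex + 1) % pool.Count;
            usedThisFrame++;

            var attribute = vfx.CreateVFXEventAttribute(); // could be cached per component to avoid allocations
            attribute.SetVector3(positionID, position);
            vfx.SendEvent(eventID, attribute);
            return true;
        }
    }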
    For constant spawning it is easy to start multiple spawns using a single VFX, for example like this:
    Code (CSharp):
    IEnumerator Start()
    {
        visualEffect = GetComponent<VisualEffect>();
        eventAttribute = visualEffect.CreateVFXEventAttribute();
        yield return null;
        foreach (var e in VFXEventEmitter.emitters[groupID])
        {
            eventAttribute.SetVector3(positionID, e.transform.position);
            eventAttribute.SetFloat(lifetime, e.lifetime);
            visualEffect.SendEvent(eventID, eventAttribute);
            // Wait a frame between events, since a single VFX only processes one event per frame
            yield return null;
        }
    }
    but to my knowledge there is no way to stop only one of them. It is possible to re-enable them, but it cannot be done in a single frame, so there would be a gap in the emission.
    Anyway, as Vlad said, separate VFX graphs are not batched currently, so I am not sure it is worth making overcomplicated systems in such a case.
     
  22. JJRivers

    JJRivers

    Joined:
    Oct 16, 2018
    Posts:
    137
    Another option for customizing which effects get played, instead of switches, would be to use bitmasks as event payloads or properties (a rough sketch below); the basic system design is the issue here, and it's possible to make way more modular effects than most people give it credit for. But in this case, since the OP intends these to be mostly player attacks with only a couple playing at a time, I think the proverbial gun is being jumped and humped mightily here.
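    To make the bitmask idea concrete, a rough sketch (untested; the exposed property name "ActiveEffectsMask" is just an example, and each system in the graph would compare its own bit of the mask to decide whether to emit):
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.VFX;

    public class EffectMaskExample : MonoBehaviour
    {
        [System.Flags]
        public enum AttackEffects { None = 0, Slash = 1 << 0, Stab = 1 << 1, Sparks = 1 << 2 }

        public VisualEffect visualEffect;
        static readonly int maskID = Shader.PropertyToID("ActiveEffectsMask"); // exposed int property in the graph

        public void Play(AttackEffects effects)
        {
            visualEffect.SetInt(maskID, (int)effects);
            visualEffect.SendEvent("OnPlay");
        }
    }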
    This being an old post probably means the OP has already settled on something, but for posterity: first set up a representative test scenario for your use case, with a more expensive graph than you intend the final one to be, and see if you even need to "instance" inside one graph.
    And this next one is mostly speculation on my part, but it seems part of the issue with multiple instances of the same graph is that they want to access the same UAV resources on the GPU, leading to congestion. So 60 effects with only two or three playing at the same time could even be worse for performance in some cases if the effects are short-lived.

    Test test test :)
     
  23. paolo-rebelpug

    paolo-rebelpug

    Joined:
    Sep 27, 2021
    Posts:
    22
    Hey there,

    Sorry to jump in on this, but we are facing a similar situation. Hopefully someone can help us out :)

    Following on from what was said earlier about the smoke effects: in our game we have multiple structures that, when breaking, start releasing a smoke effect. Currently we are sending an event to a global VFX, passing the position of the requester so the smoke starts spawning in the proper position. Has anybody found a solution for stopping a precise one? Like keeping a reference of which running spawner belongs to which requester.

    Because our solution was going to be adding a single VFX to every structure in the game so that we could easily control it, but from what you are saying, we would be adding a lot of overhead.

    Thanks,
    Paolo
     
  24. koirat

    koirat

    Joined:
    Jul 7, 2012
    Posts:
    2,068
    I once tried this hack and created one big VFX for all the dust particles in my scene.
    It was a mistake; the problem is with sorting. If you have one big VFX, all the particles inside it will be sorted with each other, but you will have one big VFX bounding box that is sorted against the other VFX boxes in the scene.
    There is no sorting between particles inside different VFXs.

    So particles from far away might be drawn in front of particles right in front of your camera.
     
  25. Qriva

    Qriva

    Joined:
    Jun 30, 2019
    Posts:
    1,296
    Unless they are opaque or they write to depth.

    In any case, this thread is a bit outdated now with the recently introduced VFX instancing.
     
  26. JJRivers

    JJRivers

    Joined:
    Oct 16, 2018
    Posts:
    137
    As far as I'm aware, this should* be fixed in the current LTS version, which can instance effects. It's also possible today to do in-graph pseudo-instancing with the repeat blocks.

    *It is, but how effective it is I haven't tested personally, hence you should try it for yourself. Are you having performance issues due to multiple graphs, or is the graph simply heavy by itself? Lots of transparents are always hard to make performant.
     
  27. paolo-rebelpug

    paolo-rebelpug

    Joined:
    Sep 27, 2021
    Posts:
    22
    Thanks for the feedback!

    So one VFX is better than multiple for sorting and performance, and opaque is better than transparent for performance. We have no performance issues, but since we are doing a big refactor away from the old particle systems, we want to make sure it's efficient.

    What do you mean by pseudo-instancing? Because I need a way to tell a specific smoke to stop without stopping all the others.
     
  28. JJRivers

    JJRivers

    Joined:
    Oct 16, 2018
    Posts:
    137
    If I were you, I would first try to see if just having multiple graphs is OK performance-wise in some heavy, worst-case test setup (make sure the effect is opaque for that test).

    If that is not performant enough for you, there are multiple ways to "instance" a graph:
    • Send an event with a position argument and spawn a single particle at that position, which then uses GPU Events to spawn your actual effect by inheriting the position; then do what you would normally (this is enormously easier to achieve than the others, but has the limitation of a single event per frame).
    • Since the tools for the simple version are here now, use the Tile/Warp blocks and send positions as an argument buffer (this will make realtime control of the effect slightly harder).
    • Modulo the particle IDs and assign them in Initialize, then have them pick a position from an argument buffer on event.
    • You could modulo the particle IDs and assign each to a particular position from a texture manually, like I spent a week of my life on after all the debugging; this affords supreme control, since you make up the system and its internal logic.

      These do require you to implement some form of tracking of each position you're spawning stuff at, or bitmasking to kill effects that aren't active.

      Try the individual graphs first; you'll save a bunch of headaches if you don't hit perf limits with it. Just test it with a scenario that is your worst expected situation + 50% (because someone will paint 15 of them in some dank corner, trust me ;) )
     
  29. paolo-rebelpug

    paolo-rebelpug

    Joined:
    Sep 27, 2021
    Posts:
    22
    Thanks for the explanation; it seems really complex to implement. What I don't understand 100% is how to kill it afterwards. I get that you found a way to track whether it's active, but how do you tell it to stop?

    We ended up using the old particle system for those effects that need to be stopped and cannot be looped. For those that can easily be looped, we keep the VFX as a single burst with a short lifetime, so I manually trigger it every x amount of time; if I need to stop it, I just don't send the event anymore and it dissipates within the lifespan left (works for smoke).
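    In case it helps anyone, roughly what that retriggering looks like on the script side (simplified, untested sketch; the event name "OnBurst" and the interval are placeholders):
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.VFX;

    public class LoopingSmoke : MonoBehaviour
    {
        public VisualEffect visualEffect;
        public float retriggerInterval = 0.5f;   // the "every x time"
        public bool emitting;

        static readonly int burstEventID = Shader.PropertyToID("OnBurst"); // event wired to a short-lived single burst in the graph
        float timer;

        void Update()
        {
            if (!emitting) return;   // "stopping" = simply stop re-sending; live particles fade out over their remaining lifetime
            timer += Time.deltaTime;
            if (timer >= retriggerInterval)
            {
                timer -= retriggerInterval;
                visualEffect.SendEvent(burstEventID);
            }
        }
    }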
     
  30. tzxbbo

    tzxbbo

    Joined:
    Dec 14, 2019
    Posts:
    94
    On the spawn system there's also a Stop node; maybe connecting another event to it could be used as a switch for stopping, but I don't know, I just started learning VFX.
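    For what it's worth, if the Stop input of the spawn context is wired to a custom event (the name "StopSmoke" below is just an example), the script side would only need something like:
    Code (CSharp):
    // Assuming the spawn context's Start/Stop inputs are connected to
    // custom events named "StartSmoke" / "StopSmoke" in the graph:
    visualEffect.SendEvent("StartSmoke"); // begins spawning
    visualEffect.SendEvent("StopSmoke");  // stops that spawn system; existing particles live out their remaining lifetime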