Feedback Wanted: Visual Effect Graph

Discussion in 'Visual Effect Graph' started by ThomasVFX, Oct 21, 2018.

Thread Status:
Not open for further replies.
  1. Olmi

    Olmi

    Joined:
    Nov 29, 2012
    Posts:
    1,553
    @pbritton I'm not sure if it's the right way to do it, as there are no manual pages (or details) for this stuff, but once, by accident, I tried a few spawn blocks, and when I added a Constant Spawn Rate and a Loop and Delay block, I got a particle system that lasts for a certain amount of time and does not repeat.

    Set the loop count to 1, duration to what you need and delay to 0 to start it right away. That will give you a particle system that lives for a period of time.

    I guess that second block (Loop and Delay) acts as some kind of modifier, as it does not work by itself? Correct me if I'm wrong.
     
  2. Kazko

    Kazko

    Joined:
    Apr 2, 2014
    Posts:
    82
    I am interested in this info as well. Btw, I think the FPS Sample project might have something along the lines of what you're looking for (projectile impact hits). I would, however, like to know if this is possible for looping systems. Right now I have several Visual Effect instances, and each instance brings the frame rate down a bit (on Android). This drop is not related to particle count: one system with thousands of particles renders/calculates faster than five systems with 10 particles each.

    I would like to know if there is a way of using one instance that somehow manages isolated particle outputs, each with its own properties (for example, being able to adjust particle count per output). Maybe in the future?
     
  3. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Hi, I have positions, colors and normals being read from a buffer into the VFX graph, and rendered as lit quads which face the camera position. However, I would like to point the normals of the quads in the direction fetched from my normals attribute map, and currently cannot find a way to do this. Basically, I want to light the particles based on my world space normals, not based on their orientation or a normal map that they might have on them. Is there any way of doing this?
     
  4. pbritton

    pbritton

    Joined:
    Nov 14, 2016
    Posts:
    160
    I did some more digging based on what you referenced and stumbled upon a thread explaining that selecting the Spawn block gives you options in the inspector. From there I changed the duration and the loop count from infinite to constant. That did the trick.
     
  5. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    569
    In the latest version of VFX Graph you can send particle parameters into a shader graph.
     
  6. alexandredizeux

    alexandredizeux

    Joined:
    Sep 3, 2019
    Posts:
    6
    Will it be possible to load custom vector fields into VFX Graph at some point? At the moment the only solution seems to be a paid add-on like VectorayGen or MegaFlow.
    Thanks
    Alex
     
  7. Olmi

    Olmi

    Joined:
    Nov 29, 2012
    Posts:
    1,553
    @alexandredizeux isn't it more a question of not having a tool that generates vector fields? You can use any Texture3D as an input, and you are good to go.

    For example, create a compute shader that renders a vector field and use the resulting RenderTexture as the input for a VFX Graph Texture3D node.

    But of course, if you want fancy vector fields, then it's more a question of having the simulation available so that you can render something out of it...
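
    A minimal C# sketch of that idea (the exposed property name "VectorField", the kernel name "CSMain", and the compute shader asset are hypothetical placeholders; assumes the shader writes into a RWTexture3D<float4> named "Result"):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;
    using UnityEngine.VFX;

    public class VectorFieldBinder : MonoBehaviour
    {
        public ComputeShader vectorFieldCS; // hypothetical shader with a RWTexture3D<float4> "Result"
        public VisualEffect vfx;
        public int size = 32;

        RenderTexture rt;

        void Start()
        {
            // A 3D render texture the compute shader can write into.
            rt = new RenderTexture(size, size, 0, RenderTextureFormat.ARGBHalf);
            rt.dimension = TextureDimension.Tex3D;
            rt.volumeDepth = size;
            rt.enableRandomWrite = true;
            rt.Create();
        }

        void Update()
        {
            int kernel = vectorFieldCS.FindKernel("CSMain");
            vectorFieldCS.SetTexture(kernel, "Result", rt);
            vectorFieldCS.Dispatch(kernel, size / 8, size / 8, size / 8);
            // "VectorField" must match an exposed Texture3D property in the graph.
            vfx.SetTexture("VectorField", rt);
        }
    }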
     
    Last edited: Oct 24, 2019
  8. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    I'm thinking they probably want a free importer for existing file formats such as .FGA?

    Unity has so far resisted supporting that format themselves, and instead wrote an importer (to Texture3D) for .vf format files, and a .vf exporter for Houdini.

    .vf format:

    https://github.com/peeweek/VectorFieldFile

    This has the houdini exporter in it:

    https://github.com/Unity-Technologies/VFXToolbox/tree/master

    Someone did post some code for turning .FGA into a Texture3D a year ago, but I've never tried it, and you'd need at least a little scripting knowledge to get assets in and out of it: https://realtimevfx.com/t/fga-to-3d-texture-question/6305
     
  9. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    995
    I tried this but couldn't get it to work.
     
  10. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    So it appears to only allow a single float? And how do I use these in Shader Graph, since the VFX Shader Graph shader doesn't have normal inputs? Or am I forced into writing HDRP shaders by hand, which is basically hell on earth to maintain?

    Can we write custom blocks yet? Because a "Set Normal From Map" block is basically what I really want.
     
  11. Olmi

    Olmi

    Joined:
    Nov 29, 2012
    Posts:
    1,553
    @elbows interesting! Thanks for those links, I haven't had time to dig into those GitHub repos etc. that much.
     
  12. JulienF_Unity

    JulienF_Unity

    Unity Technologies

    Joined:
    Dec 17, 2015
    Posts:
    326
    @jbooth if you don't use pivot.z, angle.z, or scale.z in your particle system, you can use the axisZ attribute to set the direction of the normal (probably inverted).

    Otherwise, as said above, you can expose a vector in Shader Graph and pass your normal from VFX Graph into Shader Graph (from 2019.3).
     
    Last edited: Oct 25, 2019
  13. alexandredizeux

    alexandredizeux

    Joined:
    Sep 3, 2019
    Posts:
    6
    Yeah, I'm using the GameDev ROP Vector Field in Houdini to export an .fga vector field, but I can't (or don't know how to) use it in the VFX Graph editor :/
     
  14. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    Well, I'm not a Houdini user at the moment, but my understanding is that you should be able to use the tool I mentioned earlier to export .vf files instead of .fga files. Unity/VFX Graph can then import these and turn them into Texture3D assets that can be used with VFX Graph.

    https://github.com/Unity-Technologies/VFXToolbox/tree/master/DCC-Tools~/Houdini
     
  15. konsic

    konsic

    Joined:
    Oct 19, 2015
    Posts:
    995
  16. alexandredizeux

    alexandredizeux

    Joined:
    Sep 3, 2019
    Posts:
    6
    As this is for data science visualization, I finally moved to Unreal and made it work in 10 minutes following this.
     
  17. sergiusz308

    sergiusz308

    Joined:
    Aug 23, 2016
    Posts:
    235
    Why is my question about soft particles not working being ignored? I provided information on how to replicate it. After the recent update to 2019.2.10 it still does not work: particles disappear regardless of the parameters provided.
     
  18. JulienF_Unity

    JulienF_Unity

    Unity Technologies

    Joined:
    Dec 17, 2015
    Posts:
    326
    Because we haven't been able to reproduce it. We need a clear repro case. Can you file a bug with a project reproducing the issue and we'll take a look?
     
  19. JulienF_Unity

    JulienF_Unity

    Unity Technologies

    Joined:
    Dec 17, 2015
    Posts:
    326
    Olmi likes this.
  20. alexandredizeux

    alexandredizeux

    Joined:
    Sep 3, 2019
    Posts:
    6
    I've tried to import a .vf vector field from Houdini but couldn't figure out how to use it in the VFX Graph editor.
     
  21. thelebaron

    thelebaron

    Joined:
    Jun 2, 2013
    Posts:
    857
    Is this expected behaviour?

    In 7.1.2 (with HDRP), trying to use VisualEffect.SetVector3, it appears I can't access the property by its name in the graph; I have to use the name that shows up in the debug inspector.

    So my graph property is:






    But in the script the top line errors out and the bottom name works; I only found it by enabling debug mode on the inspector (below).

     
    Last edited: Oct 25, 2019
  22. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    Does it still do that if you give the exposed property a different name? Maybe "position" is a reserved word that causes the problem? Just a thought; I can't check right now.
     
  23. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    OK, I had a quick go on a later 7.x branch and the behaviour is the same, but the answer is that it depends on the type of the exposed property. If you create a property of type Vector3 this won't happen, but if it's of type Position or Direction (and possibly some others), it will.

    The normal inspector will also give a strong clue when this is happening, because it will show position, direction, or whatever next to the values and checkbox, instead of these just being next to the property name. E.g. in this example the property named Position is of type Vector3 and does not have the issue you mentioned, while the fish-related ones do; here is how they look.

    Screenshot 2019-10-26 at 00.23.24.png
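
    A quick sketch of the difference in script (the property names and the exact "_position" suffix here are just what the debug inspector showed in my case, so check yours there):

    Code (CSharp):
    // Property of type Vector3, named "Position" in the graph: the plain name works.
    vfx.SetVector3("Position", new Vector3(0f, 1f, 0f));

    // Property of compound type Position, named "FishTarget" in the graph:
    // use the suffixed internal name shown in the debug inspector.
    vfx.SetVector3("FishTarget_position", new Vector3(0f, 1f, 0f));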
     
    thelebaron likes this.
  24. thelebaron

    thelebaron

    Joined:
    Jun 2, 2013
    Posts:
    857
    Interesting find (and thanks for looking into it). I don't suppose someone official could explain why the naming is a little funky for these specific types?
     
  25. thelebaron

    thelebaron

    Joined:
    Jun 2, 2013
    Posts:
    857
    Also, I can't seem to find TriggerEvent or GPUEvent (I downloaded the example from a few pages back on trails). Are these under a new name or something?

     
  26. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    In the main Unity preferences there is a Visual Effects section; you need to go there and enable the Experimental Operators/Blocks option.
     
    thelebaron likes this.
  27. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    General feedback from my first week of using this:

    - Not being able to collapse the properties window is really annoying, as it gets huge quickly, and its scaling doesn't work right.
    - I didn't realize that you can double click on a node to collapse it down really small. I was instead using the open/close dialog, which doesn't save enough space to feel useful to me.
    - I'm really glad you didn't choose the "Values extend out of the nodes to the left" model for user entered values that the shader graph uses, as that wastes a ton of space. Space is extremely tight on this UI when running on a laptop.
    - Comments above nodes would be super useful
    - I feel like there are lots of things in the block area that should be part of the graph. For instance, SetColorFromAttributeMap would be better as a "GetValueFromAttributeMap" node which gets plugged into a SetColor block. This would seem more logical as it works for any type of data, not just known types. It would also reduce the node count when searching for nodes, as you don't need all those variations of blocks anymore.
    - I personally prefer more dense nodes with options than lots of individual nodes. For instance, A Set Block instead of SetColor, SetPosition, etc. This is more personal preference to make the node list more manageable.
    - Having the classes be internal means it's really hard to add custom nodes and blocks. You can do it, either by modifying the source or by hacking the assemblies to give yourself access (this is what all asset store authors have to do to work with Shader Graph now).
    - Having external shaders be directly tied to shader graph shaders means another blow to being able to write custom shaders in Unity without going through a graph.
    - OMG, some documentation and its own forum would be nice.

    Overall I really like it, though it's heavier on blocks versus nodes than I would have expected. I would have thought that blocks would only have inputs for each stage (init, update, render) rather than full-on effects like turbulence. Pushed out into the nodes it would be more verbose, but far more flexible: something like turbulence is currently applied entirely in the update block to modify the position, but broken out into nodes it could be used to modify arbitrary data and be plugged into any output (normals for turbulent lighting, etc.).
     
    hippocoder and andybak like this.
  28. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    If I understood you correctly, this thing is called the Blackboard, and there is a Blackboard toggle button near top right of the VFX Graph editor window which will make it disappear/reappear.
     
  29. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    jbooth likes this.
  30. DylanF

    DylanF

    Joined:
    Jun 25, 2013
    Posts:
    55
    Many missile trails possible?

    Is it possible to use a vfx graph for sustained emissions from multiple positions? I'm thinking of something like missile smoke trails, without instantiating a vfx graph instance for each one since there could be hundreds.

    My hope was that this graph and c# code would work, but it's actually not emitting anything. This is on 2019.2.10f1 with VFX Graph 6.9.2.

    Code (CSharp):
    var x = vfx.CreateVFXEventAttribute();
    x.SetVector3("position", new Vector3(3f, 3f, 3f));
    vfx.SendEvent("BeginSpawning", x);

    var y = vfx.CreateVFXEventAttribute();
    y.SetVector3("position", new Vector3(3f, 3f, -3f));
    vfx.SendEvent("BeginSpawning", y);

    while (true)
    {
        x.SetVector3("position", new Vector3(Random.Range(-3f, 3f), Random.Range(-3f, 3f), Random.Range(-3f, 3f)));
        y.SetVector3("position", new Vector3(Random.Range(-3f, 3f), Random.Range(-3f, 3f), Random.Range(-3f, 3f)));
        yield return null;
    }
    missileSmoke_VFXGraph.jpg
     
    Last edited: Oct 29, 2019
  31. alloystorm

    alloystorm

    Joined:
    Jul 25, 2019
    Posts:
    88
    Is it possible to set particle initial positions in between frames?

    What I mean is, when the spawner is moving very fast, you can clearly see the spawn shape of the particles for each frame. It would be nice if the spawner could interpolate spawn positions between frames to create a smooth transition.

    For example, say you spawn particles in a sphere shape. When you move the spawner from position A to B in one frame, you can clearly see particles in two sphere shapes. It would be a lot more realistic if the particles could spread evenly across the space between A and B.
     
  32. alloystorm

    alloystorm

    Joined:
    Jul 25, 2019
    Posts:
    88
    Right after posting this I got an idea.

    You can achieve this with the "Add Position Random" block and pass in the velocity each frame.

    Screen Shot 2019-10-29 at 11.51.33 am.png

    And in your game object add a script that does this:
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.VFX;

    public class SetVelocity : MonoBehaviour
    {
        public VisualEffect vfx;
        internal Vector3 pos = Vector3.zero;

        void Start()
        {
            pos = transform.position;
        }

        void Update()
        {
            vfx.SetVector3("Velocity", pos - transform.position);
            pos = transform.position;
        }
    }
     
    Last edited: Oct 29, 2019
  33. alloystorm

    alloystorm

    Joined:
    Jul 25, 2019
    Posts:
    88
    Another question: is there any way to recursively call a system?

    Say I want to spawn two new particles, each with a random velocity, every time a particle dies, and I want to repeat the process a random number of times.

    This should create a cool spark effect.
     
  34. Deive_Ex

    Deive_Ex

    Joined:
    Dec 30, 2014
    Posts:
    25
    Hey, I have some feedback!

    I've been messing with the VFX Graph for a week now and these are my immediate concerns.

    1. Although VFX Graph and Shader Graph are different tools, I don't see any consistency between the nodes of the two systems.
    From my perspective they should look and work in similar ways, since they both use the same "graph" interface. What I mean is that while they do different things, nodes like "Time" and "Multiply", which have basically the same functionality in both systems, have totally different looks. Since they're both tools inside the Unity Editor, they should look similar, so that a person who's been using Shader Graph and decides to use VFX Graph (or vice versa) won't feel lost when searching for and using basic nodes (I know I felt lost when I jumped from Shader Graph to VFX Graph).

    Shader Graph:
    VFX Graph:

    2. When searching for nodes, the number of nodes shown is absurd! And many of the options are the EXACT same node, but with a different configuration!
    For example, the "Set Position" block and the "Add Position" block are exactly the same, just with the "composition" option in the inspector changed! Either add global "Set" and "Add" nodes and let the user choose the attribute through the inspector, or show only a "Set" node and let the user choose which attribute/composition will be used. Showing so many variations of the same node/block is no good.
    ALSO, it honestly took me a while to realize that I could change these options through the inspector...

    3. The Blackboard fields don't resize very well with the Blackboard.
    In Shader Graph the fields resize very nicely, but in VFX Graph they seem to have a minimum width, and setting the Blackboard to a smaller width creates a horizontal scroll bar... I get that some fields need more space, but even a simple bool/float seems to need a huge amount of space...

    4. Make Blackboard parameters exposed by default?
    This might be a little personal, but... since we already have inline parameters, I don't really see much reason for Blackboard values NOT to be exposed by default. Yes, people can use the Blackboard to organize their variables instead of finding them in the middle of the graph, but the way I see things, inline values are like constants, and Blackboard values are values you want to play with/modify. That means they'll probably need to be exposed anyway. It's just annoying having to check the "exposed" box every single time I create a new variable.

    5. Change the axis orientation of certain blocks, like the "Position (Circle)" block.
    This option might already exist, or maybe there's a way to calculate it, but I couldn't change the orientation of these blocks without rotating the object itself. I wanted to make a spawn circle that faced upwards, but it always faces the Z axis. This is bad because if I wanted to make two spawn circles facing two different directions, I wouldn't be able to use the same graph.

    6. The box that lets you rename systems is weirdly placed below the blocks.
    The blue-outlined box. It could be clearer and easier to click (maybe like the "path" input in Shader Graph's Blackboard?)
    upload_2019-10-29_14-41-29.png

    7. Add an "Expand/Collapse all" button to the Blackboard.
    When you work with lots of parameters, it can be pretty annoying to find a certain one when all of them are expanded.

    Well, I guess that's about it for now.
     
    andybak likes this.
  35. Loumiskme

    Loumiskme

    Joined:
    Feb 17, 2019
    Posts:
    2
    I'm facing the same issue here. The position shape doesn't seem to follow the orientation of the shape when set in a "World" context.
    upload_2019-10-31_17-8-43.png
     
  36. bjedlowski97

    bjedlowski97

    Joined:
    Feb 13, 2018
    Posts:
    46
    Is it possible to get one particle system in VFX Graph to be drawn towards another system's particles, or is there any way at all to get particles drawn towards other particles?
     
  37. Olmi

    Olmi

    Joined:
    Nov 29, 2012
    Posts:
    1,553
    @bjedlowski97 If you are looking for interaction between individual particles, I don't think it's possible at the moment for each particle to move towards a particle in another system. But you could maybe move a target close to where your particles are and then make the other particles go towards it.
     
  38. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Run both simulations in the graph, take the current position of each, and lerp them before going to the output block.
     
  39. bjedlowski97

    bjedlowski97

    Joined:
    Feb 13, 2018
    Posts:
    46
    And how do I get the current positions of each? I'm still kind of new to the VFX graph.
     
  40. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    A Get Position node set to Current.
     
  41. bjedlowski97

    bjedlowski97

    Joined:
    Feb 13, 2018
    Posts:
    46
    OK, thank you. I thought it might be that but wasn't sure, since I thought it would only reference one of the systems.
     
  42. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    Yeah, come to think of it, I don't think it's going to work, because each particle is processed in parallel on the GPU, so even if you put two systems into one graph I think you'll only have access to the current particle's state. This is more a limitation of the graph interface: you could write a system to do this, but I suspect allowing it would make the graph significantly more complex and less intuitive, because the user would have to define the attributes and available state instead of having them managed by the system.
     
  43. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    569
    And slower by a significant factor. Particle-to-particle interaction surely requires a quadratic increase in processing: every particle potentially has to query every other particle!

    There are ways to have some limited inter-particle interaction. Check out https://github.com/thinksquirrel/fluviofx
     
  44. Olmi

    Olmi

    Joined:
    Nov 29, 2012
    Posts:
    1,553
    It's still possible to do some (very limited) attraction-style effects by storing the position in some value that can be inherited by a system connected with a GPU event. I made a quick test to demonstrate this: invisible system A particles are moving, and their positions are inherited by system B via a GPU event every frame. The position of each system B particle is then interpolated towards the position inherited from system A.

    You can create some sorts of effects with this kind of hacky solution, but it's not easy to get fluid motion where the particles float towards the other particles.

    I think it's better to write a simple n-body style simulation in a compute shader and then use VFX Graph to render it. Even without any kind of acceleration structure you can easily move 500k to 1M particles that test against each other...

     
    Last edited: Nov 3, 2019
    Lars-Steenhoff and jbooth like this.
  45. jbooth

    jbooth

    Joined:
    Jan 6, 2014
    Posts:
    5,461
    No, what I was talking about was not accessing random particles of another system; rather, computing position values for two sets of particles and then interpolating between them. I do something similar in some stuff I'm currently working on, where a set of points is stored in a texture; I read the positions from two of those textures and interpolate between them.
     
    andybak likes this.
  46. bjedlowski97

    bjedlowski97

    Joined:
    Feb 13, 2018
    Posts:
    46
    And what did you do to achieve this, exactly? I am still trying to teach myself what I can about the VFX graph. I may have an idea of how to do this, but I'm not sure it will actually work.
     
  47. Elliott-Mitchell

    Elliott-Mitchell

    Joined:
    Oct 8, 2015
    Posts:
    88
    Is there a way to dynamically update the capacity at runtime? Do we have to mess with Set Lifetime, or is there a way to override capacity?
     
  48. protoben

    protoben

    Joined:
    Nov 11, 2013
    Posts:
    39
    Is there a way to alter existing particles as they pass through a defined area, while keeping some of their normal traits? I can make a Kill block to kill particles as they pass through an area, and I can make a vector field to alter their flight path, but is there any way to change the particles themselves instead? For example, I would like any particle that passes through a certain area to change color and size.

    My current hack is to kill the particles and then spawn-from-death new particles with almost the same traits, but that is a bit time consuming.
     
  49. Olmi

    Olmi

    Joined:
    Nov 29, 2012
    Posts:
    1,553
    @protoben you could define an area yourself using branching. Test whether a particle's position (or some other value) is within your desired limits, and then do something based on that information.

    You can pull off quite a few different tricks if you combine a few branching operations; you could test for various geometric shapes by defining them with math.

    Then you could do anything you want: add velocity, change direction, adjust size, color, and so on.

    Here's a small example:

    branching_1.PNG

    Graph (connected to a Set Color block)
    branching_2.PNG
     
  50. sergiusz308

    sergiusz308

    Joined:
    Aug 23, 2016
    Posts:
    235