Feedback Wanted: Visual Effect Graph

Discussion in 'Graphics Experimental Previews' started by ThomasVFX, Oct 21, 2018.

  1. moondust-games

    moondust-games

    Joined:
    Jul 21, 2013
    Posts:
    20
    I'm having a problem with the Loop and Delay block. In the Editor it works as expected, and the first time I build the app for iOS/iPhone it works. But on subsequent builds, with no changes made to the VFX Graph, the Loop block seems to be ignored and I get a constant stream of particles. I did manage to reset it once (I can't remember how; I think it was by changing the product name for a new app), but it happens every time from the second build onwards.
     
  2. moondust-games

    moondust-games

    Joined:
    Jul 21, 2013
    Posts:
    20
    Another problem I noticed, which might be device-related: when the Quad Output has sorting set to Auto or On, the app crashes the moment the VFX starts on an older iPad mini 4, but not on my iPhone 7. If I set Particle Options > Sort to Off, it doesn't crash. Both devices have the latest iOS installed as of today. Using Metal. Let me know if any other info would help. Took me hours to track this one down.

    edit: using LWRP for both situations above
     
  3. protoben

    protoben

    Joined:
    Nov 11, 2013
    Posts:
    34
    Thanks Andy. Do you have any more information on what you mean by switching them on and off with conditionals?

    I haven't found much in the way of controlling VFX graphs via scripted events except for controlling Parameter values and starting and stopping whole Systems.

    Are you saying there is a way to put an "if this" statement between the Update and the Output block? Any additional information would be appreciated.

    I know there are all the Logic Operator nodes, but I also haven't figured out a single block to connect them to in order to help control the system. Perhaps I just haven't yet figured out what to connect purple lines to.
     
  4. LennartJohansen

    LennartJohansen

    Joined:
    Dec 1, 2014
    Posts:
    2,292
    Even if you could assign a mesh per particle, it would increase draw calls a lot. The particle system uses instanced-indirect rendering internally. That allows rendering a lot of instances of the same mesh in a single draw call; with multiple meshes, the system would have to prepare multiple lists of instances and draw each list with its own draw call. Distance sorting would also be hard to do, since all particles using a single mesh would draw at the same time.
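    On the C# side, single-mesh instanced-indirect drawing looks roughly like this (a hedged sketch of the general technique, not the VFX Graph's actual internals; the class and counts are illustrative, and the material's shader must fetch per-instance data from a GPU buffer itself):

```csharp
using UnityEngine;

// Sketch of instanced-indirect drawing: one mesh, one material, one draw call
// for any number of instances. This illustrates why adding a second mesh
// would mean a second args buffer and a second draw call.
public class InstancedIndirectSketch : MonoBehaviour
{
    public Mesh mesh;            // the single mesh shared by all instances
    public Material material;    // shader must read per-instance data itself
    public int instanceCount = 100000;

    ComputeBuffer argsBuffer;

    void Start()
    {
        // Indirect args layout: index count per instance, instance count,
        // start index, base vertex, start instance.
        uint[] args = { mesh.GetIndexCount(0), (uint)instanceCount, 0, 0, 0 };
        argsBuffer = new ComputeBuffer(1, args.Length * sizeof(uint),
                                       ComputeBufferType.IndirectArguments);
        argsBuffer.SetData(args);
    }

    void Update()
    {
        // One call renders every instance; positions live in a GPU buffer,
        // not in individual transforms.
        Graphics.DrawMeshInstancedIndirect(mesh, 0, material,
            new Bounds(Vector3.zero, Vector3.one * 100f), argsBuffer);
    }

    void OnDestroy() { if (argsBuffer != null) argsBuffer.Release(); }
}
```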
     
  5. timpernagel

    timpernagel

    Joined:
    Feb 5, 2016
    Posts:
    3
    What is the best practice for passing, e.g., a buffer of arbitrary 3D positions from outside the graph to a block? Is there maybe a sample project where this has been done, or can someone provide a code snippet for it?

    And one fundamental question regarding this thread: is this thread the best place to ask these kinds of questions at the moment? I have the feeling that many important answers will get lost in the stream of posts.
     
  6. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    161
    Usually by packing the xyz into the rgb of a texture. I've got a few examples of this on my GitHub: https://github.com/IxxyXR. Text VFX or Voxel VFX are probably worth looking at. Text VFX is simpler, but Voxel VFX scales better to lots of points.
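    The packing described above can be sketched on the CPU like this (a minimal illustration; `PositionBaker` is a made-up name, and for large point counts you would do this in a compute shader as in the linked repos):

```csharp
using UnityEngine;

// Minimal CPU-side sketch: bake an array of world positions into the RGB
// channels of a float texture, which a VFX Graph can read back with a
// Sample Texture 2D operator (position = pixel.rgb).
public static class PositionBaker
{
    public static Texture2D Bake(Vector3[] points)
    {
        // Square texture just big enough to hold one point per pixel.
        int size = Mathf.CeilToInt(Mathf.Sqrt(points.Length));
        var tex = new Texture2D(size, size, TextureFormat.RGBAFloat, false);
        tex.filterMode = FilterMode.Point; // no interpolation between points

        var pixels = new Color[size * size];
        for (int i = 0; i < points.Length; i++)
            pixels[i] = new Color(points[i].x, points[i].y, points[i].z, 1f);

        tex.SetPixels(pixels);
        tex.Apply();
        return tex;
    }
}
```

    In the graph, a Sample Texture 2D operator indexed by particle ID then recovers each position from the pixel's rgb.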

    Also, predictably, Keijiro has done some amazing work in this area: https://github.com/keijiro (although his are slightly more complex examples to learn from, as he pairs the VFX Graph with compute shaders to speed up some transforms on the data before packing it into a texture).

    I agree, but when I raised this, one of the moderators replied that long threads like this were better than separate short posts. I have no idea why.
     
    timpernagel likes this.
  7. AnxoMobgen

    AnxoMobgen

    Joined:
    Jan 30, 2017
    Posts:
    1
    I have a question about the Sequential 3D (Math operator), because I am trying to create a perfect voxelized model using particleID or a pCache file, but in the end I only get a cube with the particle repeated, or the irregular shape of the model repeated a few times.

    Is there a node for connecting to or extracting the right information?

    In case what I am saying is not clear, I am adding the exact example of what I am talking about (at second 23).



    Thank you so much for help and attention.

    Cheers.
     
  8. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    161
    Trying to remember how I did it. There's a nice, simple example of conditionals working in this post: https://forum.unity.com/threads/feedback-wanted-visual-effect-graph.572110/page-7#post-4144600

    I think I did it without conditionals - just two outputs, each with a Set Position fed by slightly different calculations.
     
  9. moondust-games

    moondust-games

    Joined:
    Jul 21, 2013
    Posts:
    20
    Using the colour picker tool on a gradient point in VFX Graph crashes Unity. Not the first time, but after a few uses.

    Edit: I think this only happens with the Editor paused... maybe not a good idea to edit VFX in this situation?
     
    Last edited: Jun 18, 2019
  10. protoben

    protoben

    Joined:
    Nov 11, 2013
    Posts:
    34
    Is there still an active Discord server for VFX Graph discussions? The link posted in this thread previously has expired.

    Thanks
     
  11. VladVNeykov

    VladVNeykov

    Unity Technologies

    Joined:
    Sep 16, 2016
    Posts:
    39
    Atan2, Atan, Asin, and Acos nodes are being worked on
    SubGraphs are implemented and in the process of being merged (if you are on 19.2, they should be available in the 6.8 package, or in latest master on the Git SRP repo)
    We don't have out-of-the-box custom nodes or current camera matrices at this point.

    @protoben @andybak Correct, N meshes in the same output is not doable at the moment.

    We had a bug with delay, maybe you were running into this:
    https://issuetracker.unity3d.com/issues/spawner-delay-in-bursts-no-longer-works

    Unfortunately GPU sorting has a bunch of known issues on OSX and iOS, from particles not appearing, to flickering, and the crash sounds related: https://issuetracker.unity3d.com/is...-results-when-sorting-is-set-to-auto-slash-on

    To chime in, you can achieve what @andybak is suggesting by putting a Set Alive block on any output and controlling via your logic which outputs (or "meshes", since you are faking multiple meshes) are shown and which are hidden. This basically allows you to selectively turn rendering on/off for any output.
     
  12. protoben

    protoben

    Joined:
    Nov 11, 2013
    Posts:
    34
    Thanks Vlad. I’ll try that tonight.

    New question: is there any way to get an active readout of the value in a field in a node or block?

    Perhaps a type of node whose only job is to actively display the value of a parameter?

    Alternatively, is there a way to refresh the values in a block so I can see the current or live value?
     
  13. 3d_Artist1987

    3d_Artist1987

    Joined:
    Jul 9, 2012
    Posts:
    710
    Hi,

    How to generate position from texture?

    Thanks
     
  14. timpernagel

    timpernagel

    Joined:
    Feb 5, 2016
    Posts:
    3
    Check out this repo; the graph includes a solution for this task:
    https://github.com/keijiro/Smrvfx
     
  15. VladVNeykov

    VladVNeykov

    Unity Technologies

    Joined:
    Sep 16, 2016
    Posts:
    39
    Currently, no, but there are plans for some debug options to provide more feedback in the graph. In the meantime, you can identify the value you want (e.g. the particle's velocity), output it as a color, and either eyeball it (if you need a rough idea) or get a frame capture and see the actual value in another program.
     
  16. keenanwoodall

    keenanwoodall

    Joined:
    May 30, 2014
    Posts:
    560
    I'm trying to assign a custom depth buffer to a camera parameter, but whenever I do, an error is thrown:

    ArgumentException: Object of type 'UnityEngine.CustomRenderTexture' cannot be converted to type 'UnityEngine.Texture2D'.

    Any ideas?

    upload_2019-6-20_14-40-20.png
     
  17. shadewing23

    shadewing23

    Joined:
    Jan 30, 2019
    Posts:
    6
    Is there a way to make Orbital Velocity in VEG? I tried it with Velocity (Tangent) and it didn't come out like I expected.
     
  18. francois85

    francois85

    Joined:
    Aug 11, 2015
    Posts:
    644
    First of all, amazing work by the Unity dev team!! This tool is super awesome; I can't wait to see what you guys add next.

    My question is this: I'm trying to find a way to represent hair in the effect graph. Is there a way to bring XGen .xpd files into the effect graph? If not, I'm open to any suggestions.
     
  19. 3d_Artist1987

    3d_Artist1987

    Joined:
    Jul 9, 2012
    Posts:
    710
  20. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    161
    I found Velocity Tangent a bit strange also but I didn't spend enough time on it to figure out if I was the problem or not. I'll post my experiment when I get a chance.

    Before I realised there was a specific node I tried to approximate orbital velocity using cross products with some success: https://forum.unity.com/threads/feedback-wanted-visual-effect-graph.572110/page-9#post-4344808
     
  21. francois85

    francois85

    Joined:
    Aug 11, 2015
    Posts:
    644
    Thanks for the reply, very cool example. I couldn't find a solution to my .xpd import question; could you point me to it?
     
    Last edited: Jun 21, 2019
  22. francois85

    francois85

    Joined:
    Aug 11, 2015
    Posts:
    644
    Anyone know what this error is?

    Shader error in 'Hidden/VFX/System 2/Lit Quad Output': 'InitBuiltinData': cannot convert from 'const struct PositionInputs' to 'float' at /VFXGraph/Library/PackageCache/com.unity.visualeffectgraph@5.13.0-preview/Shaders/RenderPipeline/HDRP/VFXLit.cginc(62) (on d3d11)

    Compiling Vertex program
    Platform defines: UNITY_ENABLE_REFLECTION_BUFFERS UNITY_USE_DITHER_MASK_FOR_ALPHABLENDED_SHADOWS UNITY_PBS_USE_BRDF1 UNITY_SPECCUBE_BOX_PROJECTION UNITY_SPECCUBE_BLENDING UNITY_ENABLE_DETAIL_NORMALMAP SHADER_API_DESKTOP UNITY_LIGHT_PROBE_PROXY_VOLUME UNITY_LIGHTMAP_FULL_HDR
     
  23. keromonkey

    keromonkey

    Joined:
    Jan 16, 2019
    Posts:
    12
    Hello, I'm working on creating particle grass (since the Unity terrain's built-in grass shader is not yet functional). It strikes me that the demo is built using a pCache and a height map texture.

    I wanted to request a Visual Effect Graph node that can get pixels/values from a texture. The idea is to use it as input for the number of grass particles needed (texture height x texture width), as well as the height to place the grass at, taken from the individual pixels themselves.
    [And tutorial us plz X'D]

    Another consideration might be a terrain node that can grab a range of values from a drag-and-dropped terrain (or even programmatically), for ease of insertion into a VFX particle system.

    Even though many things aren't functioning up to par yet (who am I to talk, as a noob), great job on the updates being made. I appreciate the sheer power and versatility this Visual Effect Graph seems capable of once the tools are mastered.

     
    Last edited: Jun 23, 2019
  24. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    161
    Unless I misunderstand you, we already have that with "Sample Texture 2D". It's how I'm passing in arrays of values in several of my VFX experiments. For example: https://github.com/IxxyXR/VoxelVFX

    Also check out a lot of Keijiro's recent stuff on GitHub.
     
  25. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    161
    @VladVNeykov (Hi by the way - we met at the Fusebox a few weeks ago!)

    What's the performance like when using this technique?

    i.e. if you spawned a lot of particles only to set Alive=false at some point in the pipeline, how much of the rendering cost do you pay? Does it make a big difference in which context you put your "Set Alive" block?

    And is "Set alive" the same as using a Kill block?
     
  26. shadewing23

    shadewing23

    Joined:
    Jan 30, 2019
    Posts:
    6
    I did try yours too, but it won't work for some reason.
     
  27. Takamba

    Takamba

    Joined:
    Jan 11, 2018
    Posts:
    24
    Hi

    I would like to spawn 50,000 particles on a point cache, then wait for a certain amount of time.
    After this delay, the particles should be attracted by a signed distance field.

    How can I realise such a delay?

    Any hint is much appreciated.
     
  28. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    161
    My first thought would be to use a tweaked time value as the blend between the SDF and a zero vector. Does that make enough sense?

    EDIT - just realised that "Conform to SDF" is a block rather than a node, so you'd have to do some trickery with Get and Set Position as well. Damn that distinction between nodes and blocks!
     
  29. Takamba

    Takamba

    Joined:
    Jan 11, 2018
    Posts:
    24
    I thought I could just compare TotalTime to a delay with a Compare node and drive the attraction force of the Conform to SDF block, but that doesn't work.
    With an attraction force of zero, the particles don't even spawn.
    Compare.JPG
     
  30. NoizedropBP

    NoizedropBP

    Joined:
    Mar 13, 2015
    Posts:
    5
    Honestly, I dislike the VFX Graph at this point... I can now make beautiful swarm particles... but when I'm trying to build serious VFX for games, I'm constantly stuck on some bullshit.

    It is way too much effort for a simple system, and way too much working around to get the desired results. I currently can't get a simple sparks system running in world space, as nothing spawns as soon as I switch from local to world space. I'm sure it works somehow... but why does something as simple as this need workarounds to function properly?

    There is not enough documentation on the VFX Graph to make it the mandatory thing for the HD Render Pipeline.

    The usability, and especially the intuitiveness, of the VFX Graph is currently absolutely horrendous.
     
    Takamba likes this.
  31. DuvE

    DuvE

    Joined:
    May 22, 2016
    Posts:
    109
    I have two questions:

    1. Is there a way to do something like sub-emitters? For example, a simple spawn of particles on particle death?
    2. I don't quite get how to change spaces, Local to World for example. I didn't find transform matrices, and the "World To Local" nodes don't work, or at least I don't know how to apply them properly.
     
  32. JackDDeane

    JackDDeane

    Joined:
    Aug 6, 2018
    Posts:
    19
    1. Yes, you can spawn particles in this fashion, but first you must enable experimental nodes, which I believe you do in Preferences > Visual Effects by ticking the box next to 'Experimental Operators/Blocks'. Then you add a GPU Event in your Update context (in your case, 'Trigger Event On Die') and plug its output into a GPU Event spawn context, as shown in the documentation.

    2. Local and world space can be changed either on single blocks or on a whole context. For example, if you look at your Initialize context you will see a label in the top right corner that says 'Local'; if you click on it, it will change to 'World' and everything in the Initialize context will use world space. Alternatively, if you just want your position to use world space, you will see a little 'L' symbol on the block; clicking on it will change it to a 'W' and that block will use world space.

    Hope that helped!
     
    Last edited: Jun 29, 2019
  33. DuvE

    DuvE

    Joined:
    May 22, 2016
    Posts:
    109
    Thx, but the second question was not quite about that. For example, if the whole VFX is simulated in world space (just as you answered, I've changed it from Local to World), how can I get the pivot point of the whole VFX? I tried to just convert the vector (0,0,0) from world to local, but the "World To Local" nodes just don't work. They have 3 weird outputs; are these matrices, or what?

    http://prntscr.com/o89r59
     
  34. protoben

    protoben

    Joined:
    Nov 11, 2013
    Posts:
    34
    Further to the last bug / feature request I posted about needing to see the current values of floats: this is what I see as a readout in the VFX Graph, showing the problem of the floats not being updated:

    upload_2019-6-30_13-2-47.png

    The LeapXMin and LeapXMax values, for example, are actually being passed to the VFX Graph. I've tested this by feeding them straight into the spawn rate and can see they aren't really still zero. The VFX Graph's output looks right, but all readouts attached to that graph are based on the blackboard's original values for those exposed floats.

    upload_2019-6-30_13-8-29.png

    In this case, if the min and max are zero, it shows NaN. In the end I can get the graph to produce the visual output I need, but this is currently a very difficult system to troubleshoot, so I'm posting this to bump the importance of this feature request.
     

    Attached Files:

  35. sergiobd

    sergiobd

    Joined:
    May 14, 2014
    Posts:
    18
    I'm using textures as data containers for particle positions and colors. I've used this successfully for generating particles from 3D models, static images, and point cloud clips. Basically, my strategy is to use a compute shader and render to a RenderTexture (following some of Keijiro's work).
    However, as RenderTextures are not persistent, I have to generate the data in the Start function every time I run my app. This slows down the start-up of my scenes a lot.
    Any ideas on how to make this data persistent? (Note that I cannot save the RenderTexture using the ReadPixels strategy, as ReadPixels does not support ARGBFloat data as far as I know.)
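    For what it's worth, `ReadPixels` into an RGBAFloat `Texture2D` does work on some platforms (float readback support varies by graphics API, so treat this as a sketch to test rather than a guaranteed solution; `RenderTextureSaver` is a made-up name):

```csharp
using UnityEngine;
#if UNITY_EDITOR
using UnityEditor;
#endif

// Hedged sketch: persist a float RenderTexture by reading it back into an
// RGBAFloat Texture2D and saving that as an asset. Test on your target
// hardware, since float readback is not supported everywhere.
public static class RenderTextureSaver
{
    public static Texture2D ToTexture2D(RenderTexture rt)
    {
        var prev = RenderTexture.active;
        RenderTexture.active = rt;

        // ReadPixels copies from the currently active render target.
        var tex = new Texture2D(rt.width, rt.height, TextureFormat.RGBAFloat, false);
        tex.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
        tex.Apply();

        RenderTexture.active = prev;
        return tex;
    }

#if UNITY_EDITOR
    // Editor-only: save the readback as a persistent asset.
    public static void SaveAsAsset(RenderTexture rt, string path)
    {
        AssetDatabase.CreateAsset(ToTexture2D(rt), path);
    }
#endif
}
```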
     
  36. JackDDeane

    JackDDeane

    Joined:
    Aug 6, 2018
    Posts:
    19
    I'd be interested to know how you generate your texture data from 3D models, and how you pass it through the compute shader.
     
  37. MrSmokes

    MrSmokes

    Joined:
    Jun 13, 2017
    Posts:
    8
    Is it possible to generate particles at specific positions fed from a CSV file, or scripted in via an array/list of transform positions?

    If not, that would be a great addition for scientific visualisation purposes.


    I'm currently trying to visualise 12.5 million objects as part of a VR project. VFX Graph may be the key to my success, but I'm no VFX artist and very new to this stuff.
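    A sketch of the CSV route, building on the texture-packing approach discussed elsewhere in this thread (the file format, names, and path are assumptions, and error handling is omitted):

```csharp
using System.Globalization;
using System.IO;
using UnityEngine;

// Sketch: read "x,y,z" lines from a CSV and bake them into an RGBAFloat
// texture that a VFX Graph can sample for particle positions. Assumes one
// point per line.
public class CsvPointLoader : MonoBehaviour
{
    public string csvPath;           // hypothetical, e.g. under StreamingAssets
    public Texture2D bakedPositions; // bind to an exposed Texture2D parameter

    void Start()
    {
        string[] lines = File.ReadAllLines(csvPath);
        int size = Mathf.CeilToInt(Mathf.Sqrt(lines.Length));
        bakedPositions = new Texture2D(size, size, TextureFormat.RGBAFloat, false);

        var pixels = new Color[size * size];
        for (int i = 0; i < lines.Length; i++)
        {
            string[] p = lines[i].Split(',');
            pixels[i] = new Color(
                float.Parse(p[0], CultureInfo.InvariantCulture),
                float.Parse(p[1], CultureInfo.InvariantCulture),
                float.Parse(p[2], CultureInfo.InvariantCulture), 1f);
        }
        bakedPositions.SetPixels(pixels);
        bakedPositions.Apply();
    }
}
```

    For 12.5 million points, a single RGBAFloat texture would need to be roughly 3536x3536 (about 200 MB), so a compute-shader bake and/or splitting the data across several textures and systems is probably necessary.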
     
  38. sevelee

    sevelee

    Joined:
    Apr 5, 2017
    Posts:
    18
    I am trying to use VEG to make a smoke effect, which basically uses a smoke flipbook texture.
    I am following this tutorial:

    I am using Lit Quad Output in my VEG.
    But it looks green at some angles with light.
    Is that a bug with Lit Quad Output?
    Using Unity 2019.1, HDRP 5.16.1, Visual Effect Graph 5.16.1
     

    Attached Files:

  39. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    161
    Not your fault - but because we've ended up with a crazy long forum thread rather than separate posts, similar questions have been answered a few times on earlier pages.

    Search this thread for "keijiro", as most answers have referenced his work on this topic (for example https://github.com/keijiro/Rsvfx ).
     
  40. sergiobd

    sergiobd

    Joined:
    May 14, 2014
    Posts:
    18
    Sure. Basically, I'm passing the array of vertices to a compute shader as a ComputeBuffer and rendering it to a texture. It turned out to be less scary than I thought. Note that a point is generated at each vertex, which means that if you have a low-poly model, for example, you would need some strategy to interpolate points inside a face if you need more point density.
    For the moment, I haven't done anything interesting with this. This is a quick script; use it at your own risk.


    Code (CSharp):

    using System.Collections.Generic;
    using UnityEngine;

    #if UNITY_EDITOR
    using UnityEditor;
    #endif

    public class ModelBaker : MonoBehaviour
    {
        ComputeBuffer vertexBuffer;
        ComputeBuffer colorBuffer;

        // Scratch texture the compute shaders write into.
        RenderTexture inputPositionTexture;

        public RenderTexture VFXPositionMap;
        public RenderTexture VFXColormap;

        int texSize;

        // Compute shaders
        public ComputeShader vertexBaker;
        public ComputeShader colorBaker;

        public string renderTexturePath;

        // Mesh data
        int numPoints;
        Mesh mesh;

        // Careful here: the render texture has to be created at runtime,
        // and CreateAsset is Editor-only.
        public void BakeVertices()
        {
            // VFX texture
            if (VFXPositionMap == null)
            {
    #if UNITY_EDITOR
                Debug.Log("Creating texture");
                VFXPositionMap = new RenderTexture(texSize, texSize, 0, RenderTextureFormat.ARGBFloat); // square texture
                AssetDatabase.CreateAsset(VFXPositionMap, renderTexturePath + "/RT_Vert_" + gameObject.name + ".asset");
    #else
                Debug.LogError("VFX Render Texture should exist");
    #endif
            }
            else if (VFXPositionMap.width != texSize || VFXPositionMap.height != texSize)
            {
                // Create again
            }

            // Compute
            vertexBuffer = new ComputeBuffer(numPoints, 3 * sizeof(float));
            vertexBuffer.SetData(mesh.vertices);

            vertexBaker.SetInt("dim", texSize);
            vertexBaker.SetTexture(0, "PositionTexture", inputPositionTexture);
            vertexBaker.SetBuffer(0, "PositionBuffer", vertexBuffer);
            vertexBaker.Dispatch(0, (texSize / 8) + 1, (texSize / 8) + 1, 1);

            Graphics.CopyTexture(inputPositionTexture, VFXPositionMap);

            vertexBuffer.Dispose();
        }

        public void BakeColors()
        {
            // Color texture
            if (VFXColormap == null)
            {
    #if UNITY_EDITOR
                Debug.Log("Creating texture");
                VFXColormap = new RenderTexture(texSize, texSize, 0, RenderTextureFormat.ARGBFloat); // square texture
                AssetDatabase.CreateAsset(VFXColormap, renderTexturePath + "/RT_Col_" + gameObject.name + ".asset");
    #else
                Debug.LogError("VFX Render Texture should exist");
    #endif
            }
            else if (VFXColormap.width != texSize || VFXColormap.height != texSize)
            {
                // Create again
            }

            colorBuffer = new ComputeBuffer(numPoints, 4 * sizeof(float));
            colorBuffer.SetData(mesh.colors);

            colorBaker.SetInt("dim", texSize);
            colorBaker.SetTexture(0, "ColorTexture", inputPositionTexture);
            colorBaker.SetBuffer(0, "ColorBuffer", colorBuffer);
            colorBaker.Dispatch(0, (texSize / 8) + 1, (texSize / 8) + 1, 1);

            Graphics.CopyTexture(inputPositionTexture, VFXColormap);

            colorBuffer.Dispose();
        }

        public void Bake()
        {
            Initialize();
            BakeVertices();
            BakeColors();
        }

        public void Initialize()
        {
            mesh = GetComponent<MeshFilter>().mesh;
            if (mesh == null)
            {
                Debug.LogError("No mesh or mesh filter present");
                return;
            }

            numPoints = mesh.vertexCount;
            texSize = Mathf.CeilToInt(Mathf.Sqrt(numPoints));
            Debug.Log("num points " + numPoints + " texsize " + texSize);

            // Input position texture the compute shaders write into:
            inputPositionTexture = new RenderTexture(texSize, texSize, 0, RenderTextureFormat.ARGBFloat);
            inputPositionTexture.enableRandomWrite = true;
            inputPositionTexture.Create();
        }
    }
    Code (HLSL):

    // Each #kernel tells which function to compile; you can have many kernels
    #pragma kernel CSMain

    // Create a RenderTexture with the enableRandomWrite flag and set it
    // with cs.SetTexture
    RWTexture2D<float4> PositionTexture;

    uint dim;

    Buffer<float3> PositionBuffer;

    [numthreads(8,8,1)]
    void CSMain (uint3 id : SV_DispatchThreadID)
    {
        int index = id.y * dim + id.x;

        float lastIndex = (dim - 1) * (dim - 1);

        // Trick for generating a pseudo-random number.
        // Inspired by a similar trick in Keijiro's PCX repo (BakedPointCloud.cs).
        // The points that are in excess because of the square texture point randomly to a point in the texture.
        // e.g. if (index > lastIndex) index = 0 generates excess particles in the first position, resulting in a visible artifact.

        //if (index > lastIndex) index = (index * 132049U) % lastIndex; // Ended up not using this.

        float3 pos;

        if (index > lastIndex)
        {
            pos = 0;
        }
        else
        {
            pos = PositionBuffer[index];
        }

        PositionTexture[id.xy] = float4(pos.x, pos.y, pos.z, 1);
    }
    This is the compute shader for the vertices (the one for the colors is analogous).
     
  41. sergiobd

    sergiobd

    Joined:
    May 14, 2014
    Posts:
    18
    It should be possible. In general, you'd have to encode the data in a texture. To do that, you would pass your data list to a compute shader and render it out, one data point per pixel. The texture is going to be a bit big, though... and I don't know if the VFX Graph has a limit on the size of its input textures...
     
  42. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,298
    Maybe we can help each other with this.

    I have a compute shader based fluid simulation that stores all its data in a whole bunch of 3D render textures. I normally use it live in realtime at runtime, but months ago I decided to add a feature where I can press a key while running the sim in editor play mode, and it will bake the velocity render texture from that moment in time into a 3D texture asset. I got something that worked, but I have forgotten all the research I did, what I learnt, and what code examples I took as starting points. I only got as far as something that produced a result that looked somewhat correct; I did not properly test that the baked data was perfect, and some things could be completely backwards for all I know. And my main thought after getting it working was 'oh my word, these texture asset files have really, really huge file sizes'. I haven't been back to this side of my system since, so my knowledge ends at that moment.

    The format of the velocity render texture in question is ARGBFloat, which is mostly why I'm thinking my approach might help you. The code below won't quite work as-is, because it refers to some variables from elsewhere in my sim (the velocityTextures[READ] render texture, BakeName, m_width, m_height, m_depth), but I think it should still be enough to understand the method. It basically uses asynchronous GPU readback requests to get buffers of data from a render texture. I create a new Texture3D and an array of Colors from/for it, then loop through the buffer and stick data from it into the Color array, before using SetPixels to write those colors back to the texture and then save the texture as an asset.

    Who knows how many mistakes I made in the code! If you spot any, please do let me know; and if you do the bit that I didn't get to (researching methods to make the asset file sizes much more reasonable), I would be glad to hear about it, thanks :)

    Code (csharp):

    Queue<AsyncGPUReadbackRequest> _requests = new Queue<AsyncGPUReadbackRequest>();

    // And then I had the following code in my update loop:

    while (_requests.Count > 0)
    {
        var req = _requests.Peek();

        if (req.hasError)
        {
            Debug.Log("GPU readback error detected.");
            _requests.Dequeue();
        }
        else if (req.done)
        {
            Texture3D _texture = new Texture3D(m_width, m_height, m_depth, TextureFormat.RGBAFloat, true);
            Color[] theColourData = _texture.GetPixels();
            //Debug.Log("Colour array length:" + theColourData.Length);
            for (int k = 0; k < m_depth; k++)
            {
                var buffer = req.GetData<Color>(k);
                //Debug.Log("Buffer Length: " + buffer.Length);
                Color[] layerPixels = buffer.ToArray();
                for (int i = 0; i < m_height; i++)
                    for (int j = 0; j < m_width; j++)
                    {
                        theColourData[j + i * m_width + k * m_height * m_width] = layerPixels[j + i * m_width];
                    }
            }
            Debug.Log("GPU readback complete - saving texture3D now.");

            _texture.name = "capturetest";
            _texture.filterMode = FilterMode.Bilinear;
            _texture.wrapMode = TextureWrapMode.Clamp;
            _texture.SetPixels(theColourData);
            _texture.Apply();
            string dateTimeStamp = System.DateTime.Now.ToString("dd-MM-yy-HH-mm-ss");
            string path = "Assets/VectorFields/" + BakeName + "-" + dateTimeStamp + ".asset";
    #if UNITY_EDITOR
            AssetDatabase.CreateAsset(_texture, path);
    #endif

            _requests.Dequeue();
        }
        else
        {
            break;
        }
    }

    if (Input.GetKeyDown(KeyCode.C))
    {
        if (_requests.Count < 8)
            _requests.Enqueue(AsyncGPUReadback.Request(velocityTextures[READ]));
        else
            Debug.Log("Too many GPU readback requests.");
    }
     
  43. DuvE

    DuvE

    Joined:
    May 22, 2016
    Posts:
    109
    Here are my questions:

    1.

    I still don't quite understand how to change spaces inside VEG. I simply want to get a world-space (0,0,0) in a local-space VEG. This node setup works, but it behaves strangely: sometimes, just saving my scene displaces the position.



    2.

    I also have questions about "Trigger Event Always" block



    Is there a trick to make it trigger an exact number of times per second rather than every frame? Also, does the whole Update block behave like void Update or like void FixedUpdate?

    3.

    And one more question: are there only two blocks for killing particles? I found only kill sphere and kill box, but what if I want to kill particles based on some noise mask, e.g. if the noise value is greater than 1, the particle is killed?

    4.

    "Trigger Event On Die" for some reason keeps triggering from particles with infinite lifetime (I just didn't set their lifetime, so I assume it's infinite), but if I set the lifetime to 9999, for example, the trigger works properly.
     
    Last edited: Jul 2, 2019
  44. sergiobd

    sergiobd

    Joined:
    May 14, 2014
    Posts:
    18

    Whoa! Thank you elbows! Will definitely give it a try!
     
  45. VladVNeykov

    VladVNeykov

    Unity Technologies

    Joined:
    Sep 16, 2016
    Posts:
    39
    Hey @andybak , unless I am drawing a complete blank, you might have met one of my colleagues? :)
    Edit: @andybak I was drawing a blank, the event was way back in April! Yes, loved your VR project! :D Yes, it was totally me, good putting a face to the forum name! :)

    To your question, it depends on when you Set Alive to false. In Initialize, you kill off the particles right away, while in Output you still pay the simulation cost and are just disabling the renderer (the particles are still "alive").

    Try assigning your render texture to an exposed Texture2D parameter, not to a camera parameter?
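    To expand on binding a render texture from script: a minimal sketch of driving an exposed texture parameter from C#. The parameter name "MainTexture", the field names, and where the RenderTexture comes from are all placeholders; the exposed parameter in the graph's Blackboard must use the same name.

    ```csharp
    using UnityEngine;
    using UnityEngine.VFX;

    public class BindRenderTexture : MonoBehaviour
    {
        public VisualEffect vfx;            // the VisualEffect component using the graph
        public RenderTexture renderTexture; // e.g. the target of a camera or compute pass

        void Start()
        {
            // "MainTexture" must match the exposed Texture2D parameter name in the Blackboard.
            if (vfx.HasTexture("MainTexture"))
                vfx.SetTexture("MainTexture", renderTexture);
        }
    }
    ```

    Because a RenderTexture is updated on the GPU, binding it once is enough; the graph samples the current contents each frame.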

    Set the initial shape of your particles in Initialize, then either use compare or remap to get the timing right and change the relevant values in your conform to SDF block:

    The result:
     

    Attached Files:

    Last edited: Jul 4, 2019
  46. VladVNeykov

    VladVNeykov

    Unity Technologies

    Joined:
    Sep 16, 2016
    Posts:
    39
    The exposure weight slider has already been implemented for the VFX Graph and will be coming in the next package versions (I believe 6.8.0 and later, with some of the fixes backported to 5.x). You can also get the latest files directly from github.
     
  47. DuvE

    DuvE

    Joined:
    May 22, 2016
    Posts:
    109
    In addition to my previous questions:

    5.

    Any plans to add some kind of variables, to store, for example, the result of a noise node, just to keep everything looking clear without this spider web of lines and connections?

    6.

    In the VEG Samples one trick is actively used: storing data in attributes you don't need for the current VFX. For example, the initial position of particles (where they were born) can be stored in TargetPosition and then used in some calculation. I've run into an issue where I've already used up all these spare attributes, so my question is: will you add some custom ones, like Custom1, Custom2, meant only for storing data? Also, in the VEG Samples' HoloTable I found a custom attribute "Base Position", but I can't create it even in that particular VEG, with Experimental Operators turned on. Here is a screenshot: https://prnt.sc/oamb1k

    7.

    Is it possible to connect two separate GPU Events into one and the same Initialize block? When I try to do this, Unity just crashes. Also, if I connect a GPU Event to a Spawn block instead of an Initialize block, it simply doesn't work; is that expected?
     
  48. VladVNeykov

    VladVNeykov

    Unity Technologies

    Joined:
    Sep 16, 2016
    Posts:
    39
    1. Try using the Change Space (Position) node

    2. We have added a Trigger Event Rate which can trigger based on frame-independent time or distance, should be coming out in the next package version (and is already available on the public github). And to the second point, you can select the VFX asset in the project folder and then in the inspector change the Update mode:

    (The fixed time step and max delta time settings can be adjusted in Edit - Project Settings - VFX)

    3. You can always add a Set Alive block and set the particles you want to kill off to False. In your example, something like this should work:

    (if you use Set Alive in Output, you can also toggle the particles on/off if you change the alive conditions)

    4. If I remember correctly, if the lifetime is not set it defaults to 0, so the on die will always trigger. It's a philosophical question as to what ought to happen upon the death of an immortal particle.

    5. We do already support custom attributes if you'd like to store some custom per-particle value. As for the spider net of lines and connections, we recently also added subgraphs so that should help make graphs neater.

    6. Same as 5, we do already have custom attributes. Here's an example of how to use them:


    7.
    You shouldn't be able to connect more than one GPU event to the same Initialize (just tried it, and it automatically disconnects the other GPU event). Maybe it's some old bug? You can have multiple GPU events be triggered by the same trigger. And correct, connecting Spawners to GPU events is not supported.
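    Related, for anyone arriving from the earlier questions about scripted control: regular spawn systems (as opposed to GPU events) can be driven from C# by sending named events, optionally with per-event attribute values. A hedged sketch, assuming the graph has an Event context named "OnBurst" (the event name and key binding are placeholders):

    ```csharp
    using UnityEngine;
    using UnityEngine.VFX;

    public class TriggerVFXEvent : MonoBehaviour
    {
        public VisualEffect vfx;
        VFXEventAttribute eventAttribute;

        void Start()
        {
            // Reusable payload object for per-event attribute values.
            eventAttribute = vfx.CreateVFXEventAttribute();
        }

        void Update()
        {
            if (Input.GetKeyDown(KeyCode.Space))
            {
                // "position" is a built-in event attribute; "OnBurst" must match
                // the name of an Event context in the graph.
                eventAttribute.SetVector3("position", transform.position);
                vfx.SendEvent("OnBurst", eventAttribute);
            }
        }
    }
    ```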
     

    Attached Files:

    DuvE likes this.
  49. coldpizzapunk

    coldpizzapunk

    Joined:
    Aug 20, 2014
    Posts:
    22
    When I was trying out the custom attributes they worked great. One thing I was hoping to do with them is send their stored information along to a triggered event particle. I noticed this works with the non-custom attributes, but it didn't seem to work with any of my custom attributes. Is this possible?
     
  50. garryjnewman

    garryjnewman

    Joined:
    Sep 11, 2015
    Posts:
    25
    Is there a way to make particles add/write distortion/refraction in HDRP?