Feedback Wanted: Visual Effect Graph

Discussion in 'Graphics Experimental Previews' started by ThomasVFX, Oct 21, 2018.

  1. Voronoi

    Voronoi

    Joined:
    Jul 2, 2012
    Posts:
    283
    I have been playing around with this and the RealSense D415 camera and have found it very cool! The RealSense seems more stable in 2018.3 than it was in 2018.2, which is nice.

    So far, I've got a simple version of what I want to happen working. I want particles to be emitted in the shape of the user that is X distance from the camera. My problem, and I'm wondering if anyone else has solved this, is how to generate a full greyscale image from the depth stream of the camera.

    @JulienF_Unity screen grab was very helpful, and it sounds like @racarate and @jashan are both doing similar things.

    Here's what I have so far:
    Screen Shot 2018-12-06 at 10.07.48 AM.png

    The particles emit based on the raw depth stream. The problem is, the image is not a full depth image, just black and dark red. I can see that the librealsense examples use shaders to parse out the depth into color maps, but I can't figure out how to get those color maps as an input to the particle system. Works fine with the raw image and this function:

    Code (CSharp):
    public void SendToFXThing(Texture tex)
    {
        thisVFX.SetTexture("CamTexture", (Texture2D)tex);
    }
    I simply add a new binding in the depth sample. I do not know OpenGL well enough to figure out how to parse the data myself, and since I'm on a Mac I would rather be using Metal with this new VFX system anyway.

    Anyone here have some experience converting the Z16 depth stream to a greyscale depth image?
     
    Franckitou likes this.
  2. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,105
    Just came here to say that VFX Graph now finally works in 2019.1a11. That is very cool!

    I got a little stuck with things not related to VFX Graph (my main project is still on 2017.4, and now I need to make these two things work together, plus upgrade to the new SteamVR Input system which apparently is even more of a nightmare than upgrading Unity ;-) ).

    Most likely I'll have to use VFX Graph with "vanilla 3D" (aka the legacy render pipeline) ... aside from upgrading a legacy project to HDRP, the other issue is that HDRP gives me around 2.5 ms per frame of overhead that I'd rather spend on more particles.
     
    vjrybyk likes this.
  3. GameDevCouple_I

    GameDevCouple_I

    Joined:
    Oct 5, 2013
    Posts:
    2,202
    VFX Graph won't work with the legacy pipeline.
     
  4. ShakingEarthDigital

    ShakingEarthDigital

    Joined:
    Mar 20, 2014
    Posts:
    3
    I'm not able to get VFX Graph working in 2019.1.0a11.
    This is the only error I see:

    Code (CSharp):
    Shader error in 'Hidden/VFX/System 1/Quad Output': 'EvaluateAtmosphericScattering': no matching 2 parameter function at /git/jens/vfxgraphtest/vfxTest2019.1.0a11/Library/PackageCache/com.unity.visualeffectgraph@5.2.1-preview/Shaders/RenderPipeline/HDRP/VFXCommon.cginc(80) (on d3d11)
    My setup is as simple as I could make it: a new project with HDRP, then add VFX Graph from the Package Manager (5.2.1).
    Switch Scripting Runtime Version to .NET 4.x Equivalent in Player settings to stop most of the console errors.

    Create a new vfx graph object. The error shows in the console and on the shader object when you expand the vfx object.

    Help?
     
  5. Dark-Table

    Dark-Table

    Joined:
    Nov 25, 2008
    Posts:
    262
    I've been experimenting with something similar. I've been using OpenCV (the Enox version from the Asset Store) to do threshold operations on the depth texture. Unfortunately it's all happening on the CPU so it's kinda slow, but there are tons of useful tools to fix up the depth texture. I've been using Photo.inpaint to fill holes in the depth texture, then Imgproc.threshold to convert it to black and white.

    I've been looking at faster ways to do this and I just found Graphics.Blit. It lets you supply a texture and a material and render the result into a destination RenderTexture. I haven't had a chance to try it, but it could be a fast way to do some of these operations.
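    To illustrate, a minimal sketch of what I have in mind (the field names and the threshold material are my own assumptions, not code from the realsense samples):

```csharp
using UnityEngine;

public class DepthBlit : MonoBehaviour
{
    public Texture rawDepth;        // raw depth texture from the camera
    public Material thresholdMat;   // hypothetical material using a threshold shader
    public RenderTexture result;    // destination RenderTexture

    void Update()
    {
        // Renders rawDepth through thresholdMat's shader into result, on the GPU
        Graphics.Blit(rawDepth, result, thresholdMat);
    }
}
```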

    Another thing to be aware of is that the Z16 depth texture you get from the camera contains integers that represent how many millimeters each pixel is from the camera. The range is technically 0-65535, but realistically 200-10000. Unity expects depth textures to contain floating point values between 0 and 1. I haven't tried it yet, but I think that conversion is going to need to be part of the pipeline if you want to use the depth texture block that's available in VFX Graph.
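    For what it's worth, here's a CPU-side sketch of that millimeters-to-0..1 conversion, just to show the math (the 200-10000 range is the realistic range mentioned above; a real implementation would do this in a shader):

```csharp
using UnityEngine;

public static class DepthConvert
{
    // Convert a raw Z16 value (millimeters) to a normalized 0-1 depth value.
    public static float NormalizeDepth(ushort mm, float minMm = 200f, float maxMm = 10000f)
    {
        float clamped = Mathf.Clamp(mm, minMm, maxMm);
        return (clamped - minMm) / (maxMm - minMm); // 0 = nearest, 1 = farthest
    }
}
```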

    p.s. For stability, I've been setting the RsDevice object in the scene to use "Unity Thread" while in the editor. Multithread should work fine for standalone builds though.
     
  6. Voronoi

    Voronoi

    Joined:
    Jul 2, 2012
    Posts:
    283
    Thanks! I think Graphics.Blit might be what I can use. I actually did this in another project using an extra camera, but it seemed so hacky I'd like to find a simpler way. The performance was actually pretty good.

    The technique I used is in my project on GitHub. The webcam texture is shown on a polygon in the scene, and a camera is set to look at the polygon and render to a RenderTexture. The RenderTexture is set to read/write and then fed into another script to convert it to a texture. The nice thing about this particular technique is that downsampling is 'free' by setting the resolution of the RenderTexture. For blob tracking I typically don't need all those pixels, and for that project it worked well. So, the camera looking at the webcam feed can render at 90x40 pixels and I don't need to care what resolution the webcam's native feed is.
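    In case it helps anyone, the setup boils down to something like this (names are hypothetical; the RenderTexture asset sets the output resolution):

```csharp
using UnityEngine;

public class FeedDownsampler : MonoBehaviour
{
    public Camera captureCam;     // orthographic camera looking at the webcam quad
    public RenderTexture lowRes;  // e.g. 90x40, with Read/Write enabled

    void Start()
    {
        // Whatever the webcam's native resolution, the capture camera renders
        // the quad into the low-res target, so downsampling comes for free.
        captureCam.targetTexture = lowRes;
    }
}
```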

    I'm going to try to use Graphics.Blit directly with the materials in the realsense project. They do thresholding and color conversion in a shader, and have feathering and other things exposed as parameters that affect the shader. I'm just not sure if I can grab the texture after the shader does its work using Graphics.Blit. Hence, the hacky solution with a camera might be how I do this again...
     
  7. Dark-Table

    Dark-Table

    Joined:
    Nov 25, 2008
    Posts:
    262
    I made a first attempt using Graphics.Blit yesterday and gave up and used the camera pointing at a quad technique instead. *shrug* Good luck!

    You can use a RenderTexture directly as the Texture in the VFXGraph's Sample Texture 2D node, BTW.
     
  8. Voronoi

    Voronoi

    Joined:
    Jul 2, 2012
    Posts:
    283
    Thanks! Did not know that.

    It took me a while to figure this out, but if anyone else is interested, here's a screen grab of things working as I wanted. I made a second orthographic camera, changed the UI to World Space, and positioned the camera to look at the UI texture. It does render the texture after shading, so I can now filter particles based on color or depth using the shaders provided in the realsense Unity project. Turns out I didn't need to write any code, just drag the Render Texture into the VFX node.

    Screen Shot 2018-12-08 at 11.03.34 AM.png
     
  9. Voronoi

    Voronoi

    Joined:
    Jul 2, 2012
    Posts:
    283
    Just found an even nicer way to use the depth stream. I'm not even sure how this works exactly, but the camera that is rendering to a RenderTexture is pointing at the RawImage of a UI object connected to the rsDevice Depth stream. As you can see, the UI is a flat image in the editor.

    But...once wired up to the Project On Depth block, the depth values are coming through to generate particles in 3d space! It's pretty nice! You can tweak the Project On Depth to get exactly the foreground object you want. It's a little hard to see in the screenshot, but my hand is closer to the camera than my body.

    Screen Shot 2018-12-08 at 12.54.18 PM.png

    This is exactly what I was hoping to do! By tweaking the Depth X,Y on the block, I can isolate only my hand as an emitter of particles.
     
    Last edited: Dec 8, 2018
    zhaoyue_liu and PixelPup like this.
  10. Voronoi

    Voronoi

    Joined:
    Jul 2, 2012
    Posts:
    283
    How would I go about saving my custom settings in a Visual Effect asset? I've exposed a number of inputs and when I change them during Play they are not saved. I was going to make a script that points to the Visual Effect Asset but I can't see how to access it.

    Also, since the asset lives in the Project area, I would think it would behave like a Post-processing effect and save its settings during play mode.
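    What I had in mind was something like this (a sketch only; "Intensity" is a hypothetical exposed parameter, and in these preview builds the namespace is still UnityEngine.Experimental.VFX), logging values during Play so I can copy them back by hand:

```csharp
using UnityEngine;
using UnityEngine.Experimental.VFX; // preview-era namespace for VisualEffect

public class VFXSettingsLogger : MonoBehaviour
{
    public VisualEffect vfx;

    [ContextMenu("Log Exposed Values")]
    void LogValues()
    {
        // "Intensity" stands in for whichever float parameters you exposed
        Debug.Log("Intensity = " + vfx.GetFloat("Intensity"));
    }
}
```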
     
    PixelPup likes this.
  11. ShakingEarthDigital

    ShakingEarthDigital

    Joined:
    Mar 20, 2014
    Posts:
    3
    I was able to get VFX Graph working in 2019.1.0a11 by updating HDRP to version 5.2.2!
     
  12. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,105
    Cool - I found this (after searching for a little while). So that's Project Settings / VFX / Render Pipe Settings Path.

    Unfortunately, both in Unity 2018.3.0b12 and Unity 2019.1.0a11, I get:

    Code (CSharp):
    Shader error in 'Hidden/VFX/System 1/Quad Output': invalid subscript 'xd' at /GameDev/Experiments/LegacyVFX/Library/PackageCache/com.unity.visualeffectgraph@4.6.0-preview/Shaders/RenderPipeline/Legacy/VFXCommon.cginc(49) (on d3d11)

    Compiling Vertex program with UNITY_PASS_FORWARDBASE
    Platform defines: UNITY_ENABLE_REFLECTION_BUFFERS UNITY_USE_DITHER_MASK_FOR_ALPHABLENDED_SHADOWS UNITY_PBS_USE_BRDF1 UNITY_SPECCUBE_BOX_PROJECTION UNITY_SPECCUBE_BLENDING UNITY_ENABLE_DETAIL_NORMALMAP SHADER_API_DESKTOP UNITY_COLORSPACE_GAMMA UNITY_LIGHT_PROBE_PROXY_VOLUME UNITY_LIGHTMAP_FULL_HDR
    Is there a way to fix this? Turns out that for the time being, we can't move our project to HDRP, so Legacy is what we need to have working.

    Full code that's causing the shader compilation error:

    Code (CSharp):
    float VFXSampleDepth(float4 posSS)
    {
        return _CameraDepthTexture.Load(int3(posSS.xd, 0)).r;
    }
     
    Last edited: Dec 12, 2018
  13. Dark-Table

    Dark-Table

    Joined:
    Nov 25, 2008
    Posts:
    262
    I tried using Graphics.Blit again and it seems my problem might have been that the material supplied as the third parameter can't use a ShaderGraph shader (HDRP?). My shader was simple, so I rewrote it as an unlit shader and the blit works. This seems like a bug; if there are Unity people checking this thread, I'm on 2018.3f1.
     
  14. z_space

    z_space

    Joined:
    Jul 17, 2016
    Posts:
    10
    I'm looking for how to get the direction (like a forward vector) of a particle. Anyone know how? I've tried using the "direction" node and the "axis" node, but neither seem to work. Any help would be appreciated.
     
  15. AnException

    AnException

    Joined:
    Apr 3, 2016
    Posts:
    3
    Was anyone able to get it working with DirectX 12?

    Thanks!
     
  16. RSSGuy

    RSSGuy

    Joined:
    Dec 6, 2018
    Posts:
    1
    How do we "lerp" between point caches?
    Like what this gentleman did.


     
  17. ErbGameArt

    ErbGameArt

    Joined:
    Apr 14, 2017
    Posts:
    4
    I made it 2 weeks ago. The video is in Russian, but there are subtitles.
     
  18. ArtR

    ArtR

    Joined:
    Sep 27, 2011
    Posts:
    47
  19. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,105
    ... so, while the Legacy render pipeline template is there, it's currently broken. I have just filed this to FogBugz:

    (Case 1110700) Visual Effects Graph 5.2.3 has Legacy RenderPipeline Setting - but creates Shaders with Compilation Errors

    Is there maybe a workaround so we can use Visual Effects Graph with the Legacy render pipeline?
     
    Gruguir and vjrybyk like this.
  20. Private

    Private

    Joined:
    Jul 30, 2014
    Posts:
    6
    Is there any way to rotate the spawn position (for box, cone, sphere (a chunk of it), etc.) inside the graph?
     
  21. ThomasVFX

    ThomasVFX

    Unity Technologies

    Joined:
    Jan 14, 2016
    Posts:
    26
    Right now, we don't apply any transform inside the shape blocks, so a solution is to use a Set Position block (after the position shape block) and a Transform Position operator taking a Get Position attribute operator as input.
     
  22. pointcache

    pointcache

    Joined:
    Sep 22, 2012
    Posts:
    531
    How do you guys make anything in VFX Graph when it bugs out all the time?
    2018-12-20_21-17-53.gif
    I've had sliders disappearing, blocks getting stacked on top of one another, and the UI partially or fully corrupting.
    Mind you, I just started using it 5 minutes ago and have already encountered 10 bugs.
    upload_2018-12-20_21-44-47.png
     
    Last edited: Dec 20, 2018
  23. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,105
    Those bugs might be new - last time I checked, it was fairly stable (it did sometimes stop showing the connections, which was easy enough to fix by restarting Unity). I did notice some Editor UI issues in the Package Manager in Unity 2019.1.0a12 (the most recent alpha). If what you see is related to those, resizing the panel is a quick workaround (in the Package Manager, it resolved those issues reliably).
     
  24. Reticulatas

    Reticulatas

    Joined:
    Jul 31, 2012
    Posts:
    11
    Ditto, I have the same problems @pointcache does.

    Along with this one:
    upload_2018-12-20_20-55-14.png

    Almost looks like that artifacting you get when your GPU is about to go.

    EDIT: This only happens if a point cache is connected to the input of the "From Map" node. Otherwise it's fine.

     
    Last edited: Dec 21, 2018
  25. BaddestGameMaker

    BaddestGameMaker

    Joined:
    May 5, 2018
    Posts:
    1
    Any chance we might see translucent particle shadows in the next update? I love the fact that particles can be lit with light sources and shadows, but I have been trying to get soft shadows for smoke particles for a very long time - something like what's present in Unreal.
    upload_2018-12-21_15-15-42.png
     
  26. steego

    steego

    Joined:
    Jul 15, 2010
    Posts:
    911
    Is there a way to control the seed for the random node? I'm using it for random particle positions, and I would like them to have the same position every time the effect is played. I think being able to set the master seed per graph would solve my problem.
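    For reference, the component-level API I was hoping for would look something like this (a sketch; I believe VisualEffect exposes startSeed and resetSeedOnPlay in these preview builds, under the Experimental namespace, but I may be wrong):

```csharp
using UnityEngine;
using UnityEngine.Experimental.VFX; // preview-era namespace for VisualEffect

public class FixedSeed : MonoBehaviour
{
    public VisualEffect vfx;
    public uint seed = 12345;

    void Start()
    {
        vfx.resetSeedOnPlay = false; // don't reroll the seed on each Play
        vfx.startSeed = seed;        // same seed => same random values per run
        vfx.Play();
    }
}
```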
     
  27. vjrybyk

    vjrybyk

    Joined:
    Nov 19, 2015
    Posts:
    3
    This would be really amazing, even if only for unlit Outputs or points.. VEG is an unbelievable match for VR, and appears robust enough to be considered for production...

    Would it be unthinkable, for example, to expose the graph results as compute buffers via a special output node?
     
    jashan likes this.
  28. PixelPup

    PixelPup

    Joined:
    Mar 6, 2018
    Posts:
    18
    I cannot seem to use set lifetime from curve. Using 2018.3, it throws a shader error. Has anyone else had this happen?
     
  29. PixelPup

    PixelPup

    Joined:
    Mar 6, 2018
    Posts:
    18
    This is somewhat solved by using a random number for Set Lifetime, which I think in some ways makes more sense, as with the curve I'm not sure what the x and y axes would actually represent.
     
  30. Natey

    Natey

    Joined:
    Oct 10, 2014
    Posts:
    14
    Can I get a quick example of how the line output is actually meant to be used? Like for example, in the demos with long sweeping curved lines?
     
  31. Franckitou

    Franckitou

    Joined:
    Jun 3, 2015
    Posts:
    10
    Hi, I downloaded the latest version of VFX Graph and I have a problem: no particles appear in the Unity viewport. Unity 2018.3.0f2.

    upload_2018-12-25_13-57-16.png
     
    Last edited: Dec 25, 2018
  32. Natey

    Natey

    Joined:
    Oct 10, 2014
    Posts:
    14
    Franckitou
    Try creating a new project with the HDRP Template. :)
     
    Franckitou likes this.
  33. keeponshading

    keeponshading

    Joined:
    Sep 6, 2018
    Posts:
    411
  34. Gruguir

    Gruguir

    Joined:
    Nov 30, 2010
    Posts:
    316
    Any news regarding the issue with the legacy/standard pipeline? I hope it will be fixed.
     
    jashan and vjrybyk like this.
  35. thinksquirrel_lily

    thinksquirrel_lily

    Joined:
    Feb 8, 2011
    Posts:
    1,177
    Hi folks,

    Fluvio (fluid simulation) developer here. We're currently working on applying realtime fluid sim to the VFX graph. Is there any way for us to store neighboring particle information in a buffer, then access that data in a later step?

    Use case: We're currently looking at getting particle neighbors. We currently do the following steps in our shader (non-VFX-graph) implementation. Each step happens as sequential parallelized steps, on a per-particle basis.

    1) perform spatial partitioning of particles, stored in an infinite grid (GPU-optimized layout of a flat array, won't go into details here)
    2) perform a neighbor search using the infinite grid, store the X closest neighbors for particle Y in a flat array sized X * Y
    3) perform physical simulation on each particle Y, reading information from each neighbor and writing a force vector for particle Y.
    4) integrate forces into velocities

    So in our case, as a bare minimum we need three buffers:

    1) spatial partitioning grid: arbitrary (user-configurable) size * max neighbor count
    2) neighbor buffer: particle count * max neighbor count (this one is generally pretty large but very doable w/ current GPUs)
    3) force buffer: same size as our particle count for accumulating forces

    Note - we're fine digging into the internals of blocks if this isn't possible with multiple blocks at this time. Right now it's the only blocker we see for adding full fluid sim to the VFX graph. So even if not possible with the current API or blocks, I'm happy to hear of any hacky ways we could experiment with this or explore.

    EDIT: Thought through this and rephrased the question to better fit what we're doing.
     
    Last edited: Dec 29, 2018
  36. thinksquirrel_lily

    thinksquirrel_lily

    Joined:
    Feb 8, 2011
    Posts:
    1,177
    So I found out how to read from other particles. While I think this is undocumented ;), it looks like attributeBuffer, and all of its data, is accessible in a custom block. Unfortunately there seems to be no way (without patching generated shaders) to add additional buffers, but I might be able to work around it with textures instead.

    Some requests, based on what I've found so far:

    1) Are there plans to make custom blocks not internal, so that users can implement more complex behaviors that would take dozens or hundreds of nodes/operators? Right now I'm working around this by patching in a file that adds a friend assembly once the package installs. This is simple enough, and I know everything's early, but I'm hoping exposing this API is on the radar; one of the things we really struggled with in the old particle system was the fully sealed-off UI.

    2) Unless I missed this, I'd love to be able to add other custom buffers (not just textures) for use within blocks or as inputs. EDIT: I just realized that blocks support includes, and experimental custom attributes exist. This should work fine!

    3) I would really like the ability to override the default Euler integration. From my scan, it seems this is an invisible block that gets added. We use Euler, Verlet, and some other obscure integration methods for simulation. EDIT: I see this is configurable ("None" can be chosen in the inspector for the visual effect)

    4) I noticed a mass parameter in the generated shader code, always set to 1. Is that editable at all? It seems it's used with velocity field forces. For consistency purposes, I'd love to be able to set that mass value to the same mass we use for fluid simulation.

    Digging through the implementation, this is some great stuff! The way the graph compiles down to clean shader code is top-notch, and it really shows how far Unity's HLSL compute pipeline has come :)
     
    Last edited: Dec 31, 2018
  37. VladVNeykov

    VladVNeykov

    Unity Technologies

    Joined:
    Sep 16, 2016
    Posts:
    40
    Hi Gruguir,
    The VFX Graph will not be supported on Unity's built-in render pipeline. Currently it's targeting specifically HDRP, with lightweight pipeline support for compute-capable devices in the works, and eventually (when CPU-particles are introduced), full lightweight render pipeline support down the road. We'd also aim at supporting custom render pipelines.
     
    racarate likes this.
  38. GameDevCouple_I

    GameDevCouple_I

    Joined:
    Oct 5, 2013
    Posts:
    2,202
    Fixed? It's not broken. It has never worked with built-in, and that has been made clear from the get-go, including in this thread.
     
  39. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,410
    Actually some confusion on this subject was probably caused by a number of posts on the first page of this thread.

    This being the most obvious example: #23
     
    vjrybyk, Gruguir and GameDevCouple_I like this.
  40. GameDevCouple_I

    GameDevCouple_I

    Joined:
    Oct 5, 2013
    Posts:
    2,202
    Ahh! This makes sense now, thanks!
     
  41. Gruguir

    Gruguir

    Joined:
    Nov 30, 2010
    Posts:
    316
    @VladVNeykov thanks for the clarification. So we can definitely forget the legacy render path of the package. Meanwhile I fell back on the mesh GPU instancing option of the particle system; it works great.
     
  42. turboturboturbo

    turboturboturbo

    Joined:
    Dec 2, 2018
    Posts:
    17
    I'm trying to create a collider/attractor plane at the bottom of the viewable screen. Is this currently possible?
     
  43. Danua

    Danua

    Joined:
    Feb 20, 2015
    Posts:
    192
    Hello, does anybody know how to change the sorting mode, like in the Shuriken particle system? I've got a weird issue with smoke particles.
     
  44. ThomasVFX

    ThomasVFX

    Unity Technologies

    Joined:
    Jan 14, 2016
    Posts:
    26
    Hello!

    Particle sorting is set to automatic by default, so it should activate if you use an alpha-blended mode such as Alpha Blend or Premultiplied Alpha Blend. However, you can check in the Output context's inspector whether sorting is actually enabled.
    Also, please note that currently only camera depth sorting is available.

    upload_2019-1-8_10-30-5.png

    You can use a plane collider block in Update and expose a plane as a parameter, then use a VFX Plane Binder on the component to bind a Plane GameObject attached to the camera. If you need to apply some gravity towards the plane, you can use the plane normal and negate it to get the gravity vector.

    upload_2019-1-8_10-32-51.png

    Right now as we are in preview all this code is left as internal but you can take a look at this repository if you still want to put your hands in the code and write blocks/operators/contexts: https://github.com/peeweek/net.peeweek.vfxgraph-extras

    However, the API is subject to breaking changes in 2019, so that's the main reason we have not made it public at the moment.
    In the end it shall be, alongside other artist-friendly features such as subgraphs.


    Exactly, you can also disable aging and reaping of particles to perform your own age management.
    Also, disabling the automatic integration and using the block explicitly enables you to perform post-integration computations.

    The mass attribute defaults to one indeed, to avoid disturbing particle behaviors. However, you can set it in an Initialize context via a regular Set Attribute, or for instance use a "Calculate Mass from Volume" block. If you need to bake per-particle mass (and hopefully you use Houdini), you can use a Set Mass from Map block and a point cache to bake values into textures.
     
    createtheimaginable likes this.
  45. JulienF_Unity

    JulienF_Unity

    Unity Technologies

    Joined:
    Dec 17, 2015
    Posts:
    91
    Some additional note about per particle sorting:

    As Thomas said, only sorting based on camera distance is available at the moment, and it's automatically enabled by default with non-commutative blend modes. We plan to add other types of sorting (based on age, for instance) and also to allow you to specify your own sorting key (via the nodal interface) for a given system.

    Also, there is currently a bug in the sorting shader that can cause particles to flicker in some specific scenarios. @Danua This may explain the "weird issues" you're seeing; it will be addressed eventually.
     
  46. thinksquirrel_lily

    thinksquirrel_lily

    Joined:
    Feb 8, 2011
    Posts:
    1,177
    Huge thanks for the link and the additional info @ThomasVFX!

    I actually got an initial release out here:
    https://github.com/thinksquirrel/fluviofx
    https://twitter.com/getfluvio/status/1082598344466333696

    Will definitely take a look into that repo - we had to hack together a few things to make our initial release work, and I'd love to clean some of that up and keep in tandem with the API as it develops.

    I do have one bit of feedback though. I actually ran into a bit of a wall trying to implement spatial partitioning for particles. Basically, we need to be able to access a read/write buffer or texture _somewhere_ on the GPU in order to store grid positions and neighboring particles. Attributes haven't worked here because the sizes don't correlate to particle system capacity at all. Do you know of any way we could do this?

    Also, are particleIds and indexes guaranteed to be the same? If not, I'd love to have some way to access the current index in a block. I wasn't sure if I missed that or not. We've gotten around this so far by patching the shader files after they are generated, but it's a pretty ugly workaround.
     
    id0 and JulienF_Unity like this.
  47. JulienF_Unity

    JulienF_Unity

    Unity Technologies

    Joined:
    Dec 17, 2015
    Posts:
    91
    @thinksquirrel_lily Very nice work! Glad to see some people digging deeper in the tool.

    First of all, index and id are two different things:
    • id is guaranteed to be unique per particle (well, except it will cycle on 32-bit overflow). Basically it is incremented every time a particle is spawned.
    • index is the actual index of the particle and is used to access particle data in the attribute buffer.
    Both id and index are guaranteed not to change during the particle's lifetime. Index can be reused right away by a newly born particle, though. Alive particles are stored in a sparse way in the attribute buffer, meaning alive particles can be interleaved in the whole buffer with dead ones. There is no compaction of any sort. This is to guarantee the particle index remains constant throughout the entire particle life.

    About your questions, you're actually a bit too soon. All of this is planned for the future :). To give a little bit more info on steps that will be taken (no ETA though):
    • Give access to the particle index as a read-only attribute
    • Allow random read access (per index) of any particle attribute (probably not random write, though, to avoid race conditions)
    • We'll also add a built-in space partitioning pass to be able to later implement new behaviors based on it (flocking, particle vs particle collision, n-body sim, fluids...). But with random read of the particle buffer you'll also be able to build your own partitioning structure if you want to.
    So hopefully you will be able to do all you want without hacks :)
     
  48. JulienF_Unity

    JulienF_Unity

    Unity Technologies

    Joined:
    Dec 17, 2015
    Posts:
    91
    @Gruguir

    Just to clarify things about legacy render pipe:

    We initially started the VFX graph using it, as HDRP was not mature enough at that time. Since we already had the implementation, we thought it would be nice to share it with users. However, it only works with unlit outputs and we won't update it with new features. Also, we don't guarantee to maintain it, as our focus is really on SRP (HDRP at the moment and later also LWRP). So we provide the legacy implementation as-is and won't spend effort on it anymore, and we don't guarantee it won't break or be completely removed in the future.

    That being said, there was a typo recently in an include shader that prevented VFX from compiling. We have just fixed it and it will be in the next version (4.7.0), so VFX Graph (or at least a subset of it) should be usable in legacy (the fix is already on GitHub in VFXCommon.cginc on the vfx/2018.1-backport branch).
     
    vjrybyk, jashan and Gruguir like this.
  49. Gruguir

    Gruguir

    Joined:
    Nov 30, 2010
    Posts:
    316
    @JulienF_Unity that's how I understood it; anyway, it's good to know that it is usable again at the moment.
     
    vjrybyk likes this.
  50. jpfjord

    jpfjord

    Joined:
    Dec 19, 2017
    Posts:
    2
    I feel like an idiot... but is there an equivalent to Rotation over Time/Lifetime? I assumed it was the "Set Angle over Life" block, but that doesn't do anything no matter how much I adjust the curve. I feel like this should be pretty simple and that I'm just missing something obvious.