
Feedback Wanted: Visual Effect Graph

Discussion in 'Visual Effect Graph' started by ThomasVFX, Oct 21, 2018.

Thread Status:
Not open for further replies.
  1. Wonderment_by_Design

    Wonderment_by_Design

    Joined:
    Oct 23, 2017
    Posts:
    4
    Hi, I have the same error as this guy.

    I changed SettingsScope to SettingsScopes and now I get new errors and it's still not working :/

    The error I receive now is:

    NullReferenceException: Object reference not set to an instance of an object
    UnityEngine.Experimental.Rendering.GPUCopy..ctor (UnityEngine.ComputeShader shader) (at Packages/com.unity.render-pipelines.high-definition/Runtime/Core/CoreResources/GPUCopy.cs:16)
    UnityEngine.Experimental.Rendering.HDPipeline.HDRenderPipeline..ctor (UnityEngine.Experimental.Rendering.HDPipeline.HDRenderPipelineAsset asset) (at Packages/com.unity.render-pipelines.high-definition/Runtime/RenderPipeline/HDRenderPipeline.cs:243)
    UnityEngine.Experimental.Rendering.HDPipeline.HDRenderPipelineAsset.InternalCreatePipeline () (at Packages/com.unity.render-pipelines.high-definition/Runtime/RenderPipeline/HDRenderPipelineAsset.cs:23)
    UnityEngine.Experimental.Rendering.RenderPipelineAsset.CreatePipeline () (at C:/buildslave/unity/build/Runtime/Export/RenderPipeline/RenderPipelineAsset.cs:19)
    UnityEngine.Experimental.Rendering.RenderPipelineManager.PrepareRenderPipeline (UnityEngine.Experimental.Rendering.IRenderPipelineAsset pipe) (at C:/buildslave/unity/build/Runtime/Export/RenderPipeline/RenderPipelineManager.cs:55)
    UnityEngine.Experimental.Rendering.RenderPipelineManager.DoRenderLoop_Internal (UnityEngine.Experimental.Rendering.IRenderPipelineAsset pipe, UnityEngine.Camera[] cameras, System.IntPtr loopPtr) (at C:/buildslave/unity/build/Runtime/Export/RenderPipeline/RenderPipelineManager.cs:28)



    Does anyone know how to resolve this issue?

    Kind regards
     
  2. Desoxi

    Desoxi

    Joined:
    Apr 12, 2015
    Posts:
    195
    I get an error when connecting a "Get Particle Index" node to the index input of the "Set Position from Map" block:
    Code (CSharp):
    Shader error in '[System 1]Initialize': redefinition of 'particleIndex' at kernel CSMain at Graph.vfx(120) (on d3d11)

    upload_2018-11-15_16-0-55.png
     
  3. Desoxi

    Desoxi

    Joined:
    Apr 12, 2015
    Posts:
    195
    How can I add "Trigger events" to the Update context, like they're used in the GPUEvent_Simple sample scene?

    upload_2018-11-15_17-10-55.png

    I can't find it with any of the keywords or by searching manually through the available blocks.
     
  4. Desoxi

    Desoxi

    Joined:
    Apr 12, 2015
    Posts:
    195
    Just found out how to do it: you have to enable experimental blocks under Preferences > Visual Effects.
     
    terrylo109_unity likes this.
  5. racarate

    racarate

    Joined:
    Jul 14, 2012
    Posts:
    62
    With these GPU events, is there a simple way to respawn a secondary particle system with all the particles of the second system starting in the ending spot of the first system?

    The only thing I can think of is exposing a Boolean parameter and, when it is true, killing every particle of the first system by setting age to lifetime and THEN enabling a death GPU event.

    But I don't understand how to accomplish this if the death GPU event block is in my update context the entire time. Has anybody accomplished this?
     
  6. JulienF_Unity

    JulienF_Unity

    Unity Technologies

    Joined:
    Dec 17, 2015
    Posts:
    326
    @racarate You can do it like this (more or less what you said):
    • Expose a boolean for when you want to kill all particles in your first system (a small script sketch for driving this boolean from code follows below)
    • Add a "Set Alive" block in the first system's Update context, linked to this boolean (negated)
    • Add a "Trigger on Die" block to the same Update context, with the count slot linked to a Branch node that uses your exposed boolean as the predicate. Output 0 when the predicate is false and 1 when true. (Note that using "Trigger Always" would work here too.)
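    A minimal sketch of flipping that exposed boolean from script (the parameter name "KillAll" and the class name are purely illustrative, not from an actual sample):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Experimental.VFX; // namespace for VisualEffect in the current preview packages

    public class KillAllParticles : MonoBehaviour
    {
        private VisualEffect vfx;

        void Awake()
        {
            vfx = GetComponent<VisualEffect>();
        }

        // Call this (from UI, another script, etc.) to kill every particle in the first system.
        public void KillNow()
        {
            vfx.SetBool("KillAll", true); // "KillAll" = the exposed boolean from the first step
        }
    }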
    For later releases, we plan to have a generic way to activate/deactivate blocks based on predicates.

    Hope this helps and works. (Tell us otherwise)
     
    Last edited: Nov 15, 2018
    createtheimaginable likes this.
  7. AlexStrook

    AlexStrook

    Joined:
    Oct 19, 2013
    Posts:
    31
    Such a hassle to update and get everything working... Any ETA on when the Visual Effect Graph will be in the Package Manager? For people not familiar with Git, it's very hard to get everything up and running.
    I tried to update to 2019.1.0a9.

    upload_2018-11-17_16-30-31.png
     
  8. Aladine

    Aladine

    Joined:
    Jul 31, 2013
    Posts:
    195
    Hi,

    I'm just wondering why I can't find some of the blocks that exist in the template. For example, "Set Lifetime Random" is inside the Initialize context, but when I look for it I only get:

    "Set Lifetime"
    "Set Lifetime from curve"
    "Set Lifetime from map"

    There is no sign at all of how to create a "Set Lifetime Random" block; I had to drag and drop it from the template into my graph.


    Any help regarding that, please? Is this a bug or am I missing something?

    thanks!
     
  9. LeapGamer

    LeapGamer

    Joined:
    Dec 6, 2013
    Posts:
    4
    How do you set an exposed sphere position/radius or transform position/rotation? I'd like to update the position of an exposed inline Sphere, but I'm having trouble figuring that out even though I can edit the settings by hand in the inspector on the component.
     
  10. Desoxi

    Desoxi

    Joined:
    Apr 12, 2015
    Posts:
    195
    Hey, I just saw your question while scrolling through the pages. A few posts ago I wrote up the solution for this because I had the same question :)

    https://forum.unity.com/threads/feedback-wanted-visual-effect-graph.572110/page-4#post-3893860
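    In case the link goes stale, here is a minimal sketch of the same idea from C#. The parameter name "MySphere" and the flattened sub-property names are assumptions; check the exact names listed on the Visual Effect component:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Experimental.VFX;

    public class MoveExposedSphere : MonoBehaviour
    {
        public Transform target;    // drives the sphere centre
        public float radius = 0.5f;

        private VisualEffect vfx;

        void Awake()
        {
            vfx = GetComponent<VisualEffect>();
        }

        void Update()
        {
            // Compound exposed types (e.g. a Sphere named "MySphere") are addressed here
            // by flattened field names -- verify these against the component inspector.
            vfx.SetVector3("MySphere_center", target.position);
            vfx.SetFloat("MySphere_radius", radius);
        }
    }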
     
  11. Desoxi

    Desoxi

    Joined:
    Apr 12, 2015
    Posts:
    195
    I managed to export a vector field with the DCC tool Houdini to use it in the VEG.
    One thing is still a problem for me, and maybe some of the Houdini users here can help me with it.
    The exporter expects the volume to have the same X, Y and Z size.
    How can I expand the Houdini volume to be cubic without actually changing the shape I have inside the volume?
    Like expanding the boundaries.

    This is the volume I want to export:

    volume.PNG

    But as you can see, it's not cubic: the Y size is smaller than the X and Z sizes. How can I expand the Y size to be the same as X and Z?
     
    Last edited: Nov 19, 2018
  12. Desoxi

    Desoxi

    Joined:
    Apr 12, 2015
    Posts:
    195
    This seems to work with a node called "Volume Resize", though it is a bit awkward to use.
     
  13. Grimreaper358

    Grimreaper358

    Joined:
    Apr 8, 2013
    Posts:
    789
    Visual Effect Graph Unite LA Talk - Really good info in the talk as well as future plans for the VFX Toolbox and VFX Graph.

    The video for those who want to watch -


    For those who just want to see what's planned (a lot of it has been asked and answered here, but now you can see what else is planned):

    VFX Toolbox
    upload_2018-11-19_9-45-14.png

    VFX Graph
    upload_2018-11-19_9-45-47.png
     
  14. kdkd

    kdkd

    Joined:
    Nov 21, 2017
    Posts:
    26
    I love the awesome GPU trails you guys put all over the covers.
    Is there a timeline for them (the trails), or a specific beta branch I can download to test them out now?
     
    Last edited: Nov 20, 2018
  15. Sebboudreau

    Sebboudreau

    Joined:
    Apr 27, 2018
    Posts:
    1
    Hi! I'm trying to change the value of Set Lifetime with an "Age Over Lifetime" node, but it doesn't seem to work at all. If I remove the input to the "Sine Wave" and manually move the value, it does work.

    I've tried in and out of Play mode, but nothing happens. What am I missing?
     

    Attached Files:

  16. racarate

    racarate

    Joined:
    Jul 14, 2012
    Posts:
    62
    Hey gang! I finally got around to uploading a 2D texture to my particle system (feeding VEG the 640x320 depth map from a Kinect and using it to drive emission). I'm noticing the weirdest thing, though... I have to disable and then re-enable the GameObject to get it to work.

    This is my first time feeding in a 2D texture parameter, so any number of things could be going wrong, but are there any known issues where feeding a texture of a different size or something breaks the parameter binding?

    The crazy part is that I can see the depth texture updating in the parameter field. It just doesn't seem to get sent to the VEG particle system unless I disable and re-enable the GameObject at runtime.

    My workaround right now is to have a script disable and then re-enable the GameObject holding the VEG particle system on frame 42, but I would love to know what is really going on here.
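    A rough sketch of that workaround, in case it helps anyone else (frame 42 is just the frame from my setup; the class and field names are arbitrary):

    Code (CSharp):
    using System.Collections;
    using UnityEngine;

    public class VfxRebindWorkaround : MonoBehaviour
    {
        public GameObject vfxObject; // the GameObject holding the VEG particle system

        IEnumerator Start()
        {
            // Wait until frame 42, then toggle the object off and on again
            // so the texture parameter binding actually takes effect.
            while (Time.frameCount < 42)
                yield return null;

            vfxObject.SetActive(false);
            yield return null; // stay disabled for one frame
            vfxObject.SetActive(true);
        }
    }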

    P.S. The only other clue is that when it doesn't work there is an extra "HIDDEN/VFX/QUAD" component that shows up in my GameObject. When things are working properly that doesn't show up.

    P.P.S. Clicking "edit" for the VFX graph while it is running also seems to switch it from not working to working.
     
  17. bxie2

    bxie2

    Joined:
    Sep 24, 2018
    Posts:
    1
    What about macOS?
    I was trying to follow these two tutorials:


    but I can't reopen the project anymore. upload_2018-11-25_13-17-42.png

    I couldn't run "FirstTimeSetup.bat" either, even though I have Git for macOS.
     
  18. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    Not sure if it was just something in particular about a specific graph I've been using for months, but something changed on the GitHub branch release/2018.3 in the last week that drastically changed the visual result. I eventually tracked it down to some sort of issue with a particle size parameter I had exposed. I was previously setting the value to something very low like 0.0055, and I managed to get the previous look back by changing this number to 0.055.

    I'm not sure what the underlying reason for this was, though I see that there was a whole bunch of renaming of the size attribute to be called scale instead. Anyway, I'm not reporting this as a bug, since for all I know the old behaviour was the faulty one and they fixed something, or it was an issue with my graph or how it was updated in particular, rather than a new 'fault' with the system.

    I'm just mentioning it in case anyone else updates and their systems suddenly look faint. I didn't notice the issue at all to start with, because it only happens once the graph is recompiled using a recent version of VFX, not upon running a graph that hasn't needed to be recompiled recently by the latest VFX system.
     
    jashan and racarate like this.
  19. kdkd

    kdkd

    Joined:
    Nov 21, 2017
    Posts:
    26
    How do I set the duration of the emitter?

    Curiously, I couldn't find the answer to this basic question.
     
  20. andybak

    andybak

    Joined:
    Jan 14, 2017
    Posts:
    569
    What branch are people using?

    vfx/master hasn't been touched since the 14th. master itself seems broken for all the samples I've collected. /origin/release/2018.3 looks promising but I'm using 2019 with vfx graph...
     
    jashan likes this.
  21. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    It was indicated on the forum in recent weeks that the HDRP team were still mostly focussed on 2018.3 and were not quite prepared for public 2019.1 alpha, even though the current master branch is for 2019.1. So I stuck to 2018.3 betas myself for HDRP and VFX work at this moment. Having said that, in many ways master and release/2018.3 are similar, with plenty of back porting to release/2018.3 having been going on, including for the VFX system.

    As for samples, there are probably all sorts of possibilities for what's gone on, and as I mentioned in a post recently, some stuff in VFX changed last week that upset at least one of my graphs. When this happened, I decided it was probably a good idea to update all the VFX samples that I had gotten from the TestProjects/VisualEffectGraph/Assets/AllTests/VFXTests/GraphicsTests folder on GitHub. I have not updated the Unity package that I offered a while ago in this thread containing a bunch of those, and I don't intend to, since with multiple versions of things around it would be easy for me to do more harm than good, and it is best for people to pick the matching versions from GitHub themselves. I will probably remove my version shortly to reduce confusion.

    I am also out of date with what is available in the package manager now, but assuming 4.3.0 recently came out in the package manager I might try and stick with that version for a while.
     
    jashan likes this.
  22. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,307
    Is 2019.1a10 supported at all? I'm getting quite a few compilation errors both with /release/2018.3 and also /master.

    After quite a bit of hassle (mostly due to me not having worked with the package manager before, and, um, not reading the instructions on the Scriptable Render Pipeline GitHub site), I finally have it working on 2018.3b11 but no luck with 2019.1a10 so far.
     
  23. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    Well I certainly wouldn't try the 2018.3 branch with a 2019 alpha of Unity.

    It's also possible that master will work with the 2019 alpha sometimes, but not every week. E.g. there was a situation just a few weeks ago where the 2018.3 version on GitHub had a line of code in it that only worked with a newer version of the 2018.3 beta than was actually publicly available at that time, and we had to wait some days or weeks before everything was in alignment and working again. The same probably happens with the master branch at times too, or there could be other reasons why it isn't working at a particular point. All I really know is that we were advised recently not to use the 2019.1 alpha for HDRP testing. I know this info will be out of date at some point, and there could be a delay between that happening and me finding out, but I was certainly put off going down that route for now.
     
  24. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,307
  25. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
  26. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,307
    I have just played around a little bit with it and it looks really cool. A few questions that arose (I don't have VFX experience, so this is from a newbie perspective):

    1) It seems to me that having rotation for torus, cone and cube would be really helpful. For example, when I want to have a rotating torus for creating the particles, the way it's currently done, I'm not even sure this can be done at all (rotation from the transform is only applied when using local but local also moves / rotates the particles currently being simulated; I only want to animate the source of the particles).

    2) We can apparently combine position blocks, e.g. a torus and a line to create a tunnel. This seems really powerful but, again, when I want to rotate such a "tunnel", it gets really tricky. Also, it would be nice to have both options: adding the positions of different blocks (to be able to have a torus and a line), as well as what we currently have (which may also need to become a little more intuitive).

    3) Maybe this does not exist, yet, but having a way to globally scale time for the effect would be really useful. I tried to find this under update but actually, this could be a block in various contexts (Update, Initialize, Output; and/or Global). The idea is to briefly speed up or slow down individual effects, or different parts of one effect independently of the actual game time.

    4) Is this supposed to already work with VR? It's a little unclear to me if even the HDRP is working with VR - I found this VR Support in HDRP, but I'm not perfectly sure this works with this version.

    5) The parameters section has a bug where changing its height (drag and drop on the lower or upper edge of the panel) does not actually change the height, but does mess up the scroll area.
     
  27. GlitchInTheMatrix

    GlitchInTheMatrix

    Joined:
    Apr 12, 2010
    Posts:
    285
    Hi guys, is the Supported Data Types sample scene available to download anywhere? I want to check how they made those effects.

    I'm talking about today's Unity blog news LINK
     
  28. eric_delappe

    eric_delappe

    Joined:
    Dec 10, 2015
    Posts:
    26
    Will there be a way to spawn particles on the surface of a skinned mesh renderer, as is possible with the shuriken shape module?

    Will there be integration with shuriken? I would like to trigger a GPU event every time a shuriken particle is emitted or dies, for example.
     
  29. JulienF_Unity

    JulienF_Unity

    Unity Technologies

    Joined:
    Dec 17, 2015
    Posts:
    326
    Yes, we have recently changed the way we store size. Before, size was a Vector3 that defaulted to 0.1, and when only the x component was used, the size was considered uniform (this was done to avoid storing useless data: one float for uniform size instead of three). This caused some issues. Now we've changed to a single float for the size attribute (default 0.1) and a float3 scale attribute per axis (default 1). If your systems used components other than x for the size, they are converted to scale in the new version. Therefore, with the default size being 0.1, all your particles are scaled down to 10%.

    TL;DR: Use size for uniform size, and only set the scale attribute components you need for non-uniform size.

    1) Yes, we plan on adding a way to transform shapes. In the meantime, you can always transform your position afterwards by adding a Set Position block connected to a Transform Position node, itself connected to a Get Position node.

    3) No, at the moment you can only set it from script via VisualEffect.playRate, but we can think about adding a better way to control it (via Timeline, for instance). A minimal script sketch is shown below.

    4) This is more a HDRP question than a VFX Graph question.

    Yes spawning on mesh (without the need of point cache conversion) and skinned mesh is planned.

    No Shuriken / VFX Graph interoperability is planned. Instead, we plan for Shuriken features to be a subset of VFX Graph features in the future, so everything you can do in the current particle system will be doable in VFX Graph.
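    Regarding point 3, a minimal sketch of driving playRate from a component (the class and field names are just for illustration):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Experimental.VFX;

    public class VfxTimeScale : MonoBehaviour
    {
        [Range(0f, 4f)] public float rate = 1f; // 1 = normal speed

        private VisualEffect vfx;

        void Awake()
        {
            vfx = GetComponent<VisualEffect>();
        }

        void Update()
        {
            // Speed up or slow down this effect independently of game time.
            vfx.playRate = rate;
        }
    }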
     
  30. racarate

    racarate

    Joined:
    Jul 14, 2012
    Posts:
    62
    Is motion blur currently supported in VEG? I see the toggle for it in the lit quad shader, but I'm having no luck seeing the results of it... Does anybody have motion blur working in VEG?
     
  31. Kinas10

    Kinas10

    Joined:
    Mar 10, 2018
    Posts:
    1
    I'm currently trying to import it into Unity 2019.1 (I've got everything set up correctly there). In HDRP it imports correctly, saying the usual "fully updated" and so on; however, when I actually right-click in my Project window, it doesn't show up.
     
  32. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,307
    Is there any documentation available, in addition to the demo project (which has nice notes - that was quite helpful)?

    More specifically, I'm looking for a way to generate particles based on a texture. The particles should use the RGB-colors and particle instantiation probability per pixel should be based on the alpha value.

    In a second step, I'd love to be able to attract those same particles to another image based on the color values (i.e. particles from image A shall be attracted to locations in image B where the pixels have the same or a similar color to the given pixel).

    How would I go about this?
     
  33. racarate

    racarate

    Joined:
    Jul 14, 2012
    Posts:
    62
    @jashan I just birth the particles on the texture and set life to zero if they are outside my mask (currently using a black-and-white body mask texture from an Orbbec Astra).

    The second part is harder; it sounds like an n² problem... like the n-body problem... I don't think there is currently a way to loop through all particles for each particle... would love guidance on how to do that in a custom node though, these things are just compute shaders after all!

    Maybe a probabilistic approach would work, where you randomly sample eight locations in the secondary texture... use an unused float (like texIndex) as your boolean for haveICheckedTheSecondImageYet, or instead of a boolean use a float timer and recheck when it reaches zero.

    P.S. If your images are not generated every frame, maybe you can make three SDFs, one for each color channel of your image.
     
    jashan likes this.
  34. JulienF_Unity

    JulienF_Unity

    Unity Technologies

    Joined:
    Dec 17, 2015
    Posts:
    326
    @jashan First point is easy. I did a very quick graph to demonstrate.

    Generate UVs (here, randomly) and sample the texture; use a Compare with the sampled alpha and a random number to keep particles based on the probability defined by the alpha (setting the Alive attribute to false in Initialize will kill the particle before it's even born). Use the sampled color for the particle color and the UVs for the position.

    The small trick here is how the random numbers are generated. For the UVs, the randoms are set to be constant per particle, meaning each time it's evaluated, the node will output the same random number (but each particle has its own). The hash is used to control correlation; it is set to a different number (here 0 and 1) so that we don't get the same random number for u and v. If the randoms for the UVs were not set to constant per particle, each evaluation for the same particle (once per block) would return different randoms. For the probability random we don't care, so constant is not set.

    For the second point, it can hardly be achieved efficiently without preprocessing some data. If you know both textures beforehand, you can select one or a few destination pixels per source pixel and store them in a texture, for instance. Racarate's suggestions seem viable as well. (Just a note: instead of using an unused attribute, you can also create your own named one using custom attributes that you can set and get by name.)

    For documentation, there's some on the wiki of the GitHub SRP repository (but it's quite basic and some parts are probably outdated): https://github.com/Unity-Technologies/ScriptableRenderPipeline/wiki/Visual-Effect-Graph

    Hope this helps and keep us informed of your progress :)
     

    Attached Files:

    Last edited: Nov 30, 2018
    jashan likes this.
  35. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,307
    Cool, thank you both @racarate and @JulienF_Unity! The SpawnOnTexture image helps a lot - I'll give this a try right now. Also, I just read through the whole documentation and this also cleared a few things up.

    Is there a reason why we cannot route multiple Initialize contexts into a single Update context? I can route multiple Spawn contexts into a single Initialize but apparently, Update can only take a single Initialize. Maybe it's because of capacity ... if so, it would be cool if those capacities would simply be added up. The reason I'd like to have this is because I need one effect where particles are generated in 8 different, independent locations - but once they are created, they should get the same treatment. I can probably work around this by re-building the Position (Line) Block, so I can have several of those and then Set Position, but multiple Initialize contexts would be much more convenient.

    Regarding the "moving particles from one texture to another", I can do pre-processing and one solution might in fact be having different particle systems (i.e. Visual Effects / Initialize / Update paths) for different color ranges, and then splitting those images based on the color ranges. This will probably involve some significant authoring on my end but that will be necessary, anyways, due to how those color-particles are meant to behave (e.g. white particles will do different things compared to violet particles).

    Btw, I just noticed today that the bug I reported (5 / Blackboard resizing) seems to be sporadic. When I resized that panel today, it all worked as expected and I wasn't able to reproduce the issue.

    Once this thing is ready, I'll probably make it available as a demo - wish me luck, and, um, smarts ;-)
     
  36. hunz

    hunz

    Joined:
    Oct 16, 2012
    Posts:
    29
    Do you know if there will be any field visual feedback when doing Time * X into other nodes? At the moment when I hook anything up into a float / vector using this setup it's represented by "-" in the field. I would love to see the float / vector values going up and down in value given the operations chain I've put before it.

    Maybe it's there and I haven't enabled it too :D

    Thanks for an amazing tool.
     
  37. Jet-Systems

    Jet-Systems

    Joined:
    Mar 10, 2017
    Posts:
    4
    Yop everyone !

    Any news regarding this vector field thing?
    Does anyone know how to create one from a mesh?

    Here is what I found :
    - This Blender Plugin to export FGA files : https://github.com/isathar/Blender_UE4_VectorFieldEditor
    - This script to convert FGA Files to 3D Textures : https://realtimevfx.com/t/fga-to-3d-texture-question/6305

    Though, when you export the FGA from Blender, you have to change the extension to .txt for it to work with the converter script.
    A pure little mess, haha.

    But the point is that I've got some weird results using this technique. Particles seem to follow a path, but, well, it doesn't seem to be the right one.

    Thanks for your help !
     
  38. racarate

    racarate

    Joined:
    Jul 14, 2012
    Posts:
    62
    I would really recommend the Houdini tools they released; they work with the free version of Houdini and also take care of authoring SDFs:

    https://github.com/Unity-Technologies/VFXToolbox

    The real thing that would help is if there were a better way to visualize the vectorfield INSIDE of Unity, or if the node allowed you to specify a falloff instead of relying on clamping or tiling the 3D texture.

    P.S. In terms of creating a vectorfield from a mesh, how it is usually interpreted is that the vector field encodes the flow AROUND a mesh. For example, in Houdini if you plug in a SDF to the curl noise field you get curl noise that flows somewhat AROUND an object:



    That curl noise is just an approximation of the real way, which is to bake out some frames of an actual Navier-Stokes fluid simulation... gas or liquid or plasma or whatever... with your mesh as an obstacle so you get nice flow lines. See the Valve talk on (2D, shading-only) vector fields from 2010:

    https://steamcdn-a.akamaihd.net/apps/valve/2010/siggraph2010_vlachos_waterflow.pdf
     
    Last edited: Dec 1, 2018
  39. Jet-Systems

    Jet-Systems

    Joined:
    Mar 10, 2017
    Posts:
    4
    Hey @racarate,

    Thanks for the info, I'll check this out and see if I'm comfortable with it ^^
     
  40. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,307
    Another question for @JulienF_Unity: Do you have an ETA for when LWRP or, more specifically, VR Lightweight RP will be supported?

    For the current test project that I'm working on (can't wait to share ;-) ), we're not even using light sources: It's all just particles and one type of object that is currently using the HDRenderPipeline/Unlit shader (with surface type: Transparent, and emission but no Fog or Pre Refraction Pass).

    I have almost all features in the HDRenderPipelineAsset disabled, and VR seems to be working fine now but we may actually be better off using the lightweight pipeline.

    We do make heavy use of postprocessing effects, though. Also, we use Linear and HDR.
     
    MadeFromPolygons and Alverik like this.
  41. asa989

    asa989

    Joined:
    Dec 18, 2015
    Posts:
    52
    I love the fact that we can see the compute code. Is there any way we can modify it, or get data (compute buffers) from it on the C# side? The first line in the generated compute code says it's a copy and that modifying it won't affect the original code!


    Great job on the VFX Graph using compute shaders, though.
    Cheers.
     
  42. dpeter99

    dpeter99

    Joined:
    Sep 29, 2013
    Posts:
    7
    Hey
    I'm trying out VFX Graph, and trying to do an effect where I project a grid onto what the player can see.
    upload_2018-12-3_16-40-45.png
    I've got this graph so far, but it doesn't really do anything in terms of moving the points on the Z axis.

    This stuff is great, by the way. Keep up the good work.
     
    Alverik likes this.
  43. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,307
    I will just say three letters: O.M.F.G.!!! ;-)

    But I'm afraid I'll need to get myself a 2080 Ti now, for the lulz.
     
  44. carpetneon

    carpetneon

    Joined:
    Mar 5, 2018
    Posts:
    3
    Hi all,
    I'm trying to work out how to get control of custom variable parameters which I create in the VFX Graph window to drive things in the graph... for example, to control Rate via a C# script.


    I was able to do this with Shader Graph by accessing the Renderer component like below:


    private Renderer thisRend;

    thisRend = GetComponent<Renderer>();

    thisRend.material.SetFloat("VectorBlabla", floatvalue);

    Can anyone shed some light on how to access floats (or ints or anything) created in the Visual Effect Graph?
     
  45. elbows

    elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,502
    It is very similar, something like this:

    using UnityEngine.Experimental.VFX;

    private VisualEffect thisVFX;

    thisVFX = GetComponent<VisualEffect>();

    thisVFX.SetFloat("MyParameterNameGoesHere", floatvalue);
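    Put together as a complete component, it might look something like this (assuming the exposed float in the graph is named "Rate"; HasFloat is just a guard against typos in the name):

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Experimental.VFX;

    public class VfxRateController : MonoBehaviour
    {
        [SerializeField] private float rate = 32f; // value pushed into the graph each frame

        private VisualEffect thisVFX;

        void Start()
        {
            thisVFX = GetComponent<VisualEffect>();
        }

        void Update()
        {
            // Only set the parameter if the graph actually exposes it under this name.
            if (thisVFX.HasFloat("Rate"))
                thisVFX.SetFloat("Rate", rate);
        }
    }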
     
    PixelPup and asa989 like this.
  46. AnException

    AnException

    Joined:
    Apr 3, 2016
    Posts:
    3
    Hi!

    I wanted to try out a simple effect I made with the VFX Graph on Xbox One packaged as a UWP game, but it didn't appear in the scene. I packaged the same project for PC (not as UWP) and it worked correctly.
    Is UWP already supported, and did I miss something? If not, is support for UWP planned?

    Edit: it seems like the problem only appears if the Graphics API is set to DX12. Sadly, I'm having issues using HDRP on Xbox with DX11. But when tested on PC (as a UWP package) with DX12 the effect didn't appear; with DX11 it did. Any ideas for resolving this?

    Thanks!
     
    Last edited: Dec 4, 2018
  47. jashan

    jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,307
    It seems that the .vfx files in the project, which store the effect graph, change without the user making any actual changes: I put my little test project under version control today, opened the project, and didn't do anything (though one of the graphs was open because that pane was still open) - and got quite a few changes in the .vfx file. I opened another one, without doing anything in the graph, and that also got changed.

    From the diff, it almost looks like there were actual functional changes but it's a bit hard to read. One possible explanation might be that some changes that I had applied in my last session were stored in some sort of cache and now written into the VFX-files. But it's definitely strange.

    One thing that apparently throws off the text diff is that whitespace in the shader code is escaped, so instead of newlines and tabs there are plenty of \t and \n, and it's just one big blob of text. This will make merging, or generally working with version control (also to understand changes that were made in the process of polishing an effect), very hard, if not impossible.

    Hopefully, there is an easy way for you to put that generated code into separate files that don't require any escaping. If all the information needed to autogenerate those shaders is in the other parts of the VFX file, it would be preferable to not even version control those generated shader files (so they would need to have an extension or something that makes it easy to put them into gitignore). If there is information in there that is read by the graph editor, the files should be "version-controllable" (i.e. separate files with proper whitespace).
     
  48. Desoxi

    Desoxi

    Joined:
    Apr 12, 2015
    Posts:
    195
    Hey guys, I've made a short tutorial video on how to generate SDFs and point caches from meshes with Houdini, using the export plugin from Unity. Hope this helps you get started with SDF fun :)



    Going to post more if I have the time: https://twitter.com/ADesoxi
     
  49. Desoxi

    Desoxi

    Joined:
    Apr 12, 2015
    Posts:
    195
    Do you know a good way to export the generated trails with the curl noise as a volume for use in Unity?
     
  50. carpetneon

    carpetneon

    Joined:
    Mar 5, 2018
    Posts:
    3
    thanks Elbows!!! :)
     
    elbows likes this.