
Feedback Wanted: Visual Effect Graph

Discussion in 'Graphics Experimental Previews' started by ThomasVFX, Oct 21, 2018.

  1. JulienF_Unity

    Unity Technologies

    Joined:
    Dec 17, 2015
    Posts:
    68
    @jpfjord

"Set Angle over life time" should work, assuming your particles have a lifetime (are not immortal). Otherwise you can use "Set Angle over age" (immortal particles can still age).

For the angle, also pay attention to the axis you want to rotate about (for billboards it will generally be the Z axis).

    We also have an attribute called "Angular Velocity", so you can specify a rotation speed per axis and at each update, particle angles will be integrated based on their angular velocity.
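In plain C#, that integration step amounts to something like the sketch below (conceptual only; this is not the actual VFX Graph implementation, and the struct and field names are made up):

```csharp
using System;

// Sketch of per-particle angle integration from angular velocity
// (simple Euler step, degrees per second), mirroring what an update pass does.
public struct ParticleRotation
{
    public float AngleX, AngleY, AngleZ;                               // degrees
    public float AngularVelocityX, AngularVelocityY, AngularVelocityZ; // degrees/second

    public void Integrate(float deltaTime)
    {
        // Each update, the angle advances by angularVelocity * dt on each axis.
        AngleX += AngularVelocityX * deltaTime;
        AngleY += AngularVelocityY * deltaTime;
        AngleZ += AngularVelocityZ * deltaTime;
    }
}
```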

If you still have issues getting this to work, a screenshot of your graph might help.
     
  2. thinksquirrel_lily

    Joined:
    Feb 8, 2011
    Posts:
    1,175
Thanks for the clarification on index. Great to hear that you'll be adding spatial partitioning and random read access as well; in fact, we wrote some code specifically to do that.

    It sounds like once these steps are taken (and an API is finalized enough to be public by default), we should be able to do everything with some vanilla C# and no reflection.

    Although I do have a question. How would you approach spatial partitioning by just using the particle buffer? Unless I'm mistaken, doesn't that buffer have a size of n * dataSize where n is particle capacity (aligned maybe) and dataSize is the amount of data per particle?

    In old Fluvio, we used an infinite grid structure with a size of x * y * z * maxParticlesPerCell, which is a completely different size than the particle buffer. We also used another buffer with a size of capacity * maxNeighbors to store the neighbors of each particle for subsequent steps of the simulation.

    Ignoring traversal concerns, I guess you could do some sort of sparse structure where we store an index per-particle (and this would save a ton of memory), but I'm not sure how we'd work around the issue of neighbors. Especially since we use multiple kernels (update contexts) for sync, and need that neighbor data for several steps.
     
  3. PixelPup

    Joined:
    Mar 6, 2018
    Posts:
    18
    I have been playing around with linking exposed parameters to UI elements like sliders and so far so good, but I ran across this section in the API that I am not sure I understand. It says...

    "Every function can refer to a parameter using either a string name or an int nameID. If you need to perform access to these parameters on a per-frame basis you should cache the name and perform calls with the ID using Shader.PropertyToID()"

    I am calling the exposed parameters by name right now and it is fine, but if I wanted to have the property be updated every frame I am not sure how to use the instructions correctly in this context. Any insight or code examples floating around for this?
     
  4. PaulDemeulenaere

    Unity Technologies

    Joined:
    Sep 29, 2016
    Posts:
    9
    @PixelPup
The Shader.PropertyToID() function performs a map lookup to retrieve the ID associated with a string. This can be expensive if you do it every frame.

As with materials, it's preferable to cache the result of Shader.PropertyToID().

Typically, you can replace this kind of code:
Code (CSharp):
void Update()
{
    myVisualEffect.SetVector3("exposed_name", Vector3.zero);
}
    by
Code (CSharp):
private static readonly int s_exposedNameID = Shader.PropertyToID("exposed_name");

void Update()
{
    myVisualEffect.SetVector3(s_exposedNameID, Vector3.zero);
}
Actually, the implementation of UnityEngine.Experimental.VFX.VisualEffect.SetVector3 is already a redirection:
Code (CSharp):
public void SetVector3(string name, Vector3 v)
{
    SetVector3(Shader.PropertyToID(name), v);
}
     
    PixelPup likes this.
  5. jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,085
    Oh my God - you just made my day. No. Week. No. Month! ;-)

    Only unlit outputs is no problem for us: The use case we have is a purely particle based reality so we might not even have lights in the scene. The big issue we were facing was that this is for a fairly old project (currently still on 2017.4 and I am waiting to upgrade to 2018.3 or 2019.1 until the very last moment because I expect a lot of trouble) ... and from what I have seen with HDRP so far, it's fine if you begin a new project with it but moving a large legacy project to HDRP might kill the project. Plus, it's a VR project, and so far, it looks like HDRP adds way too much overhead, so we're more likely to go with LWRP at some point but most likely will keep using Legacy until we absolutely have to move on.

    With the typo in Legacy fixed, I can test if our 2018.3 prototype project works with Legacy, and if it does, put the two things together and finally push out a demo ;-)
     
  6. JulienF_Unity

    Unity Technologies

    Joined:
    Dec 17, 2015
    Posts:
    68
Spatial partitioning will indeed use an additional buffer. I don't have technical details to share now, but the structure has to be a tradeoff between a fast access pattern (i.e. minimizing scattered dependent lookups) and memory footprint.

For neighbor caching, it looks like the attribute buffer could be used for this purpose and give an efficient access pattern. We use SoA for the attribute buffer, so a given attribute is stored contiguously in memory for all particles. It looks like all we need here is to add support for attributes as arrays, something like neighborIndex[N] in your case.

I won't go much deeper into technical details in this thread, but I'm happy to continue this conversation in PM if you wish.

Some other random things that can be of interest in your case:
• Solutions for fluid rendering are planned. Basically a built-in screen-space fluid output (i.e. a post-blurred version of the sphere output).
• On the UX side, we plan to decouple the 1-1 relationship between context nodes and actual compute passes. So in your case, instead of having to chain update contexts with specific blocks at each step, you could write a Fluvio update context, for instance, that generates all the passes you need and still allows behaviour composition via blocks for the final simulation pass.
     
    Last edited: Jan 9, 2019
  7. jashan

    Joined:
    Mar 9, 2007
    Posts:
    3,085
    Feature Request: Moving back in Time

    In our project, one fun feature we'd love to implement is the ability to arbitrarily move back and forward in time while doing session replays. At first, I thought I'd have to create a custom implementation to make this work for audio, and there is one small thing that we do have to do - but that's almost trivial (negative pitches do work, how cool is that!? ;-) ). As most of our game logic already uses AudioSource.time instead of Time.time, a negative pitch means that we already can move backwards in time.

    One trickier part is being able to revert our Visual Effects. I did a little experiment with setting playRate to negative values - but that only stops everything. It's probably clamped to 0 internally.

    Is that something that would even be feasible?

    One obvious requirement would be that everything is deterministic, and we would need to be able to set a "simulation time" so that, for example, when time is passing at -1, an effect that usually only lasts 2 seconds would be instantiated / started two seconds "before" it starts, then played backwards for two seconds, then "disappear".

    Internally, the same would have to be done for each individual particle. But the cool thing is that it would allow "scrubbing" complex effects back and forth during design time, too, so I think it would be a rather useful feature to have beyond our rather specific use case. There are obvious limits to feasibility: When there are forces involved, you'd need to know the past (which from that perspective, really is the future ;-) ).
     
  8. elbows

    Joined:
    Nov 28, 2009
    Posts:
    2,226
    Desoxi likes this.
  9. VladVNeykov

    Unity Technologies

    Joined:
    Sep 16, 2016
    Posts:
    39
    Hey jpfjord, you are correct - Angle over Life is the way to go. The classic use case for quads is to just rotate on the Z axis. Please note that the curve key values need to be set to the angle you want at that point (e.g. in the image here, the particle will rotate from 0 to 360 degrees over its lifetime).

     


  10. thinksquirrel_lily

    Joined:
    Feb 8, 2011
    Posts:
    1,175
    All fantastic stuff! I'll get in touch via PM.

    In general, I am so excited for the new open direction Unity is taking with packages, with great consideration to a variety of use cases. How great the VFX system is to work with really shows here.
     
  11. daniel5johansson

    Joined:
    Jun 21, 2018
    Posts:
    1
    Feature Request/Bug Report: "Orient: Along Velocity" should only rotate around Z axis.

    I'm not sure if "Orient: Along Velocity" behaves like it should. Currently it doesn't always point particles to the camera. This sometimes makes them infinitely thin when they change direction.

    I use "Orient: Along Velocity" to fake motion blur on fast moving particles, by stretching them along the axis they're moving in. But I also want the particles to always face the camera, keeping their original size and aspect ratio.

    The solution to this would be to only orient the particle based on its velocity on the X and Y axis relative to the camera. So, first do what "Orient: Face Camera Plane" does, then rotate the particles around their Z-axis to point in the direction of their velocity.

    Is this possible? Thanks
     
  12. PixelPup

    Joined:
    Mar 6, 2018
    Posts:
    18

OK, that took me a minute to grasp, but once I read the documentation about Shader.PropertyToID() it made more sense. I am wondering about these exposed properties, though. Right now, in order to access them with a slider, I have to re-assign them with a script so that they show up as non-static variables to the UI components. It's not the worst thing, but it's an extra step.
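For reference, one way to wire a slider to an exposed parameter while still caching the property ID (a sketch only: "Intensity" and the component names are hypothetical, and the namespace was UnityEngine.Experimental.VFX in the preview package but UnityEngine.VFX in later versions):

```csharp
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Experimental.VFX; // UnityEngine.VFX in later versions

// Binds a UI Slider directly to an exposed float on a VisualEffect,
// caching the property ID so updates avoid per-call string lookups.
public class VFXSliderBinding : MonoBehaviour
{
    [SerializeField] private VisualEffect _vfx;
    [SerializeField] private Slider _slider;

    // "Intensity" is a hypothetical exposed parameter name; use your own.
    private static readonly int s_intensityID = Shader.PropertyToID("Intensity");

    private void OnEnable()
    {
        _slider.onValueChanged.AddListener(OnSliderChanged);
    }

    private void OnDisable()
    {
        _slider.onValueChanged.RemoveListener(OnSliderChanged);
    }

    private void OnSliderChanged(float value)
    {
        _vfx.SetFloat(s_intensityID, value);
    }
}
```

With the listener approach the parameter is only written when the slider actually changes, rather than every frame.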
     
  13. vjrybyk

    Joined:
    Nov 19, 2015
    Posts:
    3
    Thanks for taking the time to look into and fix it, very much appreciated!
     
  14. jpfjord

    Joined:
    Dec 19, 2017
    Posts:
    2
@VladVNeykov I actually went with AngularVelocity, as I wanted random numbers spanning both positive and negative rotation. I have not seen how to do a random between two curves in the VFX Graph... am I just missing it? I also notice that AngularVelocity doesn't seem to care about physics at all. I'd love to be able to change the angular velocity based on collisions, for example.

    Feedback: A gizmo representing a turbulence block would be pretty helpful.
     
  15. hvent90

    Joined:
    Jun 18, 2018
    Posts:
    19
    Hi all,

    Quick question on preventing the culling of particles:

    I have a single burst of particles with an infinite lifetime that is attracted to another object. This object moves away from the VFX game object, and the particles are culled when the VFX game object is not in the camera's view.

    How can I see the particles when the VFX game object is not in the camera's view? If I attach the VFX game object's transform to the attractor object, then the particles do not physically move correctly (they are translated but the particles do not have any drag).

    Cheers!
     
    PaulBrooke likes this.
  16. Danua

    Joined:
    Feb 20, 2015
    Posts:
    183
Try extending the bounding box.
     
  17. id0

    Joined:
    Nov 23, 2012
    Posts:
    287
I wonder, can I do something like this? Not collision, just not rendering particles in specific areas.

    RainArea.jpg

    And how?
     
  18. VladVNeykov

    Unity Technologies

    Joined:
    Sep 16, 2016
    Posts:
    39
    You can use a Kill (AABox) block in Update:
    KillBoxBlock.PNG
     
    HypeMitch, Adam-Bailey and id0 like this.
  19. JulienF_Unity

    Unity Technologies

    Joined:
    Dec 17, 2015
    Posts:
    68
Rewinding particle systems has always been a challenging task, mainly because particles that die on a frame need to be born again when rewinding past that same frame. We don't have plans to handle that for now, but it should be possible to build a custom solution somehow. About playRate being clamped to 0: we didn't want to handle all the possible corner cases caused by a negative dt, but this is probably something we can take a look at to see how feasible it would be to enable it. Another solution would be to rerun the full simulation up to a given time (using the prewarm mechanism) to fake the rewind. The problem is that this can be quite expensive due to the number of update steps needed.

Orient Along Velocity works as expected: the up axis of the particle is aligned to its velocity, and the particle is then rotated around that axis to face the camera. This can cause strange rotation effects when the velocity vector is nearly collinear with the view vector. One way to avoid that would be to use some kind of screen-space capsule for particle rendering, which is something we may look at in the future. We also plan to add a more generic orientation block to give more control over your axes and constraints. In the meantime, you can also set your three axes in your output by using nodes (basically a couple of cross products to get an orthonormal basis oriented the way you want).

It's probably an issue with your system bounds. You can edit them in the Initialize context (selecting an Initialize context with a VisualEffect attached will spawn a box gizmo in your scene). In your case, you may want to run your simulation in world space (by clicking on "Local" in the top right of a context) and then attach your VFX component to your attractor.

You need a representation of your scene one way or another. Vlad's suggestion will work, but only for simple scenes, as you need to manually describe your scene using primitive boxes. There are many ways of achieving this. Another way of doing this effect (more technical, though) would be to render the scene height (depth) around your camera in a top-down ortho view and use it in the VFX graph to compute occlusion and decide whether or not rain particles need to be killed based on a height comparison. Using that, you could also spawn splashes at the intersection points, for instance. Maybe we'll build some samples to demonstrate such a technique in the future.
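A rough sketch of the C# side of that top-down depth approach (the exposed property names "_HeightMap" and "_HeightCamMatrix" are hypothetical and must match properties you expose in your own graph; the graph-side sampling and Kill logic is not shown):

```csharp
using UnityEngine;
using UnityEngine.Experimental.VFX; // UnityEngine.VFX in later versions

// Sketch: feed a top-down orthographic depth render of the scene to a VFX Graph
// so the graph can compare particle height against scene height (e.g. to kill
// rain particles under cover). All exposed names below are hypothetical.
public class RainOcclusionFeed : MonoBehaviour
{
    [SerializeField] private Camera _topDownCamera;   // orthographic, pointing straight down
    [SerializeField] private VisualEffect _vfx;
    [SerializeField] private RenderTexture _heightRT; // depth-format RT used as targetTexture

    private void Start()
    {
        _topDownCamera.orthographic = true;
        _topDownCamera.targetTexture = _heightRT;
    }

    private void Update()
    {
        // World-to-clip transform of the top-down camera, so the graph can
        // project a particle position into the height map's UV space.
        Matrix4x4 worldToHeightMap =
            _topDownCamera.projectionMatrix * _topDownCamera.worldToCameraMatrix;

        _vfx.SetMatrix4x4("_HeightCamMatrix", worldToHeightMap);
        _vfx.SetTexture("_HeightMap", _heightRT);
    }
}
```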

    There's no built-in interpolation between two curves in a block at the moment but you can do it using nodes:

    2-curves-lerp.png

As for rigid body simulation, we may implement a block for that at some point in the future.
     
    Last edited: Jan 10, 2019
    id0 and jashan like this.
  20. Desoxi

    Joined:
    Apr 12, 2015
    Posts:
    121
    Is there a way to lerp the position of particles between two point caches?
     
  21. echuang98

    Joined:
    Oct 21, 2015
    Posts:
    17
    Hello! I am trying to use Visual Effect Graph to make some point-cloud style visuals.
    This is what I want to achieve: point cloud with its colors!
    I can get this colored point cloud with the MaskFace PointCacheAsset from the official sample project:
    MorphFace.png
I want to use other models, so I downloaded a beautiful 3D-scanned obj file ("More Model Information" says that it contains vertex color) from this site:
    upload_2019-1-11_16-55-31.png
    I imported this obj file into Unity, and used pCache Tool to generate a pCache file. Below are my settings:
    pCacheTool.png
    When I replace MaskFace PointCacheAsset with the PointCacheAsset I just generated, there is no color on it!
    RuttenStump.png
I was wondering if I did something wrong or misunderstood something. It would be so great if anyone could help! Thank you!
     
  22. nonnicram

    Joined:
    Apr 3, 2014
    Posts:
    17
    Hey Guys,

I am trying to access the exposed parameters of the Visual Effect Graph via C#, but I don't know how. I don't really understand the official documentation on it. What I actually want to do is control an "exposed" float via script. I have enabled the "Exposed" option in the graph for a float, but can't find the variable. The script is attached to the same GameObject as the Visual Effect component.

With this code I get a NullReferenceException.

    Any help is appreciated!

Code (CSharp):
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.VFX;
using UnityEditor.VFX;
using UnityEditor.Experimental.Rendering.HDPipeline;
using UnityEditor.VFX.UI;

public class particleMainScript : MonoBehaviour
{
    UnityEngine.Experimental.VFX.VisualEffect visualEffect;
    float exposedParameter;

    void Start()
    {
        visualEffect = this.GetComponent<UnityEngine.Experimental.VFX.VisualEffect>();
        // exposedParameter = visualEffect. -> what to insert here to access the exposed parameter?
    }

    void Update()
    {
        if (Input.GetKey("down"))
        {
            visualEffect.Stop();
        }
        else
        {
            visualEffect.Play();
        }
    }
}
     
    Last edited: Jan 11, 2019
  23. echuang98

    Joined:
    Oct 21, 2015
    Posts:
    17
    I opened the .obj file with Notepad. There is color information:
    49351472_2035261626562360_8750338611049660416_o.jpg

    I also opened the .pcache file generated with pCache Tool. And I found that the colors become 1,1,1,1 or 0,0,0,0 or even 0.99999
    50301904_2035263103228879_5339131187765968896_n.jpg

I tried several different settings in pCache Tool, but no luck.
    Hope someone would shed some light on this!
     
  24. PixelPup

    Joined:
    Mar 6, 2018
    Posts:
    18

In order to do this, you need to include "using UnityEngine.Experimental.VFX;".
Then I assigned public variables to the things I wanted to change with, say, a UI slider, and then you can use _vfxObject.SetFloat("exposed parameter name string", floatValue);

I could not find a more direct way to access them for a UI element, as I needed public non-static floats for those.

    Here is the example I made with it...
Code (CSharp):
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Experimental.VFX;

public class VFXVariable : MonoBehaviour
{
    [SerializeField]
    private VisualEffect _vfxObject;

    // [SerializeField] is only valid on fields, so these auto-properties
    // are set from UI callbacks rather than the Inspector.
    public float _spawnrate { get; set; } = 250f;
    public float _turbulence { get; set; } = 0.2f;
    public float _gravity { get; set; } = 0f;

    private void Update()
    {
        _vfxObject.SetFloat("variableSpawnRate", _spawnrate);
        _vfxObject.SetFloat("turbulenceIntensity", _turbulence);
        _vfxObject.SetFloat("Gravity", _gravity);
    }
}

    A video of the result.
     
  25. PixelPup

    Joined:
    Mar 6, 2018
    Posts:
    18
    Also, for anyone who is curious, VectorRayGen plugin does work to convert standard .fga files into something VFX graph can understand. It just works. The fact it can also affect Rigid Bodies or CPU particles is a bonus for sure.
    This gives some interesting things to play with as you can chain multiple vector fields in the update.
     
  26. vx4

    Joined:
    Dec 11, 2012
    Posts:
    94
Feature Request:
Add a node to sample SDFs for procedural shapes (box, sphere, 2D circle, etc.), plus a simple SDF generated from a list of spheres. Also add a node-based editor for procedural SDFs, to animate and perform other operations on them (mainly focused on SDFs).
     
  27. id0

    Joined:
    Nov 23, 2012
    Posts:
    287
I do not understand where the parameter for particles to use light probes is (in the graph or elsewhere). When I first add a prefab with particles to the scene I see the "VFX Renderer" section and can even change this parameter, but after applying the prefab it breaks back to zero and the section disappears. I can see the changes in the prefab, but after adding the prefab to the scene the parameter is zero again. What kind of magic is this?
     
    Last edited: Jan 14, 2019
  28. caiqueassis

    Joined:
    Oct 13, 2018
    Posts:
    8
    Hello!

    I was messing around with the VFX Graph, trying to replicate the look / behaviour of some Particle Systems I had here in order to get the hang of it.

However, it seems that some of the things available in the Particle System are not in the VFX Graph: things like Trails, damping speed/velocity, choosing the color from more than one gradient, and Lights, for instance.

    Is there any way to add those behaviours to the VFX Graph? Do I need to "make them by hand" via the nodes, will they be added in the future, or would I need to add them myself via C# when the API becomes public?
     
    Last edited: Jan 14, 2019
  29. VladVNeykov

    Unity Technologies

    Joined:
    Sep 16, 2016
    Posts:
    39
    Hey echuang98,

I might be wrong, but I'm not sure whether obj files support vertex colors by default (my export from 3ds Max doesn't seem to mention any), so the file you are using could be using something custom. You can try attaching a vertex color shader (like some of the particle shaders in the built-in pipeline) to see if your mesh's vertex colors show.

    I've attached a simple fbx cube mesh here with vertex-colored sides for you to try out in the VFX graph: https://drive.google.com/open?id=11vlYLeEkHhW5svlRGlY-5EQflpIJATdS
     
    createtheimaginable likes this.
  30. VladVNeykov

    Unity Technologies

    Joined:
    Sep 16, 2016
    Posts:
    39
You are correct; the VFX Graph is under continuous development (available as a preview package at the moment), so a lot of functionality is still to be implemented.

    - To dampen speed, you can use a Linear Drag block in update.
- Trails, as you noted, are not implemented yet, although for some scenarios there are workarounds, usually involving Position and Target Position and using a Connect Target block in Update to make each particle point to the position of the next one (which you can know if you've set both manually in some predictable fashion, so it doesn't cover all scenarios).
- For a random between two gradients over lifetime, you can do something like this:
2019-01-14_16-46-05.png
- Lights are not implemented yet.
     
    jashan, caiqueassis and PixelPup like this.
  31. Danua

    Joined:
    Feb 20, 2015
    Posts:
    183
How can I start applying drag to particles after 2 seconds?
     
  32. thinksquirrel_lily

    Joined:
    Feb 8, 2011
    Posts:
    1,175
    Hm, is there any way to bake a VFX system (or any plans to implement this)? I'd love to experiment with simulating VFX and then baking the data (sacrificing interactivity) for later playback.
     
  33. echuang98

    Joined:
    Oct 21, 2015
    Posts:
    17
    Hi VladVNeykov,
    Yes, I think you're right!
    I think the obj file does have vertex color because I can see the color information when I open it with Notepad. I can also see its colors in MeshLab.
    However Unity can't import vertex color from obj files.
    NoColorFromObj.png
I exported the obj to the dae format with MeshLab, and Unity can import the colors from the dae!
    ColorFromDae.png
    I used pCache Tool to convert this mesh to a pcache file. With this pcache I can display the colors on the point cloud now!
    Thanks a lot :D
     
    Last edited: Jan 15, 2019
    VladVNeykov likes this.
  34. Deytooh

    Joined:
    Jan 14, 2015
    Posts:
    6
Hey! I am currently downloading the VFX Graph to try it out, but I have a compilation error so I can't do anything.

    Unity Version : 2019.1.0a13
    VFX Graph Version : 5.2.3

    Error :

    "Library\PackageCache\com.unity.visualeffectgraph@5.2.3-preview\Editor\Expressions\VFXExpressionTransform.cs(64,47): error CS0117: 'VFXExpressionOperation' does not contain a definition for 'InverseTRS'"

In VFXExpressionTransform.cs:

Code (CSharp):
class VFXExpressionInverseMatrix : VFXExpression
{
    public VFXExpressionInverseMatrix() : this(VFXValue<Matrix4x4>.Default)
    {}

    public VFXExpressionInverseMatrix(VFXExpression parent) : base(VFXExpression.Flags.InvalidOnGPU, parent)
    {}

    public override VFXExpressionOperation operation
    {
        get
        {
            return VFXExpressionOperation.InverseTRS;
        }
    }

    sealed protected override VFXExpression Evaluate(VFXExpression[] constParents)
    {
        var matrix = constParents[0].Get<Matrix4x4>();
        return VFXValue.Constant(matrix.inverse);
    }
}


The error is on the ".InverseTRS" line.

What should I do?
     
    Danua likes this.
  35. whidzee

    Joined:
    Nov 20, 2012
    Posts:
    83
How would you go about spawning particles from particles? I am thinking of something like fireworks: you could have particles shooting up into the sky, and when they reach the top they explode by spawning a bunch of new particles. These new particles could have different effects on them, like gravity. You could also use a system like this for volumetric clouds. I'd imagine you could use particles spawned across the sky as cloud seeds, and around each cloud seed spawn a cluster of particles, which would be the cloud itself, including internal motion. The clouds could then move across the sky as the cloud seeds are pushed along at a certain rate.

    How would I go about achieving this?
     
  36. PaulDemeulenaere

    Unity Technologies

    Joined:
    Sep 29, 2016
    Posts:
    9
    @Deytooh
Sorry for the inconvenience; the currently released package isn't compatible with the latest public alpha. We are in the process of publishing another package.

You can use the latest 2018.3, or alternatively, if you really want to use 2019.1, use a local package by modifying your manifest.json after downloading or cloning the master branch at https://github.com/Unity-Technologies/ScriptableRenderPipeline
     
  37. patrick_scheper

    Joined:
    Mar 18, 2015
    Posts:
    16
Someone here asked about Target Position from a map, and I would really appreciate it if someone could elaborate on that.

What does the "connect block" exactly mean? How is it used? I would like the particles to flow to a specific point, based on an attribute map. Is that possible with Target Position?

    Thank you so much.
     
    createtheimaginable likes this.
  38. NateMohler

    Joined:
    Feb 27, 2018
    Posts:
    1
    Hello all,

Has anyone written a script to trigger an event send so that particles spawn at the mouse click location? I thought this would be easier than it's turning out to be.

    Thanks in Advance!
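One possible approach (a sketch, not an official recipe): use VisualEffect.SendEvent with a VFXEventAttribute carrying the click position. This assumes your graph has a spawn event named "OnClick" and reads a "position" event attribute (both names are placeholders that must match your graph), and uses the preview-era UnityEngine.Experimental.VFX namespace:

```csharp
using UnityEngine;
using UnityEngine.Experimental.VFX; // UnityEngine.VFX in later versions

// On left click, project the mouse onto a plane in front of the camera and
// send a VFX event whose attribute payload carries the spawn position.
public class ClickSpawner : MonoBehaviour
{
    [SerializeField] private VisualEffect _vfx;

    private VFXEventAttribute _eventAttribute;
    private static readonly int s_positionID = Shader.PropertyToID("position");

    private void Start()
    {
        // Reusable attribute payload, so no allocation per click.
        _eventAttribute = _vfx.CreateVFXEventAttribute();
    }

    private void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            Vector3 worldPos = ray.GetPoint(10f); // 10 units from the camera; adjust to taste

            _eventAttribute.SetVector3(s_positionID, worldPos);
            _vfx.SendEvent("OnClick", _eventAttribute); // must match the event name in your graph
        }
    }
}
```

On the graph side, the event attribute can then be read in the Initialize context (e.g. to set particle position).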
     
  39. thinksquirrel_lily

    Joined:
    Feb 8, 2011
    Posts:
    1,175
    Last edited: Jan 18, 2019
  40. konsic

    Joined:
    Oct 19, 2015
    Posts:
    632
What particle material does the VFX Graph use by default?
     
  41. id0

    Joined:
    Nov 23, 2012
    Posts:
    287
This is all very nice, but where is the parameter for light probes? I can't find it anywhere.
     
  42. thinksquirrel_lily

    Joined:
    Feb 8, 2011
    Posts:
    1,175
Would it be possible to add an inverse lerp node? I find myself having to chain multiple nodes for
f(x, a, b) = (x - a) / (b - a)
quite a bit.
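For reference, this is what such a node would compute; on the C# side Unity already exposes a clamped version as Mathf.InverseLerp. A plain C# sketch of the unclamped form:

```csharp
using System;

public static class MathUtil
{
    // Inverse lerp: maps x from the range [a, b] to [0, 1] (unclamped).
    // f(a, a, b) = 0, f(b, a, b) = 1, values outside [a, b] extrapolate.
    public static float InverseLerp(float x, float a, float b)
    {
        return (x - a) / (b - a);
    }
}
```

Note that Mathf.InverseLerp clamps the result to [0, 1], so it is not a drop-in replacement where extrapolation is wanted.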
     
    PaulDemeulenaere likes this.
  43. XRA

    Joined:
    Aug 26, 2010
    Posts:
    182
    ***EDIT*** Nevermind, downgraded HDRP 4.8 to 4.6 and reimported the Visual Effect Graph folder, errors went away and remaining CS files imported into Unity.
    -------------

    getting some errors on adding the VFX Graph package with HDRP in 2018.3.2f1

    a few of these:
    "Shader error in '[System 1]Initialize': failed to open source file: 'Packages/com.unity.visualeffectgraph/Shaders/Common/VFXCommonCompute.cginc' at kernel CSMain at System.vfx(14) (on d3d11)"

    The cginc file is at the location, is VFXGraph 4.6 outdated? (do the package include paths need to be changed?)

    also getting errors finding PCache:

    Library\PackageCache\com.unity.visualeffectgraph@4.6.0-preview\Editor\Utilities\pCache\BakeTool\PointCacheBakeTool.Mesh.cs(375,9): error CS0246: The type or namespace name 'PCache' could not be found (are you missing a using directive or an assembly reference?)

    PCache.cs is in PackageCache\com.unity.visualeffectgraph@4.6.0-preview\Editor\Utilities\pCache

    but PCache.cs PointCache.cs & PointCacheAsset.cs do not show up within the Unity project view (despite all 3 being in that folder)
     
    Last edited: Jan 18, 2019
  44. Camelot63RU

    Joined:
    Jul 20, 2014
    Posts:
    12
I apologize, but what do I need to do to finally get a working Visual Effect Graph? I have been reinstalling Unity for three days now, trying 4.3, 4.6, and 4.8. I tried installing HDRP first and the VFX Graph after it, installed different versions and tried combinations of versions, and I did not find any information on the forum; maybe I missed it. Please help me get it working. I currently have 2018.3.2f1. Which Unity version do I need, which versions of HDRP and VFX should I install, and in what sequence? Thanks in advance, and sorry for the Google Translate.
     
  45. thinksquirrel_lily

    Joined:
    Feb 8, 2011
    Posts:
    1,175
Did you set the render pipeline asset in Graphics Settings? (You can create one from the Assets menu.)
     
    Camelot63RU likes this.
  46. Camelot63RU

    Joined:
    Jul 20, 2014
    Posts:
    12
Thank you so much, now I see what I missed.
     
    thinksquirrel_lily likes this.
  47. konsic

    Joined:
    Oct 19, 2015
    Posts:
    632
  48. SiriusRU

    Joined:
    Sep 29, 2018
    Posts:
    21
Hey, I can't figure this out, so can anyone give me a hint: how do I actually use event attributes?
I tried it like this:

    Снимок.PNG
And now I wonder: how do I get spawnPos in the Visual Effect Graph?
I want to spawn particles at a certain position via script. Right now, as a workaround, I just set an exposed parameter without an event attribute and then call the event.
     
    konsic likes this.
  49. Danua

    Joined:
    Feb 20, 2015
    Posts:
    183
Hi there, I've found a really strange bug with HDR color in the VFX editor: values break after an HDR intensity of 7.69. Looks like an HDR color clamp bug.
    upload_2019-1-20_23-19-53.png
    upload_2019-1-20_23-20-17.png
Please hotfix.
I need this intensity in terms of physically correct lighting units:
100k for the sun
15k for the sky
and PPV
dynamic range -9 to 9
exposure compensated to 0.05
Also some video here.
     
    Last edited: Jan 20, 2019
    konsic likes this.
  50. thinksquirrel_lily

    Joined:
    Feb 8, 2011
    Posts:
    1,175
Hm, it seems I found a bug!

Fixed timestep and deltaTime are calculated incorrectly when maxDeltaTime is higher than deltaTime (or am I missing something?).

Typically, for fixed timesteps, an accumulator loop would be run (exactly like Unity's 3D and 2D physics):
• Accumulate the frame deltaTime each frame.
• Once the fixed time has passed, simulate and subtract the fixed step from the accumulated time, repeatedly, until the accumulator is again under the fixed deltaTime.
• Don't simulate more than maxDeltaTime (to prevent the simulation from blowing up when it isn't realtime).
The key here is that each time particles are simulated, the same deltaTime is passed every single time.
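That accumulator loop, as a minimal plain-C# sketch (class and member names are mine, not Unity's):

```csharp
using System;

// Fixed-timestep accumulator, as used by physics engines:
// every Simulate() call receives exactly the same dt.
public class FixedStepSimulator
{
    private readonly float _fixedDeltaTime;
    private readonly float _maxDeltaTime;
    private float _accumulator;

    public int StepCount { get; private set; } // steps taken so far, for inspection

    public FixedStepSimulator(float fixedDeltaTime, float maxDeltaTime)
    {
        _fixedDeltaTime = fixedDeltaTime;
        _maxDeltaTime = maxDeltaTime;
    }

    public void Advance(float frameDeltaTime)
    {
        // Clamp so one long frame can't blow up the simulation.
        _accumulator += Math.Min(frameDeltaTime, _maxDeltaTime);

        // Run as many fixed steps as fit in the accumulated time.
        while (_accumulator >= _fixedDeltaTime)
        {
            Simulate(_fixedDeltaTime); // always the same dt
            _accumulator -= _fixedDeltaTime;
        }
        // Leftover time (< fixedDeltaTime) stays in the accumulator for next frame.
    }

    private void Simulate(float dt)
    {
        StepCount++;
        // ...integrate forces here with a constant dt...
    }
}
```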

Instead, it seems like this is happening:
• Accumulate the frame deltaTime each frame.
• Once the fixed time has passed, simulate the accumulated time once and reset the accumulator to zero (even though the last frame may have been longer than the target time).
This makes for unstable simulation (since integration can happen at larger timesteps), which can be noticeable if you have a lot of non-linear forces being applied.

The current workaround we have is to set both timesteps to the same value, but this makes the simulation slow down if we dip under our target time.

    Also a quick question: should I be making bug tracker #s for this (or Unity's other experimental features)?
     
    PaulDemeulenaere and elbows like this.