
Is _Time.y equal to Time.time on the CPU or are they not in sync?

Discussion in 'Shaders' started by TheCelt, Sep 4, 2019.

  1. TheCelt

    TheCelt

    Joined:
    Feb 27, 2013
    Posts:
    742
    Hi

    I am running an algorithm on the GPU, but I also execute it on the CPU at certain sample points for physics.

    The CPU version runs off Time.time and the GPU version uses _Time.y.

    Are they in sync? Or can they be out of sync? Is there a way to keep time in sync between the two?

    I don't mind a slight amount of drift, because my physics only runs at around 10 FPS and doesn't need to be totally perfect. But it is not clear how far out of sync they can get, so I don't know whether I can rely on it.

    If I can't, I need some way to sync time between the two more accurately. What's the best approach?
     
  2. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,025
  3. TheCelt

    TheCelt

    Joined:
    Feb 27, 2013
    Posts:
    742
    Right, so they aren't synced. But then what is the right way to keep them in sync?
     
  4. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,025
    You can pass in your own variable.
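    For instance, a minimal sketch of what that could look like (the _MyTime name is hypothetical; a matching "float _MyTime;" would need to be declared in the shader):

    Code (CSharp):
        using UnityEngine;

        // Sketch only: push your own time value to a global shader property once per frame.
        public class CustomShaderTime : MonoBehaviour
        {
            static readonly int MyTimeId = Shader.PropertyToID("_MyTime");

            void Update()
            {
                // Use the same value the CPU-side algorithm uses, so both stay in lockstep.
                Shader.SetGlobalFloat(MyTimeId, Time.time);
            }
        }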
     
  5. mouurusai

    mouurusai

    Joined:
    Dec 2, 2011
    Posts:
    350
    Why "Time.timeSinceLevelLoad" and "_Time.y" not in sync, it's a bug?
     
  6. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,025
    We were talking about "Time.time", not "Time.timeSinceLevelLoad".
    @PixelizedPlayer you can probably use "Time.timeSinceLevelLoad" in your CPU part as well :)
     
  7. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    _Time.y != Time.time

    _Time.y == Time.timeSinceLevelLoad
    https://docs.unity3d.com/ScriptReference/Time-timeSinceLevelLoad.html

    Or at least it should be. I've run across a few cases in the past on certain platforms where it didn't match, but I suspect that was a bug.

    I've seen some people say they use Shader.GetGlobalVector("_Time"); successfully to avoid any kind of out-of-sync problems, though I have not personally tried it.

    Also, using a custom time value is underrated, as it means you can pause your game world and menu separately if you want to have shader-animated elements in the UI.
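    As a hedged sketch of that last idea (the _GameTime and _UITime names are made up for illustration): drive the game-world clock with scaled delta time so it freezes when Time.timeScale is 0, and drive the UI clock with unscaled delta time so menu shaders keep animating.

    Code (CSharp):
        using UnityEngine;

        // Sketch: two independent shader clocks, one pausable via timeScale, one not.
        public class SplitShaderTime : MonoBehaviour
        {
            static readonly int GameTimeId = Shader.PropertyToID("_GameTime");
            static readonly int UITimeId   = Shader.PropertyToID("_UITime");

            float _gameTime;
            float _uiTime;

            void Update()
            {
                _gameTime += Time.deltaTime;         // stops advancing while Time.timeScale == 0
                _uiTime   += Time.unscaledDeltaTime; // keeps running, so UI shaders keep animating

                Shader.SetGlobalFloat(GameTimeId, _gameTime);
                Shader.SetGlobalFloat(UITimeId, _uiTime);
            }
        }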
     
  8. TheCelt

    TheCelt

    Joined:
    Feb 27, 2013
    Posts:
    742
    When you say using a custom time value, do you mean just passing a time value to a shader property every frame? Is that a performant option? I was led to believe passing data to/from the GPU is a performance problem and, when possible, best avoided...
     
  9. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    Yep.
    Shader.SetGlobalFloat("_CustomTime", myTime);

    Passing data from the CPU to the GPU is pretty fast. Think about it this way. A single particle system with 1000 particles is sending at least a megabyte a second of data to the GPU. Every dynamically batched object is doing that much or more as well. Even just having a single moving object is several times that much data. An extra float isn't a big deal.

    Getting data back from the GPU to the CPU is a big bottleneck as that's simply not how the system was designed.
     
    Last edited: Sep 4, 2019
    mothercow5 and hippocoder like this.
  10. TheCelt

    TheCelt

    Joined:
    Feb 27, 2013
    Posts:
    742
    Ah, I see. Thanks for explaining. If only the system were designed for two-way transfers, the potential of compute shaders for many runtime algorithms on the CPU side would be immense.
     
  11. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    You would have to be talking to GPU/hardware developers at that point :)
    But you can totally do this if you don't mind a small bit of latency before you get the results...
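    For example, Unity's AsyncGPUReadback hands the results back a frame or two later without a hard stall. A minimal sketch, assuming a compute shader asset with a hypothetical "CSMain" kernel (using [numthreads(64,1,1)]) that writes floats into a buffer named "_Results":

    Code (CSharp):
        using UnityEngine;
        using UnityEngine.Rendering;

        // Sketch: dispatch a compute shader and read the results back without blocking the GPU.
        public class ComputeReadbackSketch : MonoBehaviour
        {
            public ComputeShader compute;   // assumed asset with a "CSMain" kernel
            ComputeBuffer _results;
            int _kernel;

            void Start()
            {
                _results = new ComputeBuffer(1024, sizeof(float));
                _kernel = compute.FindKernel("CSMain");
                compute.SetBuffer(_kernel, "_Results", _results);
            }

            void Update()
            {
                compute.Dispatch(_kernel, 1024 / 64, 1, 1);

                // Non-blocking: the callback fires a frame or two later with the data.
                AsyncGPUReadback.Request(_results, request =>
                {
                    if (request.hasError) return;
                    var data = request.GetData<float>(); // NativeArray<float> of GPU results
                    // ...feed the results into the CPU-side physics here...
                });
            }

            void OnDestroy()
            {
                _results.Release();
            }
        }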
     
  12. TheCelt

    TheCelt

    Joined:
    Feb 27, 2013
    Posts:
    742
    True lol. I shall write them a strongly worded letter :p
     
  13. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    These kinds of things are possible (and done) on consoles, since they use shared memory: you can have the GPU do something and read the results on the CPU almost instantly. That's not possible on desktop, since the GPU has its own RAM. You transfer a bunch of data to the GPU and the GPU works away on that data, but to get data back you need to stall the GPU to make sure the data doesn't change while it's being transferred back from GPU memory to CPU RAM. Technically, the way modern rendering systems are designed, the actual rendering happens a frame behind what the CPU is doing, so getting data back from the GPU often means waiting two frames. This isn't always true, as the GPU can often be idle between frames, so it's possible to have it do some compute work and get the data back in a millisecond or two, but it's hard to rely on that on desktop.
     
    AlterHaudegen likes this.
  14. Samhayne

    Samhayne

    Joined:
    Jun 15, 2009
    Posts:
    45
    As Google brought me here...
    I also wanted to get times from shader world and script world which are in sync. But...
    • Shader.GetGlobalVector("_Time") just returned an empty vector, or some weird values for one frame when hitting the Unity Editor's play button.
    • _Time.y and Time.timeSinceLevelLoad did NOT match. I always had a slight offset, which kept changing every time I hit play.
    What indeed seems to work is what bgolus proposed: setting a custom value each frame.
    Code (CSharp):
        float myTime = Time.time;
        Shader.SetGlobalFloat("_CustomTime", myTime);

    Unity... why do you have to make life so hard for us?
     
  15. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
    Depending on when you call GetGlobal for a value set by Unity's internal rendering code, you may get the current frame, the previous frame, or junk. Some values might be set early in the frame, some just before rendering starts, and some might never actually get set in a way user-land C# can access. Generally you should never expect GetGlobal to return anything of use for any value but the ones you've set yourself.

    I've tried using
    Shader.GetGlobalVector("_Time")
    twice since my previous post. In one project I think I was always getting the previous frame's data, and in the other I only ever saw zeros. I don't know why they were different. I don't know why other people got useful values. Could be different versions of Unity changed how things worked. Could be when I was checking it. Could be I'm remembering it wrong. I don't ever actually use it, and both times it was just to see what values I got. I do, however, sometimes rely on
    _Time.y
    and
    Time.timeSinceLevelLoad
    matching, and in the cases I've used it, it has worked. But again, it might just be that the Unity versions or other factors in how the projects were set up allowed for that. For anything I need to be absolutely perfect, I always use a custom time, or drive material values from script directly.
     
    AlterHaudegen likes this.
  16. Samhayne

    Samhayne

    Joined:
    Jun 15, 2009
    Posts:
    45
    Very odd... just to prove it... 2019.2.10f1 (with an Android target) gives me this.

    1) Using _Time.y / Time.timeSinceLevelLoad:

    Code (CSharp):
        // Wave vertex shader:
        v2f vert (appdata v)
        {
            v2f o;
            v.vertex.y = sin((v.vertex.x + _Time.y) * _WaveAmplitude) * _WaveHeight;
            o.vertex = UnityObjectToClipPos(v.vertex);
            return o;
        }


        // Sphere positioning code:
        private void Update()
        {
            float yPos = Mathf.Sin((Water.transform.InverseTransformPoint(this.transform.position).x + Time.timeSinceLevelLoad) * WaveAmplitude) * WaveHeight;
            this.transform.position = new Vector3(transform.position.x, yPos, transform.position.z);
        }

    The offset also changes every time I hit play.



    2) Setting the global time value....

    Code (CSharp):
        // Wave vertex shader:
        [...]
        float _CustomTime;
        [...]

        v2f vert (appdata v)
        {
            v2f o;
            v.vertex.y = sin((v.vertex.x + _CustomTime) * _WaveAmplitude) * _WaveHeight;
            o.vertex = UnityObjectToClipPos(v.vertex);
            return o;
        }


        // Sphere positioning code:
        private void Update()
        {
            float myTime = Time.time;
            Shader.SetGlobalFloat("_CustomTime", myTime);

            float yPos = Mathf.Sin((Water.transform.InverseTransformPoint(this.transform.position).x + myTime) * WaveAmplitude) * WaveHeight;
            this.transform.position = new Vector3(transform.position.x, yPos, transform.position.z);
        }



    =======

    Regarding Shader.GetGlobalVector("_Time"):

    Code (CSharp):
        private void Update()
        {
            Vector4 globalShaderTime = Shader.GetGlobalVector("_Time");
            Debug.Log("shader timeV: " + globalShaderTime + " shader _Time.y: " + globalShaderTime.y + " timeSinceLevelLoad:" + Time.timeSinceLevelLoad);
        }
    would return this...

    [Attached screenshot of the Console output]

    Just zeros.
     


    Last edited: Jun 16, 2020
  17. AlterHaudegen

    AlterHaudegen

    Joined:
    Jul 8, 2013
    Posts:
    28
    Using a custom timer like the always-reliable @bgolus suggested was the only thing that worked for me, since whichever time variable I used, there was always some issue with it. Mainly, it seemed to do completely different things between Editor and Player builds...
     
  18. apaer

    apaer

    Joined:
    Jan 23, 2018
    Posts:
    9
    Mmm... isn't passing the time from the CPU to control GPU animation or dynamic objects defeating the whole idea of a modern GPU? Back in the past we had jerky days, thanks to accumulated CPU/GPU frequency mismatch and inconsistencies. All those arguments about FixedUpdate vs Update, delta time vs averaged time... So it actually is a big deal to leave timing on the GPU, which is perfectly synchronised with the user's display, unlike the CPU.
     
    Last edited: Dec 10, 2022
  19. warpfx

    warpfx

    Joined:
    Feb 10, 2020
    Posts:
    14
    As I had the same problem and found an inconsistency between a ShaderGraph-compiled shader's timing and C#'s
    Time.timeSinceLevelLoad
    I looked deep into Unity's source code and found this:

    Code (CSharp):
        private void InitRenderGraphFrame(RenderGraph renderGraph, ref RenderingData renderingData)
        {
            using (var builder = renderGraph.AddRenderPass<PassData>("InitFrame", out var passData,
                       Profiling.setupFrameData)) //TODO rendergraph maybe add a new profiling scope?
            {
                passData.renderingData = renderingData;
                passData.renderer      = this;

                builder.AllowPassCulling(false);

                builder.SetRenderFunc((PassData data, RenderGraphContext rgContext) =>
                {
                    CommandBuffer cmd = rgContext.cmd;
        #if UNITY_EDITOR
                    float time = Application.isPlaying ? Time.time : Time.realtimeSinceStartup;
        #else
                    float time = Time.time;
        #endif
                    float deltaTime       = Time.deltaTime;
                    float smoothDeltaTime = Time.smoothDeltaTime;

                    ClearRenderingState(cmd);
                    SetShaderTimeValues(cmd, time, deltaTime, smoothDeltaTime);

                    data.renderer.SetupLights(rgContext.renderContext, ref data.renderingData);
                });
            }
        }
    After I used that block:

    Code (CSharp):
        #if UNITY_EDITOR
            float time = Application.isPlaying ? Time.time : Time.realtimeSinceStartup;
        #else
            float time = Time.time;
        #endif
    Everything started to work.
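    For reference, a minimal sketch of how that selection could be mirrored on the C# side when driving a custom global (the _CustomTime name follows the earlier posts; the component name is made up):

    Code (CSharp):
        using UnityEngine;

        // Sketch: pick the time the same way the URP snippet above does, then push it as a custom global.
        [ExecuteAlways] // so it also runs in edit mode (note: edit-mode Update is not called every frame)
        public class MirrorShaderTime : MonoBehaviour
        {
            static readonly int CustomTimeId = Shader.PropertyToID("_CustomTime");

            void Update()
            {
        #if UNITY_EDITOR
                float time = Application.isPlaying ? Time.time : Time.realtimeSinceStartup;
        #else
                float time = Time.time;
        #endif
                Shader.SetGlobalFloat(CustomTimeId, time);
            }
        }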
     
    halley and nasos_333 like this.
  20. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,360
    Hi,

    Is this the time that is passed to the shader side, and does it work?

    It would also be great to have an official answer on how to pass our own time in via Shader.SetGlobalVector("_Time"), without having to define new variables, so it can be globally applicable.

    E.g. I assume there is an injection point on the Unity side that could be overwritten by our own global set, if it is timed after that declaration.
     
  21. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,025
    There's no way to do this. This data is set by the engine internally, and any data that is set from C# will be overwritten.
     
    nasos_333 likes this.
  22. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,360
    I see, thanks. Is there any plan to have this as an override for shader use in later versions?

    Thanks
     
  23. aleksandrk

    aleksandrk

    Unity Technologies

    Joined:
    Jul 3, 2017
    Posts:
    3,025
    I haven't seen a user request like this. And given there's a way to do it by simply adding a different variable, it will likely either not get implemented or end up very low on the prio list.
     
    nasos_333 likes this.
  24. nasos_333

    nasos_333

    Joined:
    Feb 13, 2013
    Posts:
    13,360
    Thanks