
"Motion" blur and frame rate independency

Discussion in 'Shaders' started by bartofzo, Mar 23, 2018.

  1. bartofzo

    Joined:
    Mar 16, 2017
    Posts:
    151
    Hi,

    So I'm writing this shader that takes the old pixel value into account when calculating the new pixel value like so:

     finalColor.rgb = lerp(newColor.rgb, oldColor.rgb, blurAmount);


    Where blurAmount would be a value like 0.95.

    The thing I can't wrap my head around is how to make this framerate independent. Say the value of 0.95 works for 60 FPS.
    If the framerate is lower, say 30 FPS, the amount of blur should be less, but simply dividing by a deltaTime scale value would result in half the blur value. Which is basically invisible. So it doesn't scale in a linear fashion.
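    For example, two 60 FPS frames at 0.95 retain 0.95 * 0.95 ≈ 0.90 of the old color over the same time span as one 30 FPS frame, while a single frame at 0.475 only retains 0.475, which is nowhere near the same amount of blur.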
    Any ideas on a good way to calculate the blur value based on framerate that will result in the same optical effect? Thanks.
     
  2. Zee_pso

    Joined:
    Nov 12, 2016
    Posts:
    31
    If the frame rate is less, shouldn't there be more blur (hence multiplying by unity_DeltaTime) instead? If it's indeed a motion blur, a lower frame-rate (30) would imply the screen moved more in each individual frame compared to a faster frame rate (60).

    Or, do you mean a lower frame rate makes the screen too obscured, and you want a blur similar to the higher frame rate? An easy fix would be to just cap the max blur amount, though a lower frame rate would hit it much quicker (frame-count wise). Otherwise, how are you calculating the blur amount (based on the camera's speed, or some other way)?
     
  3. bartofzo

    Joined:
    Mar 16, 2017
    Posts:
    151
    What I'm trying to achieve is not exactly a motion blur, but more of an image effect that needs to move smoothly from one state to the other, with the same perceived speed.

    Capping would be an option, but mathematically speaking there must be a better way. Also, what if somebody's playing at 120 FPS? Then the max blur value should be higher, but how much?
     
  4. Zee_pso

    Joined:
    Nov 12, 2016
    Posts:
    31
    So something akin to a screen transition? There are a few ways I can think of.

    You could just pass in the blur amount from a coroutine; that way, you can keep track of and calculate how much blur there should be over time, and pass it to the shader each frame.
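
    (A rough sketch of what that could look like; the class, field, and property names here are just placeholders for illustration:)

    using System.Collections;
    using UnityEngine;

    public class BlurFader : MonoBehaviour
    {
        public Material currentMaterial;

        // Ramp the blur from one value to another over "duration" seconds,
        // updating the material once per frame.
        public IEnumerator FadeBlur(float from, float to, float duration)
        {
            for (float t = 0f; t < duration; t += Time.deltaTime)
            {
                currentMaterial.SetFloat("blurAmount", Mathf.Lerp(from, to, t / duration));
                yield return null;   // wait for the next frame
            }
            currentMaterial.SetFloat("blurAmount", to);
        }
    }

    You'd kick it off with something like StartCoroutine(FadeBlur(0f, 0.95f, 5f)).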

    Another way would be to pass a timestamp to the shader at the start of the transition, and have the blur amount calculated from the current time minus the timestamp. For example, if you want it to transition over five seconds, (_Time.y - _TimeStamp) / 5 should work.
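
    (And a minimal sketch of the script side of the timestamp approach, assuming the shader declares float properties _TimeStamp and _Duration and computes saturate((_Time.y - _TimeStamp) / _Duration) itself:)

    using UnityEngine;

    public class TransitionStarter : MonoBehaviour
    {
        public Material currentMaterial;

        // Call once when the transition begins; the shader compares _Time.y
        // against the stored timestamp every frame after that.
        public void BeginTransition(float duration)
        {
            // _Time.y is seconds since level load, so pass the matching C# clock
            currentMaterial.SetFloat("_TimeStamp", Time.timeSinceLevelLoad);
            currentMaterial.SetFloat("_Duration", duration);
        }
    }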
     
  5. bartofzo

    Joined:
    Mar 16, 2017
    Posts:
    151
    Not exactly a screen transition. It's a little complicated but basically it's a generated visual that uses the previous frame values to blend into the next frame. Therefore I can't have a beginning frame and end frame and simply interpolate between these two. It has to work with only the previous frame and deltaTime.

    Not exactly sure what you mean by this? In what way would passing in the blur amount from a coroutine differ from simply using deltaTime?
     
  6. Zee_pso

    Joined:
    Nov 12, 2016
    Posts:
    31
    If you wanted the blur to build up over time (such as in a transition), deltaTime would only give you the value between the previous and current frame, and the blur amount would reset back to its original value. So if you wanted to hold onto the previous frame's blur value and increase it, you'd have to hold onto it in a script and pass it in. But it sounds like this isn't what you wanted, and I misunderstood the question.

    Would the "final color" become the "old color" in the next frame? So you're always lerping the image into the new color each frame, giving a constant blur? If so, I'm guessing the difference between the color values at a lower frame rate is greater than at a higher one, so the blur is more pronounced, correct?

    If that's the case, lerp expects a value between 0 and 1; multiplying that by deltaTime gives a very small value, and dividing by it gives a very large one. You could try offsetting it by a value instead, [blurAmount - (offset * unity_DeltaTime)], but you'd have to play around to find a good offset value and see how well it scales.
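
    (Reading that suggestion literally on the script side, with baseBlurAmount and offset as placeholder names and values to tune by eye; unity_DeltaTime.x would be the shader-side counterpart of Time.deltaTime:)

     float offset = 3f;   // placeholder, tune by eye
     float blurAmount = Mathf.Clamp01(baseBlurAmount - offset * Time.deltaTime);
     currentMaterial.SetFloat("blurAmount", blurAmount);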

    Otherwise, if that's not what you wanted, do you have an example of the effect working as intended? It can be hard trying to understand visual problems from text without an example.
     
  7. bartofzo

    Joined:
    Mar 16, 2017
    Posts:
    151
    Thanks for your help, but I think I've got it figured out.
    For those interested, here's what I think is a solution to my problem:


    // How many "60 FPS frames" fit into this frame's deltaTime (2 at 30 FPS, 0.5 at 120 FPS)
    float blurAdjust = 1 / ((1f/60f) / Time.deltaTime);
    // Raise the blur value tuned for 60 FPS to that power so the decay per second stays the same
    float blurAmount = Mathf.Pow(BLUR_AMOUNT_60FPS, blurAdjust);
    currentMaterial.SetFloat("blurAmount", blurAmount);


    Basically, if we're running at 30 FPS, blurAdjust will become 2.
    Let BLUR_AMOUNT_60FPS be 0.9. Then 0.9^2 is 0.81, which seems like a reasonable value.

    Kind of logical when I think about it: in the time one 30 FPS frame takes, two 60 FPS frames would have passed, so the total blur over that span would be 0.9 * 0.9.

    Now if we're running at double the framerate (120), then 0.9^0.5 is about 0.949, which I believe is also correct, because squaring 0.949 (i.e. halving the framerate again) gets you back to 0.9.
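
    For anyone wiring this up, here's a minimal sketch of how it could sit in an image-effect script. It's only an illustration: the accumulation RenderTexture and the "_OldTex" property name are assumptions, not necessarily how the original effect is set up.

    using UnityEngine;

    [RequireComponent(typeof(Camera))]
    public class FeedbackBlurEffect : MonoBehaviour
    {
        const float BLUR_AMOUNT_60FPS = 0.9f;   // value tuned at 60 FPS

        public Material currentMaterial;        // material using the lerp-based blend shader
        RenderTexture accumulation;             // holds the previous frame's output

        void OnRenderImage(RenderTexture src, RenderTexture dest)
        {
            if (accumulation == null)
            {
                accumulation = new RenderTexture(src.width, src.height, 0, src.format);
                Graphics.Blit(src, accumulation);   // seed with the first frame
            }

            // Same adjustment as above: exponent = deltaTime expressed in 60 FPS frames
            float blurAdjust = Time.deltaTime / (1f / 60f);
            currentMaterial.SetFloat("blurAmount", Mathf.Pow(BLUR_AMOUNT_60FPS, blurAdjust));
            currentMaterial.SetTexture("_OldTex", accumulation);

            RenderTexture temp = RenderTexture.GetTemporary(src.width, src.height, 0, src.format);
            Graphics.Blit(src, temp, currentMaterial);   // shader does lerp(new, old, blurAmount)
            Graphics.Blit(temp, accumulation);           // store the result as next frame's "old" color
            Graphics.Blit(temp, dest);
            RenderTexture.ReleaseTemporary(temp);
        }

        void OnDestroy()
        {
            if (accumulation != null) accumulation.Release();
        }
    }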

    Happy coding :)
     
  8. Zee_pso

    Joined:
    Nov 12, 2016
    Posts:
    31
    Glad you got it sorted out!