
Fade in surfaces as they get lit up?

Discussion in 'Shaders' started by Benjamin_Overgaard, Nov 5, 2017.

  1. Benjamin_Overgaard

    Joined:
    Jul 20, 2015
    Posts:
    17
    Hey, I'm trying to get surfaces to fade in as they get lit up.

    If a surface gets lit up, I want it to lerp to its lit color (based on a Lambert) over time instead of applying it immediately. In my game, everything that is not in light is completely black, so basically surfaces will go from black to, e.g., gray over the course of a few seconds.

    For this to work, I need to increase the color value gradually over some frames, which means I need information about each fragment's value over time. So how do I store the fragment's color from the last frame / read the fragment's color before it gets rendered (in an object shader, not post-processing)?
     
  2. brownboot67

    Joined:
    Jan 5, 2013
    Posts:
    375
    Go get DoTween.

    Or, if you insist on writing your own code, just use a coroutine that increments your variable every frame by whatever rate you want until it's done.
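
    For what it's worth, a minimal version of that coroutine might look something like this. This is just a sketch: the `_LitAmount` property name is made up, and it assumes your shader lerps from black to the lit color based on that float.

    ```csharp
    using System.Collections;
    using UnityEngine;

    public class FadeInLight : MonoBehaviour
    {
        public float fadeDuration = 2f; // seconds to reach full brightness
        Material mat;

        void Start()
        {
            mat = GetComponent<Renderer>().material;
            StartCoroutine(FadeIn());
        }

        IEnumerator FadeIn()
        {
            float t = 0f;
            while (t < 1f)
            {
                t += Time.deltaTime / fadeDuration;
                // "_LitAmount" is a hypothetical shader property that the
                // object's shader would use to lerp from black to its lit color.
                mat.SetFloat("_LitAmount", Mathf.Clamp01(t));
                yield return null; // wait one frame
            }
        }
    }
    ```

    Note this is per-material, not per-fragment, which is the limitation the thread runs into below.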
     
  3. Benjamin_Overgaard

    Joined:
    Jul 20, 2015
    Posts:
    17
    I need to do this in a shader, per fragment.

    I wish I could store a per-fragment variable to increment, but you can't save values between shader invocations unless the variable is set from a C# script, and it's not that simple in my case.
     
  4. bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,329
    Generally speaking, saving state is "not how shaders work". Shaders get data, run calculations on it, and output data that, in the case of vertex shaders, gets passed on to fragment shaders, and, in the case of fragment shaders, gets rendered to a texture buffer.

    With DX11 there are ways to store data output by fragment shaders, but there's no good way to read back the previous frame's data for "a pixel", since the camera has likely moved and the pixel index no longer lines up with where, or even what, is being rendered. Basically, what you're going to need is some form of object-space rendering: you'll need to implement light mapping, but in real time. That means every surface needs to be uniquely UV'd, and you render your objects out to a render texture where the object's UVs are used as the clip-space position rather than the transformed vertex position. If you care about shadows, you'll need to render them out yourself or figure out how to copy the shadow maps from the lights (which isn't too hard, but getting the proper projection matrices is).
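
    The "render in the object's UV space" part can be sketched roughly like this. This is an illustrative sketch, not tested code: it assumes unique lightmap-style UVs in TEXCOORD1 and a single directional light, and it leaves out the frame-to-frame accumulation step.

    ```hlsl
    // Sketch: render an object into its own UV space (real-time lightmapping).
    struct appdata {
        float4 vertex : POSITION;
        float3 normal : NORMAL;
        float2 uv2    : TEXCOORD1; // unique UVs covering the whole surface
    };

    struct v2f {
        float4 pos         : SV_POSITION;
        float3 worldNormal : TEXCOORD0;
    };

    v2f vert (appdata v)
    {
        v2f o;
        // Use the UVs as the clip-space position instead of the
        // transformed vertex position: remap [0,1] -> [-1,1].
        float2 uv = v.uv2 * 2.0 - 1.0;
        #if UNITY_UV_STARTS_AT_TOP
            uv.y = -uv.y;
        #endif
        o.pos = float4(uv, 0.0, 1.0);
        o.worldNormal = UnityObjectToWorldNormal(v.normal);
        return o;
    }

    fixed4 frag (v2f i) : SV_Target
    {
        // Current frame's Lambert term. Fading over time would mean
        // blending this output toward the previous frame's render
        // texture (e.g. lerp by speed * deltaTime) instead of
        // writing it directly.
        float ndotl = saturate(dot(normalize(i.worldNormal), _WorldSpaceLightPos0.xyz));
        return fixed4(ndotl.xxx, 1.0);
    }
    ```

    The resulting render texture is then sampled by the object's regular material using the same uv2 coordinates, which is how you get per-texel (effectively per-fragment) state that survives between frames and camera movement.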