Point lights and attenuation in shader

Discussion in 'Shaders' started by Theformand, Jan 7, 2020.

  1. Theformand

    Joined: Jan 2, 2010
    Posts: 271
    So, I have a question about implementing point light attenuation. I'm basically extending the Standard shader to do a muzzle flash effect on the environment when the player shoots (to avoid more realtime lights). I'm doing this by simulating a simple version of a point light from the player position when they shoot. However, I can't seem to get it to work correctly. It doesn't light up at all on the floor below me, only on the walls, even though they both use the same shader. And the attenuation seems completely off; I can't figure out how to implement radius. Could anyone take a look? Shader link: https://pastebin.com/MuUadxjS

    Video of what it looks like:
     
  2. bgolus

    Joined: Dec 7, 2012
    Posts: 12,352
    I suspect a big part of the problem is you've defined the light position as a Color property instead of a Vector property. Color properties will get color correction applied, so the value you set as the color won't necessarily be the value the shader receives. Make sure you use this in the shader for defining the property:
    Code (csharp):
    _FlashPos ("Flash Position", Vector) = (0,0,0,0)
    And use SetVector() to set it from script.
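
    On the script side, something along these lines should do it. This is just a minimal sketch: the class name and renderer reference are placeholders for illustration, and it assumes the shader exposes the _FlashPos and _FlashRange properties mentioned in this thread.
    Code (csharp):
    using UnityEngine;

    public class MuzzleFlashLight : MonoBehaviour
    {
        // Renderer using the extended Standard shader (placeholder reference).
        public Renderer environmentRenderer;
        public float flashRange = 10f;

        // Call this when the player fires, passing the muzzle's world position.
        public void Flash(Vector3 worldPosition)
        {
            // SetVector / SetFloat avoid the color correction a Color property would get.
            Material mat = environmentRenderer.material;
            mat.SetVector("_FlashPos", worldPosition);
            mat.SetFloat("_FlashRange", flashRange);
        }
    }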

    Even once you fix the above issue, you're still going to see the lighting on the walls a lot stronger than on the floor, though you should at least see it on the floor now. That's just how real world lighting works. You'll see the same if you put a very large "real" light in your scene. You may want to play with a half Lambert model instead of the Lambert you're using now. Basically, instead of:
    Code (csharp):
    float nl = max(0, dot(IN.worldNormal, lightDir));
    use:
    Code (csharp):
    float nl = max(0, dot(IN.worldNormal, lightDir) * 0.5 + 0.5);
    One other thing to know is Unity's default attenuation curve is a bit curious. It's not based on how any real world light behaves; it's just something that looked nice. I posted about how their curve works here:
    https://forum.unity.com/threads/light-distance-in-shader.509306/#post-3326818

    The falloff you are using has an infinite range, which is more similar to the real world, and means the "range" is effectively adjusted by the brightness of the light. Unity's SRPs and Unreal (optionally) use falloff curves like this, but then apply a range limit on top of that to ensure the lighting actually ends. You can do that however you want. Unity's LWRP uses a linear distance clamp where, over the last 20% of the range, the inverse square falloff is multiplied by a factor that goes from 1.0 down to 0.0 at the extent of the range. That's easy enough to implement:
    Code (csharp):
    att *= saturate((1 - dist / _FlashRange) / 0.2);
    Use whichever you like the look of better.
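
    Putting it all together in the lighting code might look roughly like this. Note that IN.worldPos, _FlashColor, and _FlashIntensity are assumed names for illustration, and the inverse square term is just a stand-in for whatever falloff your shader currently uses; only _FlashPos and _FlashRange come from the snippets above.
    Code (csharp):
    // Rough sketch, somewhere in the surface/lighting function.
    // IN.worldPos, _FlashColor, and _FlashIntensity are assumed names.
    float3 toLight = _FlashPos.xyz - IN.worldPos;
    float dist = length(toLight);
    float3 lightDir = toLight / dist;

    // Half Lambert so surfaces facing away from the flash still pick up a little light.
    float nl = dot(IN.worldNormal, lightDir) * 0.5 + 0.5;

    // Inverse square style falloff (infinite range on its own).
    float att = 1.0 / (1.0 + dist * dist);

    // Linear window over the last 20% of _FlashRange so the lighting actually ends.
    att *= saturate((1.0 - dist / _FlashRange) / 0.2);

    // Add this on top of the regular lighting, tinted by the muzzle flash color.
    float3 flash = _FlashColor.rgb * _FlashIntensity * nl * att;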
     
  3. Theformand

    Joined: Jan 2, 2010
    Posts: 271
    Thank you for the insightful post!
    I will mess around with it tomorrow and see if that helps. I didn't know about the built-in color correction on Color properties; that's probably a big part of my issue. I think I tried the half Lambert already, but I'll give it another go with your suggestions in mind.