Question Accessing physical lighting values (luminance/illuminance) within shader/script

Discussion in 'High Definition Render Pipeline' started by erickson656, Jul 14, 2022.

  1. erickson656

    erickson656

    Joined:
    Aug 17, 2018
    Posts:
    5
    Hi everyone,

I'm working on a lighting simulation where I need to calculate luminance contrast between different points in the virtual scene. I'm currently using HDRP, so I can define my lighting parameters using the physical lighting units lux/nits; however, I'm having trouble figuring out how to access these values within shaders or scripts for the various pixels in the main camera's view.

From what I understand, each pixel in the camera view has an HDR luminance value, which is normalized to between 0 and 1 during tonemapping. If that's correct, is there a way to access this value prior to normalization in a shader or C# script? I feel like this should be possible with a custom post-processing shader that runs earlier in the pipeline, but after reading the docs and forums, I'm not making any progress figuring out how to do it.

    I'd appreciate any tips you may have. Thanks in advance.
     
  2. SebLazyWizard

    SebLazyWizard

    Joined:
    Jun 15, 2018
    Posts:
    233
  3. erickson656

    erickson656

    Joined:
    Aug 17, 2018
    Posts:
    5
    Awesome! This is exactly what I was looking for. I'll try them out this week. Thanks!
     
  4. SebLazyWizard

    SebLazyWizard

    Joined:
    Jun 15, 2018
    Posts:
    233
  5. erickson656

    erickson656

    Joined:
    Aug 17, 2018
    Posts:
    5
Hi! Thanks again for the suggestion. I looked into it and noticed that the Luminance() function performs a dot product between the RGB value and the sRGB relative-luminance weights of the three color channels. If I understand correctly, this yields a luminance corresponding to how bright the output pixel will be perceived on the user's display, where a value of 1 means the pixel is at the display's maximum luminance output (say 200 cd/m^2). In that case it looks like I would simply scale the output luminance by 200 to get the pixel's luminance on the display.
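    For anyone following along, the dot product in question can be sketched like this (an illustrative re-implementation using the standard Rec.709/sRGB weights, not the HDRP source itself):

    ```hlsl
    // Illustrative sketch of the relative-luminance dot product that
    // Luminance() performs, using the Rec.709 / sRGB weights.
    float RelativeLuminance(float3 linearRgb)
    {
        // Green dominates perceived brightness; blue contributes least.
        return dot(linearRgb, float3(0.2126729, 0.7151522, 0.0721750));
    }
    ```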

    The simulated luminance value of the pixel within the virtual environment may be several orders of magnitude higher than this, especially for sunny outdoor scenes (say 1,000,000 cd/m^2). Is there a way to appropriately scale the value output by the Luminance() function to approximate this value instead? Is this perhaps what you meant when you mentioned scaling by the exposure value?

    EDIT:
    After some experimentation, it seems that if I divide the value returned by Luminance() by the value returned by GetCurrentExposureMultiplier(), it returns values that appear to be in the expected range of luminance values for my scene lighting. Can anyone confirm that this would be the case?
     
    Last edited: Jul 25, 2022
  6. SebLazyWizard

    SebLazyWizard

    Joined:
    Jun 15, 2018
    Posts:
    233
    It has nothing to do with your display output at all.
    Luminance(outColor) / GetCurrentExposureMultiplier() gives you the real scene luminance of a given pixel.

    Make sure that your post process injection point is not later than "After Post Process Blurs", so the result won't be affected by tone mapping.
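    Put together, a fragment doing this might look like the sketch below. It assumes the HDRP custom post-process template (where _InputTexture is the camera color before tonemapping); treat the surrounding boilerplate as assumed rather than a drop-in implementation:

    ```hlsl
    // Sketch of a custom post-process fragment, assuming the HDRP custom
    // post-process template (_InputTexture = camera color, pre-tonemapping).
    TEXTURE2D_X(_InputTexture);

    float4 CustomPostProcess(Varyings input) : SV_Target
    {
        uint2 positionSS = input.texcoord * _ScreenSize.xy;
        float3 outColor = LOAD_TEXTURE2D_X(_InputTexture, positionSS).xyz;

        // The camera color is pre-exposed; dividing the exposure back out
        // yields absolute scene luminance in physical units (cd/m^2).
        float sceneLum = Luminance(outColor) / GetCurrentExposureMultiplier();

        // For example, visualize the scene luminance directly:
        return float4(sceneLum.xxx, 1);
    }
    ```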
     
    erickson656 likes this.
  7. jake1025

    jake1025

    Joined:
    Mar 1, 2018
    Posts:
    5
    How did you go about implementing this? I'm facing a similar issue where I need to access the physical luminance values on screen, which it seems like you do here, but I'm at a loss on how to put that together.

    Any help would be appreciated, thanks.
     
  8. erickson656

    erickson656

    Joined:
    Aug 17, 2018
    Posts:
    5
    For what I was doing, I just needed to access the luminance values within a shader or post-process effect, since I wanted to customize how an overlay texture applied to the camera view would blend with that camera's render texture, based on the luminance in the render texture. I ended up making a custom pass to do this. If you're doing something similar, take a look at the attached shader, keeping in mind I was using HDRP.

    Sorry the code is a mess, but the lines of interest are part way down in the FullScreenPass function:

    float exposure = GetCurrentExposureMultiplier();
    float sceneLum = Luminance(finalColor) / exposure;

    sceneLum holds the luminance value for the pixel the shader is currently working on. I later use these values to calculate the contrast between the scene luminance and the luminance of another input texture.

    If you are hoping to access the luminance values outside of a shader, it will be a bit trickier. I think you would have to store the luminance values into a new texture within the shader. Then you could potentially access these in a script that had a reference to that texture.
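    The shader side of that idea might look something like this: a full-screen custom pass that writes per-pixel scene luminance into a single-channel (e.g. RFloat) render target bound as the pass output, which a script holding a reference to that target could then read back. This is a sketch based on the HDRP custom pass template; names outside that template are assumptions:

    ```hlsl
    // Sketch: full-screen custom pass writing scene luminance to a
    // single-channel render target (HDRP custom pass template assumed).
    float4 LuminanceToTexture(Varyings varyings) : SV_Target
    {
        float2 uv = varyings.positionCS.xy * _ScreenSize.zw;
        float3 color = CustomPassSampleCameraColor(uv, 0).rgb;
        float sceneLum = Luminance(color) / GetCurrentExposureMultiplier();

        // Store luminance in R; a C# script with a reference to this
        // target can read it back (e.g. via AsyncGPUReadback.Request).
        return float4(sceneLum, 0, 0, 1);
    }
    ```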
     

    Attached Files:

  9. jake1025

    jake1025

    Joined:
    Mar 1, 2018
    Posts:
    5
    Funny enough, it sounds like we have a very similar use case. This seems like exactly what I was missing. At the moment I was just using a shader on a UI panel to make my overlay, but evidently a custom pass would allow access to the lighting information via that render texture.

    It seems like you're working with the luminance via the RGB values of that render texture post-normalization, then adjusting for the camera exposure. Did you happen to find a way to directly access the lighting information before it's normalized / applied to the color? My concern is that for a white pixel in the render texture we only know a lower bound on the luminance; the physical lighting value could be much larger if it clipped the camera's exposure range.

    In any case, thank you for the help.
     
  10. erickson656

    erickson656

    Joined:
    Aug 17, 2018
    Posts:
    5
    You need to consider which injection point to use for the custom pass shader. I placed mine at the "Before Post Process" injection point, which ensures it runs prior to tonemapping, the step that normalizes the luminance values. As long as you place it there, the luminance values will be non-normalized.
     
    jake1025 likes this.