
Question about HDR output and Wide Colour Gamut workflow

Discussion in 'General Graphics' started by RyanKeable, Jul 1, 2021.

  1. RyanKeable

    Joined:
    Oct 16, 2013
    Posts:
    62
    I have been trying to wrap my head around HDR and Wide Colour Gamut (WCG) workflows for the past few days and am having trouble fully grasping it all, so I'd appreciate input from anyone with knowledge here!

    Firstly, Unity's support for HDR output seems rather limited, as most of it revolves around a single checkbox in the player settings. This doesn't explain how Unity is supporting HDR, and I can't seem to find much information about that either.
    • What transfer functions does it use?
    • What standard does it use? (HDR10, I would assume)
    • Does tonemapping take HDR output into consideration when adjusting the shoulder?
    • Why doesn't the editor support HDR output so we can work in that space?
    Additionally, HDR is better served by a wide colour gamut and I don't fully understand how Unity supports WCG ICC profiles.
    • We have the ability to select different profiles based on our platform, but does this only affect the final output image, or does the editor convert to the selected profile?
    • Is Unity colour managed on Windows? If I set a WCG colour profile in Windows, is this supported in Unity?
    Finally, when it comes to colour pipelines I am trying to understand best-practice methods. For example, if I am authoring a texture in Photoshop it is traditionally exported as an sRGB PNG, which I assume would convert WCG down to Rec. 709? Once that is in engine, my understanding is that even if you were working in DCI-P3 for iOS, that texture cannot be accurately converted back?

    • so how do I author textures, models and other assets to fully support WCG and HDR output?
    • Should Unity read them in linear space rather than gamma? The sRGB conversion in linear space assumes they are in sRGB space (no longer in a wider gamut)

    There's a lot to unpack here and not enough documentation anywhere. Any help would be greatly appreciated!
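    For reference, the sRGB transfer function involved in that gamma/linear conversion is a piecewise curve, not a pure power function. A minimal sketch (constants from IEC 61966-2-1):

    ```python
    def srgb_to_linear(c):
        """Decode a gamma-encoded sRGB channel value to linear light.

        Piecewise sRGB curve: a linear segment near black,
        a 2.4-exponent power curve elsewhere.
        """
        if c <= 0.04045:
            return c / 12.92
        return ((c + 0.055) / 1.055) ** 2.4


    def linear_to_srgb(c):
        """Encode a linear channel value back to gamma-encoded sRGB."""
        if c <= 0.0031308:
            return c * 12.92
        return 1.055 * c ** (1 / 2.4) - 0.055
    ```

    Crucially, this decode only yields the values you expect if the texture really was encoded as sRGB; if the pixels were authored in a wider gamut, the same math still runs, but the resulting "linear" values are relative to that wider gamut's primaries.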
     
  2. RyanKeable
    Ok, so after a bit of experimenting with texture authoring in Photoshop, here are some answers to some of the questions.

    This is a screenshot from Unity using the Recorder, exported as an sRGB texture. In Photoshop I set my colour space to my monitor's colour profile (it covers 98% of DCI-P3). I added some different colour adjustment layers over the image (curves, balance and hue/sat), so for all intents and purposes the images have been re-coloured in Photoshop within the current wide colour gamut.

    I've then exported them out as 4 different png files:


    1. Exported with no sRGB conversion and no embedded profile
    2. Exported with no sRGB conversion and an embedded profile
    3. Exported with sRGB conversion and no embedded profile
    4. Exported with sRGB conversion and an embedded profile

    (attached screenshot comparing the four exports: upload_2021-7-1_14-35-53.png)

    sRGB conversion has a tendency to oversaturate, as expected.
    Embedding the profile adds further saturation, but case 2 (no sRGB conversion, embedded profile) looks most accurate.

    So it appears Unity is decoding the profile correctly in some form. I still do not know whether that profile is carried over to the GPU as-is, or whether it's converted to sRGB somewhere along the way.
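    One way to see why a P3-authored colour can't always survive a trip through an sRGB export: the gamut conversion is just a 3x3 matrix on linear values, and the P3 primaries land outside [0, 1] in linear sRGB, where clamping destroys them. A sketch in Python (the forward matrix is the commonly published linear sRGB to Display P3 matrix; the reverse direction is derived here by inverting it numerically):

    ```python
    # Linear sRGB -> linear Display P3 (D65). Rows sum to 1, so white maps to white.
    SRGB_TO_P3 = [
        [0.822462, 0.177538, 0.000000],
        [0.033194, 0.966806, 0.000000],
        [0.017083, 0.072397, 0.910520],
    ]

    def mat_vec(m, v):
        """Multiply a 3x3 matrix by a 3-vector."""
        return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

    def invert_3x3(m):
        """Invert a 3x3 matrix via the adjugate / determinant."""
        a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
        det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
        adj = [
            [e * i - f * h, c * h - b * i, b * f - c * e],
            [f * g - d * i, a * i - c * g, c * d - a * f],
            [d * h - e * g, b * g - a * h, a * e - b * d],
        ]
        return [[x / det for x in row] for row in adj]

    P3_TO_SRGB = invert_3x3(SRGB_TO_P3)

    # The pure Display P3 red primary, expressed in linear sRGB, has a red
    # component over 1.0 and negative green/blue components -- it is outside
    # the sRGB gamut, so clamping to [0, 1] (which an sRGB PNG export
    # effectively does) loses it irrecoverably.
    p3_red_in_srgb = mat_vec(P3_TO_SRGB, [1.0, 0.0, 0.0])
    ```

    This is the mechanical reason the "convert back" step from the first post can't be accurate once clipping has happened: the matrix is invertible, but the clamp is not.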
     


  3. RyanKeable
    Does anyone else have any input on these matters? Surely I cannot be the only one struggling to work this all out...
     
  4. bitinn

    Joined:
    Aug 20, 2016
    Posts:
    961
    I am a year late, but I would love to know this for sure: just how Unity transfers sRGB (Rec. 709) content to a wide-gamut display is a mystery. And I don't know who to ping on this issue. (Note this is quite different from HDR tone mapping, even if they can achieve similar objectives.)
     
  5. bitinn
    Actually, I took some time to read how Apple and Nvidia are doing this, and it looks like most of the time they provide an abstraction that supports output with negative values and values over 1, through an extended sRGB format (Apple) or a floating-point format (Nvidia driver on Windows). The driver handles the final transform to the display (the physical hardware).
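    A sketch of that "extended sRGB" idea (the function name here is illustrative, not Apple's API): the usual sRGB curve is mirrored around zero and left unclamped, so negative (out-of-gamut) and greater-than-1.0 (over-bright) linear values survive encoding, and the driver can map them onto the wider and brighter range of the physical display:

    ```python
    def linear_to_extended_srgb(c):
        """Extended-range sRGB encode (illustrative sketch).

        Applies the standard sRGB encoding to the magnitude and keeps the
        sign, without clamping -- so values outside [0, 1] stay representable.
        """
        a = abs(c)
        if a <= 0.0031308:
            enc = a * 12.92
        else:
            enc = 1.055 * a ** (1 / 2.4) - 0.055
        return enc if c >= 0.0 else -enc
    ```

    With a plain 8-bit sRGB target, both the sign and the headroom above 1.0 would be clipped away; a signed floating-point surface keeps them, which is presumably why both vendors reach for float formats here.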