
What's the point of TVs having HDR when it's implemented through software?

Discussion in 'General Graphics' started by iSinner, Aug 3, 2019.

  1. iSinner

    iSinner

    Joined:
    Dec 5, 2013
    Posts:
    201
    The engine produces an HDR image; if it is then tonemapped, monitors can properly display it, so we see an image whose colors aren't "cut off" by being too white.

    Now, as far as I understand, TVs are trying to display HDR without tonemapping. Is this correct? If yes, then why? What's the point? Wouldn't tonemapping solve the issue through software, rather than trying to solve it hardware-wise?

    I am kind of confused by this hardware HDR tech TVs are trying to do, when it was already done back in 2006 by Half-Life 2 via software.

    Either I am missing something, or this technology is rather useless.
     
  2. fffMalzbier

    fffMalzbier

    Joined:
    Jun 14, 2011
    Posts:
    3,276
    HDR brings a larger range of lighting information to you. With new hardware that is bright enough, you can finally see the "more true" illumination values instead of having the software remap them to the smaller space a non-HDR monitor can display.
     
  3. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,329
    HDR means High Dynamic Range. Really this just means "more than the range of SDR" (aka Standard Dynamic Range). Both ranges are totally arbitrary, though SDR was defined as between 0 and 100 nits, as that was the common brightness for CRT TVs at the time the standards were written. Most modern computer monitors are between 300 and 400 nits, and your basic non-HDR LCD TVs from the last decade are usually a little brighter, up to 500~600 nits at their brightest settings. This basically means all video content for the last 20 years has been designed to look correct on displays that are significantly less bright than the modern devices we've been viewing them on.

    You've got the right idea on tonemapping. Tonemapping exists to map HDR images to an SDR image without losing all the detail in the brightest areas. If you just display an HDR image on an SDR display without that, any bright areas of the image would be blown out and appear white, and that would be most of the image. With tonemapping you can rescale the image's brightness range, as well as adjust the curves of the brightest parts. There are many ways to go about this, like just rescaling the whole image, or selectively scaling parts of the image differently from others. It can also be used to make dark scenes brighter.
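    As a rough sketch of that idea (toy numbers, nothing engine-specific; the simple Reinhard curve just stands in for whatever operator a real pipeline uses):

    Code (Python):

        # Toy example: compress HDR luminance (0..infinity) into the SDR range (0..1).
        # Reinhard is one of the simplest tonemapping operators; real pipelines use
        # fancier curves (filmic, ACES, ...), but the idea is the same.

        def reinhard(hdr_value):
            """Tonemap an HDR luminance value into 0..1 without hard clipping."""
            return hdr_value / (1.0 + hdr_value)

        def clip(hdr_value):
            """No tonemapping: everything above 1.0 blows out to white."""
            return min(hdr_value, 1.0)

        for v in [0.1, 0.5, 1.0, 4.0, 50.0]:
            print(f"HDR {v:5.1f} -> clipped {clip(v):.3f}, tonemapped {reinhard(v):.3f}")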

    HDR video may still apply some tonemapping, but it can have a greater range of brightness within a single image. A good example would be a sunny outdoor day. Most of the image may look almost exactly the same between the SDR and HDR versions of the same video as the image is still being tonemapped to be at a comfortable viewing brightness, but the extremely bright areas can actually be super bright, like clouds, or specular highlights, or the sun itself. These are things that would generally get blown out in the SDR image. The easiest way to think of it is if an SDR image has a range of "0.0 to 1.0", then an HDR image can have a range of "0.0 to 1000.0", but most of the "0.0 to 1.0" range from both look the same.

    HDR TVs, or displays that can accept an HDR video signal, will take that HDR image and apply tonemapping to it in real time to match that display's brightness output. No consumer TV out there can yet display the full range of an HDR video signal, so instead they use tonemapping to gracefully remap the >1.0 ranges, attempting to retain some detail and letting the rest blow out. This means if you have a high-end HDR TV, the sun can be really, really bright. Nowhere near as bright as the real sun (the brightest of bright consumer TVs are ~2,000 nits, but the actual sun is ~1.5 billion nits), but noticeably brighter than the rest of the scene. If you have a cheaper HDR TV or monitor, it probably won't look any different than before; it just means it can accept and display the new video standard, tonemapping it down to whatever brightness level it is capable of. While not talked about much, it also means more detail in the darker parts of the screen, as a side effect of the HDR video formats having a greater range and precision of values they can represent.
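    To make that display-side step concrete, here's a rough sketch with made-up numbers (not any TV's actual algorithm): an HDR signal that can describe up to 1,000 nits being remapped to panels with different peak brightnesses.

    Code (Python):

        # Toy display-side tonemap: the signal can describe up to ~1000 nits, but the
        # panel only reaches `peak` nits. Below a "knee" values pass through unchanged;
        # above it they roll off smoothly toward the panel's peak instead of clipping.

        def display_tonemap(nits_in, peak, knee_fraction=0.75):
            knee = peak * knee_fraction
            if nits_in <= knee:
                return nits_in                      # dark and midtone detail untouched
            overshoot = nits_in - knee
            headroom = peak - knee
            return knee + headroom * (overshoot / (overshoot + headroom))

        for peak in (600, 1000, 2000):              # cheap, mid-range, high-end panel
            mapped = [round(display_tonemap(n, peak)) for n in (100, 500, 1000)]
            print(f"{peak}-nit panel: 100/500/1000-nit signal -> {mapped} nits")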

    Basically, HDR TVs and HDR video are trying to make use of the additional brightness that modern displays have to actually display a greater range of values, rather than just displaying the original image brighter.


    HDR rendering means something a little different. Technically it's the same idea of storing more data than just "0.0 to 1.0", and then tonemapping the resulting image. But just because something does HDR rendering doesn't mean it outputs HDR video. Unity for example does not yet support HDR displays / video output, only HDR rendering. So while it internally has all the HDR data, it always outputs an SDR image in the end, tonemapped or not.
     
  4. iSinner

    iSinner

    Joined:
    Dec 5, 2013
    Posts:
    201
    Very happy to receive such a lengthy explanation, helps a lot.

    So another way to think about it is that it's analogous to increasing the number of bits to represent more colours (or more shades of gray in a single color channel).

    HDR TVs are capitalizing on their higher range of brightnesses to increase the number of displayed brightness levels - did I get that right?

    AFAIK the human eye sees brightness non-linearly, which is why displays do gamma correction, so that when the data says a brightness of 0.5, we perceive it as 0.5 after gamma correction.

    And going by how gamma correction is done (it expands the darker shades and compresses the brighter ones, if you imagine a linear gradient next to a gamma-corrected one), our eyes perceive the brighter shades poorly, so HDR TVs in theory should greatly improve the PERCEIVED number of shades from gray to black, and not so much the ones from gray to white. Is that correct?
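    Just to illustrate what I mean with rough numbers (a plain 2.2 gamma rather than the exact sRGB curve):

    Code (Python):

        # Rough illustration: a simple 2.2 gamma, not the exact sRGB transfer curve.
        # Encoding spends more of the 8-bit code values on the dark shades, which is
        # where our eyes are most sensitive.

        def encode_gamma(linear, gamma=2.2):
            """Linear light (0..1) -> gamma-encoded value (0..1)."""
            return linear ** (1.0 / gamma)

        for linear in (0.05, 0.1, 0.25, 0.5, 0.75, 1.0):
            code = round(encode_gamma(linear) * 255)
            print(f"linear {linear:4.2f} -> 8-bit code {code}")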

    I am also interested in how eye adaptation (from HDRP, for example) factors into it. It basically adjusts the width of the range it needs to tonemap depending on the brightest and darkest texels available in the screen buffer (or wherever it is).

    So for example, if I am looking at a scene in such a way that I see a bright sun and a dark cave at the same time, the brightness ranges from very dark to very bright, let's say from 0 to 8 (where 1 is white in SDR, and anything above 1 goes into the HDR range). If tonemapping then has to map the available number of shades (let's say 256 shades) to that 0 to 8 brightness range, every shade will cover about 8/256 (or 1/32, or 0.03125) of the brightness range, which isn't that great because it will go in steps of 0.03125, assuming the shades are distributed linearly over the brightness range that needs to be covered.
    Here HDR on TVs will be more noticeable because they can display more shades compared to SDR TVs.

    But if I am looking only into a cave where the brightness ranges from 0 to 0.3, then tonemapping has to map, let's say, the same 256 shades to a brightness range of 0 to 0.3, giving steps of 0.3/256 (or about 0.00117), which is way better because each brightness step is much smaller than in the 0 to 8 case. That means that while HDR TVs will technically still have more brightness levels than SDR ones, it won't be as perceivable (or at all?); in practice it isn't that noticeable, right?
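    To put rough numbers on that (just my own back-of-the-envelope sketch, and only under the linear-distribution assumption above):

    Code (Python):

        # Back-of-the-envelope step sizes, assuming code values are spread linearly
        # over the brightness range on screen. Real HDR signals are encoded with a
        # perceptual curve (PQ), so these numbers are only illustrative.

        def step_size(range_max, levels):
            return range_max / levels

        for scene, range_max in [("sun + cave", 8.0), ("cave only", 0.3)]:
            for bits in (8, 10):                    # SDR-ish vs HDR10-style bit depth
                levels = 2 ** bits
                print(f"{scene:10s} 0..{range_max:<3} at {bits:2d} bits: "
                      f"step = {step_size(range_max, levels):.6f}")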

    So my question is, does eye adaptation undermine the HDR technology present in HDR TVs (not technically, but in terms of eye perception, because that's what matters) by having a dynamic brightness range that needs to be tonemapped?

    Also, AFAIK HDR10, as I understand it, is a tech that has a fixed range of brightness to tonemap (as if without eye adaptation), while HDR10+ can change the range of brightness it has to tonemap on a frame-by-frame basis, which is like having an eye adaptation effect.

    So in a way, I am wondering if HDR10+ makes the HDR on TVs matter only in images where the needed range of brightness is very wide, as in 0 to 8, and not matter as much for images whose range of brightness is within 0 to 0.3. Am I correct here?
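    Here is the toy comparison I have in mind (purely my own sketch of the idea, not how HDR10/HDR10+ metadata actually works under the hood):

    Code (Python):

        # Toy idea: with one "static" peak for the whole video, a dim cave scene gets
        # squeezed through the same curve as the sun scene. With a per-frame
        # ("dynamic") peak, the dim scene can use more of the output range.
        # Real HDR10/HDR10+ metadata is far more involved than this.

        def tonemap(value, scene_peak):
            """Map 0..scene_peak onto 0..1 with simple linear scaling."""
            return min(value / scene_peak, 1.0)

        STATIC_PEAK = 8.0                           # one peak for the whole video
        for scene, frame_peak, sample in [("sun scene", 8.0, 4.0),
                                          ("cave scene", 0.3, 0.15)]:
            static = tonemap(sample, STATIC_PEAK)   # HDR10-style: fixed range
            dynamic = tonemap(sample, frame_peak)   # HDR10+-style: per-frame range
            print(f"{scene}: static -> {static:.3f}, dynamic -> {dynamic:.3f}")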
     
    Last edited: Aug 7, 2019
  5. cubrman

    cubrman

    Joined:
    Jun 18, 2016
    Posts:
    409
    Ok, I recently watched a Cyberpunk HDR video from YouTube on my friend's modern HDR TV and it's a frickin' game changer. The difference is not just immediately visible - it's JARRING and jumps right at you through the screen. It's not just brightness, not at all - it's some sort of magical clarity between edges of colors. The most amazing part was - there are areas in Cyberpunk which have ugly lighting on my old monitor - basically every nightclub. I hate those areas. However, when you watch them on an HDR TV they look INSANE, and you wanna LIVE THERE FOREVER.

    Here is the video I watched:


    Now here is the main question:
    How do you enable HDR mode in HDRP? So far I see that even turning off Tonemapping keeps the image in SDR, that is, the HDR-to-LDR conversion still applies. Am I wrong?
     
  6. c0d3_m0nk3y

    c0d3_m0nk3y

    Joined:
    Oct 21, 2021
    Posts:
    651
    Hi cubrman, see "Enabling HDR Output" here https://docs.unity3d.com/Packages/c...s.high-definition@14.0/manual/HDR-Output.html

    You have to distinguish between HDR rendering and HDR output. HDR rendering means that you are using a floating point render target, which may or may not be converted to SDR at the end. HDR output means that the backbuffer is floating point too and that HDR is enabled in Windows.

    And yes, I have an HDR TV with over 2000 nits of peak brightness and it looks amazing!
     
    Last edited: Oct 2, 2023
  7. cubrman

    cubrman

    Joined:
    Jun 18, 2016
    Posts:
    409
    Bless your soul, kind man! I tried googling this but was unsuccessful, thanks for providing the link! And yes, I DO understand the difference. I have worked with shaders for quite some time, and that was one of the reasons why I was initially skeptical when I heard about an "HDR TV". "Losers," thought I, "games have had that for at least a decade." Oh how blind I was)
     
  8. cubrman

    cubrman

    Joined:
    Jun 18, 2016
    Posts:
    409
    @c0d3_m0nk3y btw which TV is that? I'm currently shopping for one for myself. If you can recommend a 42-inch one I would be very grateful)
     
  9. cubrman

    cubrman

    Joined:
    Jun 18, 2016
    Posts:
    409
    @c0d3_m0nk3y yeah, one more question: is it worth chasing 2000+ nits if I can buy a TV with 850?

    Oh, I see - the one with 850 max nits is an OLED, which has its own advantages. I guess it's a tradeoff at the moment.
     
    Last edited: Oct 2, 2023