
Mobile VR Water Shader for Oculus Quest/Go Single Pass Forward Rendering

Discussion in 'VR' started by ROBYER1, Sep 4, 2019.

  1. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,399
    Has anyone got any tips or examples of a working water shader for the Oculus Quest/Go or similar mobile VR platforms? I was writing my own but had issues getting depth from the camera in single pass (maybe for obvious reasons). I also tried creating a water shader in Shader Graph on LWRP, which only rendered in one eye(!).

    I am working with the built-in renderer for the time being, so anything that works with that would be neat. I'm happy to share any findings I have!
     
  2. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,399
    This is currently what I get when I add a water material that uses reflection and refraction to my scene; note how the textures differ per eye and the reflections are also offset.

    Apparently this is a bug with Unity and I will report it today.

    reflectionwrong.png
     
    JoeStrout likes this.
  3. LiminalTeam

    LiminalTeam

    Joined:
    Sep 17, 2019
    Posts:
    6
    Did you have any luck with this?
     
    jacodavis likes this.
  4. jacodavis

    jacodavis

    Joined:
    Sep 18, 2019
    Posts:
    21
    Also still looking. Aquas water is BAD, the reflections are wrong.
     
  5. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,399
    I was able to use Shader Graph in Universal RP to make wonderful-looking water, but I could never get anything to look right or work in the built-in renderer. I have reported all my issues as bugs to Unity, so hopefully they can eventually fix it. The Asset Store creators of water materials I spoke to reported their issues too, so it's either a case of moving to Universal RP or waiting for a fix in the built-in renderer.

    Personally, we cannot use Universal RP (LWRP) yet, as Fixed Foveated Rendering on the Oculus Quest is broken with it, another bug!
     
  6. darryl_wright

    darryl_wright

    Joined:
    Jul 8, 2019
    Posts:
    18
    Ugh! This is why it's so frustrating to be a developer right now. It seems as though Unity wants you to move forward with LWRP (URP), but nothing works well together.

    Did you follow a tutorial on the Universal RP water, or make it yourself from scratch? If you followed a tutorial or have other good resources on the web, could you post them?

    thanks!
     
    ROBYER1 likes this.
  7. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,399
    Follow this one, although you need a newer version of URP with a fix for the Scene Depth node. See my pinned comment about it on the video.

     
    darryl_wright likes this.
  8. darryl_wright

    darryl_wright

    Joined:
    Jul 8, 2019
    Posts:
    18
    Thanks!
     
    ROBYER1 likes this.
  9. darryl_wright

    darryl_wright

    Joined:
    Jul 8, 2019
    Posts:
    18
    @ROBYER1 Are you able to build for the Quest with 2019.3? I tried with the beta and got a "NullReferenceException: Object reference not set to an instance of an object" error at the end of the build. I was thinking "it's just me, this is a beta, etc.". So I went back to 2019.2.6 and started a new project to work on the water shader (thanks again!!). I developed the shader on the Rift, but I then wanted to see the performance on the Quest, so I switched the project to Android, etc., and built, only to get the same NullReferenceException error.

    I've replaced/rebuilt the android manifest file and done all the normal fixes. I just can't get it to compile.

    If you have seen this, how did you get around it?

    Thanks again for all of your help.
     
  10. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,399
    Hi, I appreciate your response, but a NullReferenceException could apply to a lot of things, such as a script in a scene missing a reference, or something broken in the render pipeline or the shader itself. Can you please share a screenshot of what you are seeing?
     
  11. darryl_wright

    darryl_wright

    Joined:
    Jul 8, 2019
    Posts:
    18
    Hi @ROBYER1 ,

    Sorry for my late response, I was out at the end of the week last week. Also, thanks for your willingness to help others!

    Here are some screenshots from my build. The scene is not that complex or anything. I have gone through the setup steps like always, including deleting/recreating the manifest. It appears that the failure happens near the end of the build.
     

    Attached Files:

  12. darryl_wright

    darryl_wright

    Joined:
    Jul 8, 2019
    Posts:
    18
    And here are two other screenshots: one of them is the error in the console, and when I click on that error, it leads me to "OculusBuildProcessor.cs", which apparently is an error from parsing the manifest?

    Again, thanks for any insights.
     

    Attached Files:

  13. darryl_wright

    darryl_wright

    Joined:
    Jul 8, 2019
    Posts:
    18
    Well, I found the answer. I had seen someone else having this issue on the Oculus forums, and just checked where someone had replied. I had "V2 Signing" checked in the build settings, which was causing the failure. I'm keeping this thread up and wanted to post the solution in case others have the same issue.
     
    ROBYER1 likes this.
  14. jewingo

    jewingo

    Joined:
    Oct 24, 2018
    Posts:
    4
    Weird, I had the same error trying to build with multiple versions of 2019.2... not sure what started it, but I think it happened after I added terrain. I was able to build by just using the "Build" option rather than "Build and Run" in 2019.2.10, and then moving the APK over with console commands. I left "V2 Signing" on, though, as I thought it was required for the app to be store-ready.

    edit: I still get the errors intermittently, but never that pop-up warning and generally the build is still successful.
     
    Last edited: Nov 1, 2019
  15. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,399
    Still having this issue
     
  16. Noors84

    Noors84

    Joined:
    Jul 12, 2016
    Posts:
    73
    Hello, I'm running into issues as well with the Oculus Quest. So, simple questions: is it possible, as of today:
    • with URP, to use the opaque texture?
      • 7.1.8
        Single pass: mesh isn't displayed.
        Multipass: mesh is displayed in the left eye, the right eye is black.
        In both modes, I don't get 72 fps as soon as I activate the opaque texture, even at 1/4 res.
      • 7.3.1
        Single pass: mesh isn't displayed.
        Multipass: works, slow as hell (downsized 4x).
        In both modes, I don't get 72 fps as soon as I activate the opaque texture, even at 1/4 res.
    • with built-in, to use GrabPass in single pass?
      Single pass: mesh isn't displayed in the left eye, the right eye is totally bugged.
      Multipass: works, slow but better than URP.
    Any workaround?
    They did it in Red Matter, so the Quest can handle it somehow. They used Unreal, though.
    Thanks for your answers!
     
    Last edited: Apr 8, 2020
    JoeStrout and ROBYER1 like this.
  17. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    They didn't do any in Red Matter. That's a trick.
     
    a436t4ataf and ROBYER1 like this.
  18. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,399
    I had great results for a water material using URP and Shader Graph, with some Shader Graph wizardry. No need for the opaque texture. What do you need the opaque texture for? Depth was all I needed for the depth-of-water effect (opacity).
     
    hippocoder likes this.
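A rough sketch of the depth-only water trick described above, written as built-in-pipeline HLSL for brevity (in URP/Shader Graph this is the Scene Depth node compared against the surface's own eye depth). _ShallowColor, _DeepColor and _FadeDistance are illustrative names, not anything from the thread:

```hlsl
// Hypothetical depth-fade water fragment helper (built-in pipeline macros).
// Requires the camera depth texture to be enabled.
sampler2D_float _CameraDepthTexture;
float4 _ShallowColor;  // tint at the shoreline
float4 _DeepColor;     // tint in deep water
float _FadeDistance;   // world units over which the water turns opaque

// screenPos comes from ComputeScreenPos() in the vertex shader,
// with its .w holding the surface's own eye depth.
float4 WaterDepthFade(float4 screenPos)
{
    // Linear eye depth of whatever is behind the water surface
    float sceneDepth = LinearEyeDepth(
        SAMPLE_DEPTH_TEXTURE_PROJ(_CameraDepthTexture, UNITY_PROJ_COORD(screenPos)));

    // 0 at the shoreline, 1 once the bottom is _FadeDistance away
    float fade = saturate((sceneDepth - screenPos.w) / _FadeDistance);

    return lerp(_ShallowColor, _DeepColor, fade);
}
```

The same fade value can drive the material's alpha, which is presumably the "depth of water effect (opacity)" described above.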
  19. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Probably for distortion, but you can use a cubemap to fake that, or another scheme if you wanted to totally avoid resolves.

    Projecting the distortion orthographically onto the verts along the camera direction, with a suitable falloff, can do even cheaper distortion.
     
    ROBYER1 likes this.
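One plausible reading of the cubemap fake mentioned above (a sketch, not anyone's actual implementation): sample a baked cubemap of the surroundings with a normal-perturbed view direction, so "refraction" costs one texture fetch and never reads the framebuffer. _EnvCube and _DistortionStrength are made-up names:

```hlsl
// Fake refraction via cubemap lookup; no resolve, no grab pass.
samplerCUBE _EnvCube;          // baked environment around the water
float _DistortionStrength;     // how strongly the normal bends the ray

float3 FakeRefraction(float3 viewDirWS, float3 normalWS)
{
    // Bend the view ray by the surface normal; cheaper than a physical
    // refract() and usually good enough for water at Quest framerates.
    float3 bent = normalize(viewDirWS + normalWS * _DistortionStrength);
    return texCUBE(_EnvCube, bent).rgb;
}
```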
  20. JoeStrout

    JoeStrout

    Joined:
    Jan 14, 2011
    Posts:
    9,819
    Would you consider writing the details of that up as a blog or forum post? I would find it useful, and I bet many others would too.
     
    gjf likes this.
  21. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,399
    Someone already has! This works on the Quest in URP with Single Pass :D


    If anyone has any issues, let me know!
     
    JoeStrout likes this.
  22. Noors84

    Noors84

    Joined:
    Jul 12, 2016
    Posts:
    73
    Yes, I needed it for distortion, but it would also be good to know if I can use it for a blur effect.
    For now I fake the distortion with UV noise on the textures under a transparent water plane, which I think is how it's done in Real VR Fishing. The tutorial mentioned isn't really suitable for clear water. Also, depth fade isn't working correctly in built-in single pass, the right eye is off, and I don't have the perf in multipass. Yay!
    Thanks for answering, though.
    I'm not sure how I would fake refraction with a cubemap?
    Like so?:
    https://wiki.amplify.pt/index.php?title=Unity_Products:Amplify_Shader_Editor/Refract
    I don't think it would work in my case, but it's still good to know.
     
    Last edited: Apr 14, 2020
    ROBYER1 likes this.
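The UV-noise fake described above could look roughly like this in the shader of the material *under* the water plane, so nothing ever reads the screen. A hedged sketch: _NoiseTex, _WobbleSpeed and _WobbleAmount are invented names; _Time is Unity's built-in time vector:

```hlsl
// The seabed material wobbles its own UVs with scrolling noise,
// giving the impression of refraction through the water above it.
sampler2D _MainTex;
sampler2D _NoiseTex;
float _WobbleSpeed;
float _WobbleAmount;

float4 WobblySeabed(float2 uv)
{
    // Two noise taps scrolling at different rates so the wobble
    // never visibly loops
    float n1 = tex2D(_NoiseTex, uv * 1.7 + _Time.y * _WobbleSpeed).r;
    float n2 = tex2D(_NoiseTex, uv * 2.3 - _Time.y * _WobbleSpeed * 0.7).r;
    float2 offset = (float2(n1, n2) - 0.5) * _WobbleAmount;

    return tex2D(_MainTex, uv + offset);
}
```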
  23. CGPepper

    CGPepper

    Joined:
    Jan 28, 2013
    Posts:
    147
    Did you benchmark the shader? I bet it costs like 5ms to render that shader
     
  24. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,399
    Those shaders on built-in? They don't work correctly in VR, so I didn't benchmark; instead I moved to Universal RP and made my own water shader there. There are lots of great guides on YouTube you can take bits from and simplify yourself for mobile VR, which is what I have done :)

    I have a link to an older version of my material if anyone wants it just message me.
     
    Last edited: Apr 14, 2020
    hippocoder likes this.
  25. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    The Quest is basically on par with an Xbox 360 or PS3; it just has different hardware, so things like alpha test are a big no, but transparency (water) is not necessarily slow at all.

    The problem with tiled hardware is writing to an intermediate texture and then reading from it when drawing something else. This defeats the optimisations mobile GPUs do and forces a "resolve". That means a tile has to finish completely, which is really bad for a tile-based GPU that wants to reject redundant work.

    If you avoid that when doing water shaders, it's no more expensive than a full-screen blit plus some math and bandwidth. It's not going to be a bottleneck. To put it into perspective: on iPad 1, 10 years ago, you could happily do 2-3 full-screen blits on top of your game and not pay much of a price.

    If, though, you wanted realtime distortion, you'd need to ask URP or another pipeline for a colour buffer, which is going to cost you around 2 ms just for that, and if you need depth you're also going to pay for that. These things are basically free on regular GPUs.

    The perf cost may vary, but my point is that it's not really the shader itself that will be slowing most Quest projects down; it's the architecture and decisions in the rendering loop that cause the GPU/driver to bypass hardware optimisations.

    The GPU is the Qualcomm Adreno 540, and this is actually really capable! It's fast, it could do a whole bunch of last gen console games with the right techniques.

    People need to ask Unity about Vulkan, about FFR and so on. I get that maybe it's Oculus' fault, but Oculus isn't building the engine here, and Unity asked Oculus to provide their own code rather than do it themselves. That's a Unity responsibility.

    I don't mean pitchforks, but don't assume the Quest is bad either; it's more a case of Unity not rendering as optimally as it could, and the information about working with mobile GPUs seemingly being ignored in VR.

    For Quest you really want to fully understand how tiled GPUs work. You don't need to be a scientist, but you do need to know how to avoid messing with tile resolves, which are basically the worst thing you can do, yet which I see so many people constantly doing.

    Not a rant, just peripheral information. A lot of this can also be found if you search for Oculus Connect on YouTube and go through the recommendations for shaders, read up about the GPU, etc. You may just find yourself doing nice things like this:



    (This is 100% achievable in URP now, though you may have to go lighter until the FFR is sorted out.)
     
    Dan_G and JoeStrout like this.
  26. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    1,705
    Hey @hippocoder, do you know how they did the glass refraction in Red Matter?
    I'm struggling to get a copy of the framebuffer to use as a texture in my shaders.
    I'm using built-in forward with MSAA x4, and using GrabPass only works in multipass mode and is EXTREMELY slow, as you already know.
    I have tried using command buffers and Blit, but just doing so makes the Quest render full black.
    I have tried CopyTexture too, since the Quest seems to support it, but it's not working either.
    Fetching the framebuffer is possible in Unity shaders, but you can only get the color of the pixel you are writing to (as far as I know), and that doesn't cut it for refractions.
    What's the trick in Red Matter?
     
  27. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    1,705
    also, what's the deal with FFR? I'm using it on the Quest without issues and it does render faster.
     
  28. JoeStrout

    JoeStrout

    Joined:
    Jan 14, 2011
    Posts:
    9,819
    Following up on this, I found this article that points out a lot of specific things people might do that perform poorly on Quest.

    @hippocoder, if you have other no-no's in mind that aren't mentioned there, I would love to hear them!
     
    Martin_H likes this.
  29. Noors84

    Noors84

    Joined:
    Jul 12, 2016
    Posts:
    73
    atomicjoe likes this.
  30. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,399
  31. Starmind01

    Starmind01

    Joined:
    May 23, 2019
    Posts:
    78
    I have tested the Crest URP version and it works. It has some problems with transparency in the right eye and with the UI. Does anyone have an idea or a guide for setting it up so it works?
     
  32. Dan_G

    Dan_G

    Joined:
    Dec 4, 2017
    Posts:
    27

    Wow @hippocoder, that is an awesome demo! Seems like you can do really cool stuff on Quest with the right settings and the right shaders...

    I would love to know your thoughts on the "new" Quest 2. I wonder if it is really twice as fast as the Quest 1, as the specs say... Do you know if Unity fixed the FFR in any of the latest stable versions?

    I am about to start a project on Quest 2 and it is going to be quite heavy, as it will mainly use volumetric and real-time lights, lots of shaders (not sure if I should go for PBR or unlit?) and quite a lot of post-processing (which I am a bit afraid of on Quest...).

    Thanks!
     
  33. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Try things out :)
     
    ROBYER1 likes this.
  34. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    1,705
    Forget it. Seriously.
    You still can't make a copy of the render buffer to use for post effects.
    The only post effects you can do are the ones that don't need to access any pixels other than the one currently being rendered, i.e. tone mapping and color correction. And even for those you should avoid post-render blits and apply the effects directly in the object's shader.
    Also, shading complexity is not really an issue, but sampling textures does tank the framerate considerably, so avoid lookup textures, 4K textures and PBR texture sets, and instead calculate things procedurally in the object's shader.
    Of course, this means you will have to code your own shaders, and it can be tricky.
    There is a reason why 90% of Quest titles use a flat-shaded aesthetic without textures.
    Developing a regular PC-looking game for the Quest is REALLY challenging.
     
  35. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,399
    Only if you are relying on post effects, which are expensive to render.
     
    hippocoder likes this.
  36. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Or fun if you're like me.
     
    atomicjoe likes this.
  37. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    1,705
    OK, have you been able to make a copy of the render target and use it for things that need to access random locations of the image, like simple distortion, blur, chromatic aberration or anything like that?
    I have been trying to do that on the Quest for a year now, and so far it has been impossible.
    The only post effect that works is using framebuffer fetch, and it's way slower than just applying the same effect directly in the object's shader (and it can only access the same pixel position).
    Am I missing something? I would be glad to be proven wrong, really.
     
  38. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    No, but that's not required for amazing visuals, is it? And my recommendation above is to avoid framebuffer stalls. For this to ever change, it has to happen on Unity's side, using the Vulkan renderer above all. It's not really efficient with ES, and can't ever be.

    In VR, though, we seldom need post at all. Instead, Unity really has to stop depressing me. I mean, you could have HDR buffers with tone mapping on Quest 2 today, at a reasonable cost with Vulkan thanks to tile cache locality, from what I've read, but you can't in Unity because they haven't done that.

    The company has billions, though. Just not for this.

    For me, I come from a background of simpler/older techniques and have worked carefully around the need for any post effects. I recommend people do that, or maybe someone at Unity can respond to this.
     
    Martin_H likes this.
  39. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    1,705
    I don't get it. I mean, you can do tone mapping of an HDR buffer using the framebuffer fetch technique and a blit as a post-process. Although personally I integrate the tone mapping directly inside my custom shaders and avoid the fetch cost, the overdraw cost of the blit, and the bandwidth cost of using HDR buffers (as you output the final color as an 8-bit value per channel; since you can't do glow post FX anyway, there is no need to store HDR values in the render target if you already tone map inside the shader).
    That's what I do, and it's super fast. (Don't use LUTs, just do it procedurally. It's way faster.)
    The only thing I'm missing currently is glow.
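A minimal sketch of what folding the grading into the object's shader might look like: Reinhard tonemapping plus procedural saturation and gamma, no LUT, writing plain LDR at the end. All parameter names are illustrative, not anyone's actual code:

```hlsl
// Grading baked into the fragment path: no HDR target, no fetch,
// no full-screen pass. Call this on the lit color before output.
float _Exposure;
float _Saturation;
float _Gamma;

float3 GradeInShader(float3 hdrColor)
{
    float3 c = hdrColor * _Exposure;

    // Reinhard-style tonemap: maps [0, inf) into [0, 1)
    c = c / (1.0 + c);

    // Procedural saturation instead of a LUT lookup
    float luma = dot(c, float3(0.299, 0.587, 0.114));
    c = lerp(luma.xxx, c, _Saturation);

    // Final gamma tweak, then write as ordinary 8-bit LDR
    return pow(c, _Gamma);
}
```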
     
  40. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    The problem I have with doing tone mapping inside the shader is that it doesn't really work unless you also handle all the lighting yourself, i.e. basically don't use Unity's Lit shaders, unless I'm missing something.
     
  41. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    1,705
    With the built-in renderer and surface shaders, you can define a final color function specifically for color correction that will be applied after all lighting (look up the finalcolor modifier in the surface shader docs).
    For URP, you will have to render a full-screen quad with a custom shader that fetches the framebuffer color, corrects it and rewrites it. There is an example somewhere in the docs for using framebuffer fetch, but it will only work on tile-rendering GPUs, so it will work on the Quest but not on a PC or a Mac. However, you can fall back to another shader and use a GrabPass in those cases, since performance will not be as critical :)
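For reference, a cut-down built-in-pipeline surface shader using the finalcolor modifier mentioned above; the pragma and function signature follow Unity's surface shader documentation, while the grade itself is just a placeholder:

```hlsl
Shader "Custom/FinalColorGrade"
{
    Properties { _MainTex ("Albedo", 2D) = "white" {} }
    SubShader
    {
        Tags { "RenderType" = "Opaque" }
        CGPROGRAM
        // finalcolor:<func> runs after all lighting, per Unity's docs
        #pragma surface surf Lambert finalcolor:ApplyGrade

        sampler2D _MainTex;
        struct Input { float2 uv_MainTex; };

        // Signature required by the finalcolor modifier
        void ApplyGrade(Input IN, SurfaceOutput o, inout fixed4 color)
        {
            // Placeholder grade: cheap smoothstep contrast, applied post-lighting
            color.rgb = saturate(color.rgb * color.rgb * (3.0 - 2.0 * color.rgb));
        }

        void surf(Input IN, inout SurfaceOutput o)
        {
            o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
        }
        ENDCG
    }
    FallBack "Diffuse"
}
```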
     
  42. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
  43. Dan_G

    Dan_G

    Joined:
    Dec 4, 2017
    Posts:
    27
    Emmm, I guess you are talking about both the Quest and the Quest 2? I thought the Quest 2 would be way more performant, since the hardware is much better...
     
  44. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    1,705
    Both, yes.
    It is, but it uses the same rendering technology, with the same strengths and weaknesses.
    The GPU and CPU are both faster than the Quest 1's, but you still can't do some things, like getting a copy of the framebuffer to use as a texture, as it will absolutely tank the framerate. Even Oculus warned developers against trying to do post-processing effects when the Quest 2 launched.
    There is nothing significantly different about the Quest 2 compared to the Quest 1 technology-wise, so the same best practices and restrictions apply.
    Again, there is a reason why 90% of the Quest catalogue is really sparse on textures and has no screen-space post effects.
    The Quest 2 does have higher bandwidth than the 1, though, so textures are not as much of an issue, but still.
     
  45. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Quest 2 stalls because of the current architecture with URP. With Vulkan you can have HDR buffers and read from the local tile cache during a frame without stalling, but that is not available in Unity's implementation. I got my information from Oculus.

    You should be asking why we don't have the optimal VR engine for Quest 2, and I suspect there would be no reply, for business reasons. This is not nefarious; we are currently just plain not important enough, resources-wise, to justify pumping code into a Vulkan VR renderer for tiled deferred GPUs. It is the same over in Unreal land as well.

    The problem is that, unlike in Unreal land, it's not yet possible here until the Vulkan implementation matures and whatever story between Oculus and Unity wants to be told.

    Until then I have to avoid ALL post effects and make a different-looking game if I use Unity at my level of budget, going with URP.

    If Unity staff or better-informed people want to add to or correct this, I would very much welcome that, for positive reasons.
     
  46. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    1,705
    Are you sure you're not talking about framebuffer fetch? (Down the page.)
    Framebuffer fetch is perfectly possible (I do it), but it doesn't solve the post-process issue, since you can only access the specific pixel you're writing to, with no random access to other pixels in the framebuffer, so it's only useful for color correction and custom blending, not image distortion.

    If you're not talking about framebuffer fetch, can you point me to a source on that?
     
  47. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    1,705
    Also, last time I checked, GLES3 was way faster than Vulkan on the Quest.
     
  48. hippocoder

    hippocoder

    Digital Ape Moderator

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    I don't know if I am talking about Framebuffer Fetch as used there. Specifically, I'm talking about Vulkan Sub-passes which you don't get in ES land: https://www.khronos.org/assets/uplo.../2016-vulkan-devday-uk/6-Vulkan-subpasses.pdf

    This gives us limited but useful post effects at very low cost; bloom would be possible, I think.

    As for speed, GLES3 would definitely be faster if the Vulkan engine sucked. But what if Vulkan was finished in Unity and optimised? Again, I don't think it's the biggest VR priority for Unity right now. I don't know why; I mean, Unreal shipped theirs a couple of years ago.
     
  49. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    1,705
    As much as I like Vulkan, it's still too young, and support is not great across all GPU vendors.
    Some devices, like the Nvidia Shield with its Tegra X1, have excellent Vulkan support (15% faster than GLES), but others don't, like the Snapdragon GPUs (which are the ones the Quests use).
    It will take years for Vulkan to replace GLES and, for the time being, GLES is faster and more stable in general, just because the drivers are better and more optimized.
    For the Quest in particular, GLES is really faster, and it's the official Oculus recommendation.
    Anyway, GLES and Vulkan are only the software interface the CPU uses to communicate with the GPU, so you only get the benefit of better communication between the two; they will not change the GPU's capabilities or its raw performance.

    What they're talking about here is using framebuffer fetch to make several passes, one over the other, for a final render. Since Vulkan has less overhead on the CPU-GPU communication path, this can be done much faster than with GLES, because each pass has to come from the CPU to the GPU.
    This will not help us with glow or blur, sadly. Only with color correction, tonemapping and custom blending, like I said before.

    What you can do NOW is write your own tonemap post-process effect with a custom shader, using the framebuffer fetch extension to read back the pixel you're about to overwrite and do the tonemap yourself, like I explained some posts above (and it will be as fast in GLES as it will ever be in Vulkan, because it's all on the GPU side).
    I have done it and it works. But I don't recommend it, because:

    1- Using an HDR framebuffer has a lot of overhead on the Quest 1/2, because it eats a lot of bandwidth.
    2- Using an HDR framebuffer forces Unity to render to an intermediate buffer and then dump it to the framebuffer at the end of the render, which has some overhead (shockingly not a lot, but some nonetheless).
    3- Fetching the framebuffer is not free. It's not that slow, but you pay a toll for using it.
    4- Since you can't access pixels other than the one you are currently rendering to, it's only useful for tonemapping or color correction, and those are things you can integrate inside your shader to begin with.
    EDIT:
    5- Applying a whole-screen post-process is effectively overdraw, which kills bandwidth even more.

    The thing is, it is way, WAY faster to just write custom shaders that output an already-tonemapped color to a non-HDR framebuffer than to do all of this.
    The performance is nearly double, so that's what I do.

    I only use the plain old built-in forward renderer and surface shaders because I don't like SRP, but as far as I know, in URP you can use Shader Graph to make a custom shader that outputs a tonemapped color instead of an HDR one. Just apply some gamma, brightness, contrast and color corrections before the output and you'll get the same result as using the post-processing stack tonemapper.

    You CAN'T currently do anything better with the Oculus Quest 1 or 2. Otherwise, the guys behind Red Matter would have done it already :p
     
    Last edited: Jan 21, 2021
    Dan_G likes this.
  50. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    1,705
    The thing about tonemapping in 3D engines is that it's not "real" tonemapping.
    I mean, it effectively brings an HDR color down to an acceptable level for a common LDR display, but it doesn't take into consideration the pixels around it, nor does it composite several differently exposed images, contrary to HDR photography.
    In HDR photography, the tonemapping process DOES take the image as a whole into consideration, combining different photos per zone into a final image, and that's what gives an "HDR photograph" that specific look. (Check out the Wikipedia page on HDR photography for a very comprehensive explanation.)

    In 3D engines, it only remaps an HDR value to an LDR value.
    The reason engines nowadays use HDR buffers is that you can use them to store MORE information than LDR, and that's useful for things like glow post FX in zones of the image where the luminosity is over LDR range.
    But if you can't do glow, as is the case on the Quests, there is no useful reason to use HDR buffers to begin with.
    So just ditch HDR and calculate the tonemapped color directly in the shader.
    It will be just as pretty and will be blazing fast.
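The per-pixel remap being described boils down to something like this (a sketch of a standard luminance-based Reinhard operator, not code from the thread): one HDR pixel in, one LDR pixel out, with no access to neighbours and no multi-exposure compositing:

```hlsl
// Engine-style "tonemapping" as a pure per-pixel remap.
float3 TonemapPerPixel(float3 hdr)
{
    // Rec. 709 luminance of this pixel only
    float luma = dot(hdr, float3(0.2126, 0.7152, 0.0722));

    // Reinhard: scales the color so its luminance lands in [0, 1)
    return hdr / (1.0 + luma);
}
```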