
Mobile VR Water Shader for Oculus Quest/Go Single Pass Forward Rendering

Discussion in 'VR' started by ROBYER1, Sep 4, 2019.

  1. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Great thoughts, thanks, love conversations like this.

    I'm at a loss as to how to achieve fast tone mapping in-shader with URP. URP just doesn't expose its lighting in a place where I can grab the final pixel before it is rendered. What happens is you write how your pixel looks, but then they mix that with their lighting somewhere else, outside of shadergraph land. (Shadergraph isn't even the slow part; bandwidth is, as you know.) I would love to know how to grab that pixel in URP to process it.

    I guess the framebuffer technique you're talking about means I don't need to worry where Unity does their own lighting?

    Are you saying your technique is possible in URP as well? If so, then please educate me, as I would love that. My lights are URP lights at the moment and tend to blow out pretty badly, as you can imagine. If I can avoid that, it'll make lighting the game considerably easier.

    Alternatively, I could go unlit and do all the lighting myself, but that's just a bit much for one guy, I think, on top of all the other jobs to do.
     
  2. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    1,869
    I just made a fresh URP project to tinker with URP shaders. (I don't have experience with URP, but one has to start someday! :p )
    Shadergraph doesn't expose the final pixel result for modification, so it doesn't seem possible to make a custom URP shader with shadergraph that outputs a color-corrected pixel.
    Seems like a HUGE omission to me. Just exposing the final pixel output for modification would make things sooo much easier...
    I then inspected the actual generated code and it is THOUSANDS UPON THOUSANDS of lines long.
    So, yeah, not really possible to modify...
    The best bet would be to modify the original URP Lit shader instead, but I don't have time to dive into it right now (and it would break any time Unity updates it).
    The correct solution would be to expose the final generated color in shadergraph. That way we could plug a custom node in there to do the tonemap and color correction. Time for a feature request, I guess...

    This leaves us no choice but to use the framebuffer fetch post effect, which I'm not completely sure will work, but it should.
    I'll check it out in 2 to 3 hours and get back to you :)
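    For what it's worth, the curve such a post effect would apply to the fetched pixel is cheap. Here's a minimal sketch in Python of a Reinhard-style operator (the function name is made up for illustration; the real thing would be a couple of lines of HLSL operating on the fetched color):

```python
# Reinhard-style tonemap: compresses HDR values into [0, 1) per channel,
# so bright lights roll off smoothly instead of clipping to white.
# This sketches the math only; in the shader it would run on the fetched pixel.

def reinhard_tonemap(rgb, exposure=1.0):
    """Map linear HDR RGB into [0, 1) with x / (1 + x)."""
    return tuple((c * exposure) / (1.0 + c * exposure) for c in rgb)

# A blown-out highlight gets compressed instead of clipping:
print(reinhard_tonemap((4.0, 2.0, 0.5)))  # (0.8, 0.666..., 0.333...)
```

    Per pixel that's one multiply-add and one divide per channel, which is why it's a good fit for the Quest's bandwidth-bound GPU.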
     
  3. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    URP has better CPU usage than built-in and for some projects that might matter. Would really suck if it's not actually possible.

    Wonder what Unity thinks?
     
  4. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    1,869
    hippocoder likes this.
  5. hippocoder

    hippocoder

    Digital Ape

    Joined:
    Apr 11, 2010
    Posts:
    29,723
    Oh wow! I think it might be <3 Thanks for sharing, going to check that right out!
     
  6. MDADigital

    MDADigital

    Joined:
    Apr 18, 2020
    Posts:
    2,198
    You guys are talking Vulkan? I have tried for years, for YEARS, to get Unity to make Vulkan and graphics jobs stable in VR. They have even put a won't-fix label on those issues. They just don't care.
     
  7. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    1,869
    To be fair, Vulkan drivers are so finicky that there's not much Unity can do about it.
    On the Nvidia Shield Vulkan works GREAT.
     
  8. MDADigital

    MDADigital

    Joined:
    Apr 18, 2020
    Posts:
    2,198
    Do they have no say in the Khronos group?
     
  9. ray20ray

    ray20ray

    Joined:
    Aug 7, 2020
    Posts:
    33
    How does the reflection work? As many people said above, the framebuffer can't be used as a texture.
    Thank you.
     
  10. ROBYER1

    ROBYER1

    Joined:
    Oct 9, 2015
    Posts:
    1,454
    I didn't find a good way to do real reflection, but you can fake it, or take a look at the Unity Boat Attack repo: it is being given VR support, so you can see how they do it.
     
  11. ray20ray

    ray20ray

    Joined:
    Aug 7, 2020
    Posts:
    33
    Thank you! It looks like real reflection in your picture... How do you fake the reflection? Are there keywords to search? E.g., in your picture, how would you get the reflections of those cube models with the fake method? I have no idea how to do that without using the framebuffer as a texture.
    Thanks, and I'll look at Unity Boat Attack first.
     
  12. creat327

    creat327

    Joined:
    Mar 19, 2009
    Posts:
    1,756
    Has anyone got GrabPass to work on the Oculus Quest using single-pass rendering? Is there a way to simulate that feature otherwise?
     
  13. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    1,869
    Yes: use prerendered cubemaps or Unity's reflection probes. Of course, it will only reflect static objects.
    For dynamic objects, there is no solution right now.
     
  14. creat327

    creat327

    Joined:
    Mar 19, 2009
    Posts:
    1,756
    I'm googling that right now. Is there an updated Unity 2020 tutorial on how to do those cubemaps or reflection probes with a shader on Android?
     
  15. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    1,869
    Just make a reflection probe and assign a highly reflective material (maximum smoothness) to a plane, with normal maps for distortion.
     
  16. Starkium

    Starkium

    Joined:
    Oct 13, 2017
    Posts:
    2
    Hey, UE4 dev here. Picking up Quest VR dev in Unity to see what it's like compared to UE4. Sounds like ultimately you guys have the same issues we do, minus some overhead, and you have some more mature rendering features.

    Just bought VRIF, and I really want to get started on a Zelda kind of game. In UE4, I made all my materials unlit and built in a vertex lighting system. I'd like to compare apples to apples with Unity in that same kind of setup. The water shown above seems pretty decent; is it the best-looking and most performant example available, or is there a store asset anyone would suggest?
     
  17. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    1,869
    That's good to know! Thanks for the comparison.
    Personally, I just use a semi-transparent material with animated normal maps + maximum specular, reflecting prerendered reflection probes with box projection enabled.
    You can also apply some vertex displacement using the same normal maps or some height maps.
    This is the best quality you can get on a Quest currently without killing the frame rate.
    And even then you will have to deal with performance issues because of overdraw, so you have to be careful what renders beneath the water surface.
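    For anyone curious what the box projection mentioned above actually does: it intersects the reflection ray with the probe's bounding box and aims the cubemap lookup from the probe position at that hit point, so reflections line up with the room instead of floating at infinity. A rough Python sketch of that correction (assumes the shaded point is inside the box; names are illustrative, and Unity applies this for you when Box Projection is enabled on the probe):

```python
import math

def box_project(pos, refl_dir, box_min, box_max, probe_pos):
    """Corrected cubemap lookup direction for a box-projected probe:
    find where the reflection ray exits the probe's box, then look up
    the cubemap along the vector from the probe position to that point."""
    t = float("inf")
    for i in range(3):  # nearest exit distance across the three slab pairs
        d = refl_dir[i]
        if abs(d) < 1e-8:
            continue
        bound = box_max[i] if d > 0.0 else box_min[i]
        t = min(t, (bound - pos[i]) / d)
    hit = tuple(pos[i] + refl_dir[i] * t for i in range(3))
    look = tuple(hit[i] - probe_pos[i] for i in range(3))
    n = math.sqrt(sum(c * c for c in look))
    return tuple(c / n for c in look)

# Looking straight up from below the center of a 2 m cube probe:
print(box_project((0.0, -0.5, 0.0), (0.0, 1.0, 0.0),
                  (-1.0, -1.0, -1.0), (1.0, 1.0, 1.0),
                  (0.0, 0.0, 0.0)))  # (0.0, 1.0, 0.0)
```

    In the shader this reduces to a handful of ALU operations, which is why it's still affordable on the Quest.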
     
  18. Ebonicus

    Ebonicus

    Joined:
    Oct 31, 2016
    Posts:
    158
    If anyone has a working water with reflection in built in render pipeline for Oculus I will buy it.
    I am still having the OP's issue with reflections being from different angles. The performance isn't bad at all; it's just that the reflections make you cross-eyed, even when using a single centered camera. I have no clue why Unity passes different images to each eye when the center camera is used.
     
  19. Cam-Alta

    Cam-Alta

    Joined:
    Feb 3, 2023
    Posts:
    1
    Has anyone tried using spherical harmonics for depth?

    I had a random idea the other day when thinking about using SH for better fog coloring for a different issue, then a random brain spark about using per-vertex SH to encode view depth up to, say, 1m...

    Although thinking about this, surely it's been done before for subsurface scattering and other forms of translucency so I did a quick search but couldn't find much. So either I can't google well (likely), no one has thought of this before (highly unlikely), or it's just a stupid-ass idea and I don't understand SH well enough (most likely).

    There would be limitations of course.
    • Depth information would be very low frequency (blurry): any high-frequency details (sharp changes from low to high depth, etc.) would be lost, but perhaps low-frequency depth is enough to fool the eye? Pair this with some artistic constraints on the frequency of underwater geometry and players may never notice that depth isn't 100% accurate.
    • You would have to pre-process and bake the depth into the vertices of the water mesh.
    • It's a static representation of depth, but water tends to have a lot of dynamics in games, with objects entering or leaving the water volume and dynamically changing its depth, so it's not suitable for all game types.
    • You would have to set an arbitrary depth limit, but for water we generally don't need absolute depth. 0.5m to 1m of depth encoded into the surface would be more than sufficient for the majority of visual effects we would want to render anyway.
    • Baked depth would be static, so doing wave animations per vertex would invalidate that information; however, I suspect there would be ways to work around that.
    • I legit have no idea how expensive it would be to evaluate SH coefficients per vertex - perhaps the whole concept is just not feasible for the Quest GPU.
    I don't know enough about spherical harmonics to attempt this myself right now as I don't have the time to dig into the math. Perhaps others here are more familiar with SH and can try it or tell me all the reasons why this is, indeed, a stupid-ass idea (as that would also be super informative and helpful).
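    For what it's worth, here's a rough numeric sketch of the bake-and-evaluate idea using the standard 9-coefficient (band-2) real SH basis, the same basis Unity's light probes use. It's Python rather than Quest shader code; the depth function is a made-up stand-in, and the brute-force projection is the offline bake step. The runtime cost per vertex is just a 9-float dot product:

```python
import math

def sh_basis(x, y, z):
    """Real spherical harmonics basis up to band 2 (9 coefficients),
    evaluated for a unit direction (x, y, z)."""
    return [
        0.282095,
        0.488603 * y, 0.488603 * z, 0.488603 * x,
        1.092548 * x * y, 1.092548 * y * z,
        0.315392 * (3.0 * z * z - 1.0),
        1.092548 * x * z, 0.546274 * (x * x - y * y),
    ]

def project_to_sh(f, n_theta=64, n_phi=128):
    """Offline bake: integrate a directional function f(x, y, z) -> scalar
    (e.g. water depth seen along that direction, clamped to ~1 m)
    against each basis function over the sphere."""
    coeffs = [0.0] * 9
    dt = math.pi / n_theta
    dp = 2.0 * math.pi / n_phi
    for i in range(n_theta):
        theta = (i + 0.5) * dt
        for j in range(n_phi):
            phi = (j + 0.5) * dp
            x = math.sin(theta) * math.cos(phi)
            y = math.sin(theta) * math.sin(phi)
            z = math.cos(theta)
            w = math.sin(theta) * dt * dp  # solid-angle quadrature weight
            v = f(x, y, z)
            for k, b in enumerate(sh_basis(x, y, z)):
                coeffs[k] += v * b * w
    return coeffs

def eval_sh(coeffs, x, y, z):
    """Per-vertex reconstruction: a dot product of 9 floats."""
    return sum(c * b for c, b in zip(coeffs, sh_basis(x, y, z)))

# A smooth stand-in 'depth' that deepens toward -z reconstructs almost
# exactly; sharp depth steps would come back blurred, as noted above.
depth = lambda x, y, z: 0.5 - 0.4 * z
coeffs = project_to_sh(depth)
print(eval_sh(coeffs, 0.0, 0.0, -1.0))  # close to 0.9, the exact value
```

    A real bake would run this per vertex against the actual underwater geometry; trying it with a sharp depth discontinuity would quickly confirm (or kill) the blurriness concern from the first bullet.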
     
    Last edited: Apr 2, 2024