Refraction effect - how to apply it to my grab pass

Discussion in 'Shaders' started by PixelizedPlayer, Feb 10, 2019.

  1. PixelizedPlayer

    Joined: Feb 27, 2013
    Posts: 329
    Hi,

    I'm confused about how to refract the scene behind my object when looking through it.

    I get the grab pass texture, but I don't know how to offset its UVs by the refraction ratio to distort the texture.

    How do you use the world-space vector R returned by refract() to offset the UVs of a sampler2D so that it distorts correctly? I'm having a hard time understanding how to do this in code.

    This is where I am at:

    Code (csharp):

    //fragment shader
    float3 refractionDirection = refract(-i.viewDir, i.normal, _RefractionIndex); //(camera to frag, normal, refract index)
    float3 refractionPos = i.vertObjPos.xyz - (refractionDirection * _RefractionFactor); //refracted object-space pos
    float4 refractionClipPos = UnityObjectToClipPos(refractionPos); // obj to clip
    float4 refractionScreenPos = ComputeGrabScreenPos(refractionClipPos); // convert to uv on grab pass texture

    float3 refractColor = float3(1,1,1);
    if (waterDepth >= 0) {
        refractColor = tex2Dproj(_BackgroundTexture, UNITY_PROJ_COORD(refractionScreenPos)).rgb;
    }

    return fixed4(col * refractColor, alpha);
    But I'm getting some wacky results and I'm not sure how to fix it.
    Refraction values: _RefractionIndex = 1.333



    Visual of the current effect: https://i.imgur.com/8iWdIrM.gif
     
    Last edited: Feb 11, 2019 at 5:35 AM
  2. bgolus

    Joined: Dec 7, 2012
    Posts: 5,731
    Using a refraction to offset a world or local space position for this may seem like the correct and logical path, but it’s missing one key thing.

    It’s wrong.

    I suspect you might already know this, but let’s dive in a bit.

    Let’s step back and look at this from a ray traced perspective. A ray direction is refracted by the angle of incidence and index of refraction. That’s all good, and the refract function with those inputs, assuming they’re all in object space, will give you the new ray direction. But here’s where things fall down. That ray should continue out until it hits something, not travel some arbitrary distance and then change direction to that new position’s view ray. Plus, by using object space, the object’s scale will affect the distance and direction too.

    So you could do all of this in world space instead, and then the offset would be at least consistent, but that doesn’t solve the ray direction going “wrong”.

    But you of course can’t really trace along the refracted ray direction*, most of the time the ray will be heading off screen so there’s nothing to sample. And this is just supposed to be an approximation anyways, right? The problem with doing this in either object or world space is there’s a not insignificant chance that the resulting offset position is outside the screen bounds too. That indeed is what looks to be happening in your example above. I’m going to guess you have an object that’s being scaled thus the refraction factor is going way outside the bounds of screen space.

    So, what can you do? A few things. The first one is to do all of this in world space so that object scale doesn’t have any impact. Second is to calculate the direction in world space, but apply it as an offset direction in screen UV space. Why? Because you will likely need to clamp it, or fade it out when the offset goes off screen. Alternatively you could fade to a reflection probe sample when the ray goes toward the camera or the offset goes off screen.
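    An off-screen fade might look something like this; just a sketch, with placeholder names (uvOffset and _EdgeFade aren’t from any real shader):

    Code (CSharp):

    // Sketch: fade the refraction offset to zero as the offset UV
    // approaches the screen edge, instead of sampling off screen.
    float2 uv = refractionScreenPos.xy / refractionScreenPos.w; // projective divide to 0..1
    float2 distToEdge = saturate(1.0 - abs(uv * 2.0 - 1.0));    // 0 at the border, 1 at center
    float fade = smoothstep(0.0, _EdgeFade, min(distToEdge.x, distToEdge.y));
    uvOffset *= fade; // shrink the offset rather than clamping it hard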

    Really, this usually gets solved in a super hacky way: convert the surface normal direction into screen UV space, multiply it by some small scalar, and be done. No real refraction at all.

    This old GPU Gems article skips even bothering to convert to screen space and just uses the normal map as is.
    https://developer.nvidia.com/gpugems/GPUGems2/gpugems2_chapter19.html
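    In shader terms the hack looks something like this (a sketch, not working code; _Distortion is a made-up property, and i.grabPos is assumed to come from ComputeGrabScreenPos in the vertex shader):

    Code (CSharp):

    // Sketch of the "normal in screen space times a small scalar" hack.
    float3 viewNormal = mul((float3x3)UNITY_MATRIX_V, i.worldNormal); // world to view space
    float2 offset = viewNormal.xy * _Distortion; // _Distortion is a small scalar, e.g. 0.02
    i.grabPos.xy += offset * i.grabPos.w; // scale by w so the projective divide doesn't undo it
    fixed3 refracted = tex2Dproj(_BackgroundTexture, UNITY_PROJ_COORD(i.grabPos)).rgb;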

    * Actually, you can, this is what screen space reflections do.
     
  3. PixelizedPlayer

    Joined: Feb 27, 2013
    Posts: 329
    Hi thanks for the reply.

    I have the GPU Gems article bookmarked; that was my backup plan in case this ray-casting approach failed.

    I have made much better progress which you can see here: https://i.imgur.com/EetnZic.gif

    The problem is it's applying refraction to things behind the object even when they're above the water. I tried doing a depth comparison to skip areas above the water, but it doesn't quite work.

    So someone suggested (though it wasn't easy to understand) making a render texture of only what's under the water, and then somehow putting the depth values in the alpha channel to avoid the depth buffer entirely.

    The issue is I'm not even sure how I would render a texture of objects under the water, including objects partially through the water. I haven't found much info on it for Unity, or at least on the technique they were trying to describe. I'm also not sure how it works if the water has animated waves.

    Perhaps you know what they were talking about with that?


    Also, in case you are wondering why I'm still adamantly going down this route, it's because of this WebGL demo that used the same technique: http://madebyevan.com/webgl-water/

    They used the same approach and the refraction looks damn good in my opinion, but sadly they didn't do a write-up on it other than the caustics effect.

    The current code I have:


    Code (CSharp):

    //vertex shader
    float3 worldNormal = UnityObjectToWorldNormal(v.normal);
    float3 objToEye = WorldSpaceViewDir(v.vertex); // vertex to camera in world space
    float3 refraction = normalize(refract(-normalize(objToEye), worldNormal, 1.0 / _RefractIndex)); // refract expects a normalized incident direction
    float3 objRefraction = mul((float3x3)unity_WorldToObject, refraction) * _RefractDistance; // direction back to object space, zoom scale
    float4 newvertex = UnityObjectToClipPos(float4(objRefraction, v.vertex.w));

    o.refractuv = ComputeGrabScreenPos(newvertex);
    COMPUTE_EYEDEPTH(o.refractuv.z);
    And


    Code (CSharp):

    //frag shader
    float sceneDepth = tex2Dproj(_CameraDepthTexture, UNITY_PROJ_COORD(i.refractuv)).r; // sample depth texture
    sceneDepth = LinearEyeDepth(sceneDepth); // from perspective to linear distribution

    float waterDepth = (sceneDepth - i.refractuv.z) / _FadeFactor;
    float uvDepth = saturate(waterDepth);

    fixed3 col = tex2D(_WaterDepth, float2(uvDepth * _ColorRange, 1)).rgb;
    half alpha = tex2D(_WaterDepth, float2(uvDepth, 0)).r;
    alpha = saturate(alpha + _MinimumAlpha);

    if (sceneDepth < 0) return fixed4(col, alpha); // intended to skip refraction above the water
    float3 refractColor = tex2Dproj(_BackgroundTexture, UNITY_PROJ_COORD(i.refractuv)).rgb;

    return fixed4(col * refractColor, alpha);
     
  4. bgolus

    Joined: Dec 7, 2012
    Posts: 5,731
    Like this?
    https://catlikecoding.com/unity/tutorials/flow/looking-through-water/

    It’s not perfect, but it works well enough that most people won’t notice.
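    The core trick there, roughly sketched (all the names below are placeholders, not the tutorial's actual code): if the depth sampled at the refracted UV is in front of the water surface, that sample came from something above the water, so throw away the offset and resample at the original UV.

    Code (CSharp):

    // Rough sketch of the depth-rejection trick from that tutorial.
    // screenUV, surfaceEyeDepth, _Distortion are assumed names.
    float2 uv = screenUV + worldNormal.xy * _Distortion;
    float sampledDepth = LinearEyeDepth(SAMPLE_DEPTH_TEXTURE(_CameraDepthTexture, uv));
    if (sampledDepth - surfaceEyeDepth < 0) {
        // Refracted sample is in front of the water surface (above water):
        // reject the offset and use the unrefracted UV instead.
        uv = screenUV;
    }
    fixed3 background = tex2D(_BackgroundTexture, uv).rgb;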

    Sure. It requires rendering with a custom projection matrix that places the near plane at the water surface. Real time planar reflections work the same way, just with the projection flipped. I think Half-Life 2: Lost Coast was one of the first games to use this technique. It’s pretty much never used anymore because depth rejection is good enough. At most they make a copy of the screen buffer with the above water area blacked out to do stuff like blurs. See this page:
    https://eidosmontreal.com/en/news/hitman-ocean-technology
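    Setting up that kind of clipped render in Unity can be sketched with Camera.CalculateObliqueMatrix. This is just an illustration, assuming a flat water plane at a height waterY; all the names are mine, not from any of the linked articles:

    Code (CSharp):

    using UnityEngine;

    // Sketch: clip everything above a flat water plane by moving the camera's
    // near plane onto the water surface (the same trick planar reflections use,
    // minus the flipped projection).
    public class UnderwaterClipCamera : MonoBehaviour
    {
        public Camera cam;        // the camera rendering the underwater texture
        public float waterY = 0f; // world-space height of the water plane

        void LateUpdate()
        {
            // A plane with a downward normal keeps only what's below the water.
            Vector3 normal = Vector3.down;
            Vector3 point = new Vector3(0f, waterY, 0f);

            // Express the plane in camera space, as CalculateObliqueMatrix expects.
            Matrix4x4 view = cam.worldToCameraMatrix;
            Vector3 cNormal = view.MultiplyVector(normal).normalized;
            Vector3 cPoint = view.MultiplyPoint(point);
            Vector4 clipPlane = new Vector4(cNormal.x, cNormal.y, cNormal.z,
                                            -Vector3.Dot(cNormal, cPoint));

            cam.projectionMatrix = cam.CalculateObliqueMatrix(clipPlane);
        }
    }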

    Except that’s doing actual, full-on raytracing. Raytracing a sphere and a box in a shader is fairly straightforward and cheap, so he can do real refraction and then follow the ray until it intersects the wall or sphere, falling back to a cubemap otherwise. The entire scene’s contents fit into the shader itself. You’re not raytracing; you’re doing a screen grab and displacing the UV sample.
    https://github.com/evanw/webgl-water/blob/master/renderer.js
     
  5. PixelizedPlayer

    Joined: Feb 27, 2013
    Posts: 329

    Thanks for the reply.

    Regarding depth rejection, I'm not sure why, but mine does not work at all. As you can see in my second post, I do skip where the depth value is negative, yet if you look at the gif I still get refraction occurring behind my object, as if I'm not doing any checking at all. So I'm confused: if it's the general approach, why can't I get mine to work?