
Question: How to use a cube render texture as a cubemap in Shader Graph?

Discussion in 'Shader Graph' started by jonathannjules, Mar 4, 2021.

  1. jonathannjules

    Joined:
    Aug 26, 2020
    Posts:
    2
    Hello. I am using Camera.RenderToCubemap(RenderTexture rt) to get a real-time cubemap that I can feed into my Shader Graph shader. I am using the overload that takes a cube RenderTexture instead of a Cubemap, because the Unity docs say that is what you should use for real-time captures. However, I cannot figure out how to sample rt as a cubemap in Shader Graph: the Sample Cubemap node only accepts a "Cube (C)" input, so I cannot wire the render texture into it. It seems like there should be a way to do it, since this is the specific use case for cube render textures outlined in the docs. If someone knows how, I would love to hear it.
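    For context, a minimal sketch of the setup described above, assuming a capture camera at the probe position and a Shader Graph Blackboard Cubemap property whose reference string is "_ReflectionCube" (that name, the class name, and the resolution are placeholders). One thing worth trying: a cube-dimension RenderTexture generally cannot be dragged onto a Cubemap slot in the Inspector, but Material.SetTexture can bind it from script. Also note that whether Camera.RenderToCubemap actually renders under URP may depend on the Unity version.

    ```csharp
    using UnityEngine;
    using UnityEngine.Rendering;

    // Sketch only: capture the scene into a cube RenderTexture each frame
    // and bind it to a Shader Graph Cubemap property from script.
    public class RealtimeCubemapCapture : MonoBehaviour
    {
        public Camera captureCamera;    // camera placed at the capture position
        public Material targetMaterial; // material using the Shader Graph shader

        RenderTexture cubeRT;

        void Start()
        {
            cubeRT = new RenderTexture(256, 256, 16);
            cubeRT.dimension = TextureDimension.Cube; // makes it a cube RT
            cubeRT.Create();
        }

        void LateUpdate()
        {
            // 63 = bitmask selecting all six cubemap faces.
            captureCamera.RenderToCubemap(cubeRT, 63);

            // "_ReflectionCube" is a placeholder: use the reference string of
            // your Blackboard Cubemap property.
            targetMaterial.SetTexture("_ReflectionCube", cubeRT);
        }

        void OnDestroy()
        {
            if (cubeRT != null) cubeRT.Release();
        }
    }
    ```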

    The only reason I am even doing this is to implement locally correct cubemap reflections. This feature should already be in Unity, and it is in the Built-in Render Pipeline as the "Box Projection" option on reflection probes, but that option appears to be totally broken in URP (am I even surprised at this point?). So here I am. I almost have it working; I just need to feed a real-time updated cubemap, rendered from the capture position, into the shader.
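    For anyone attempting the same thing: the "locally correct" part is the standard box-projection correction, i.e. intersect the reflection ray with the probe's box volume and re-aim the lookup direction from the cubemap capture position at that intersection point. A sketch of that math follows (names are placeholders; in Shader Graph this would live in a Custom Function node written in HLSL, shown here in C# for readability):

    ```csharp
    using UnityEngine;

    public static class BoxProjection
    {
        // Box-projected cubemap direction: given a reflection direction at
        // worldPos, find where the ray exits the probe's axis-aligned box
        // (boxMin..boxMax) and return the direction from the capture
        // position (probePos) to that exit point.
        public static Vector3 Project(
            Vector3 worldPos, Vector3 reflectDir,
            Vector3 probePos, Vector3 boxMin, Vector3 boxMax)
        {
            // Per axis, distance along the ray to the box plane it is
            // heading toward; the nearest of the three is the exit point.
            Vector3 planes = new Vector3(
                ((reflectDir.x > 0 ? boxMax.x : boxMin.x) - worldPos.x) / reflectDir.x,
                ((reflectDir.y > 0 ? boxMax.y : boxMin.y) - worldPos.y) / reflectDir.y,
                ((reflectDir.z > 0 ? boxMax.z : boxMin.z) - worldPos.z) / reflectDir.z);
            float dist = Mathf.Min(planes.x, Mathf.Min(planes.y, planes.z));

            Vector3 hit = worldPos + reflectDir * dist;
            return hit - probePos; // sample the cubemap with this direction
        }
    }
    ```

    A zero component in reflectDir divides to infinity, which Mathf.Min then discards, so no special casing is needed; HLSL behaves the same way.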

    Side note: I also tried creating a cubemap in script and assigning the internal texture of that cubemap to my render texture. Did not seem to work. I also tried using a real time reflection probe and grabbing it's cubemap. But I do not see it generating a cubemap in realtime, only a regular texture (interestingly enough, I do see baked reflection probes generating a cubemap). If there is some other way to get any of these methods working I would be glad to take those as well.
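    On the reflection probe side note: a realtime probe only produces its cube texture after it has actually rendered, and ReflectionProbe.realtimeTexture (not the baked texture slot) is where that ends up. Something like the following is one way to force a render from script and hand the result to a material (untested sketch; the property name is a placeholder, and RenderProbe may be time-sliced, so the texture can lag a frame or two):

    ```csharp
    using UnityEngine;
    using UnityEngine.Rendering;

    // Sketch: trigger a realtime reflection probe render from script and
    // bind its cube RenderTexture to a material.
    public class ProbeGrabber : MonoBehaviour
    {
        public ReflectionProbe probe;
        public Material targetMaterial;

        void Start()
        {
            probe.mode = ReflectionProbeMode.Realtime;
            probe.refreshMode = ReflectionProbeRefreshMode.ViaScripting;
        }

        void Update()
        {
            probe.RenderProbe(); // request a (possibly time-sliced) render

            // realtimeTexture is a cube-dimension RenderTexture once the
            // probe has rendered at least once.
            if (probe.realtimeTexture != null)
                targetMaterial.SetTexture("_ReflectionCube", probe.realtimeTexture);
        }
    }
    ```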
     