Using a texture rendered with a camera as a shader parameter

Discussion in 'General Graphics' started by FoxyShadow, Feb 28, 2018.

  1. FoxyShadow

    FoxyShadow

    Joined:
    Aug 14, 2016
    Posts:
    41
    I have a question about how to use a texture rendered by a camera as a shader parameter.
    1) I have a first camera rendering things.
    2) I have a second camera rendering a certain layer which is not rendered by the first camera.
    3) I want to add a shader effect to the first camera, using the texture rendered by the second camera as a shader parameter (rough sketch of the setup below).

    How should one get the texture from the second camera and pass it to the first camera's material as a parameter?

    Thanks for any help =)
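
    A minimal sketch of the two-camera setup described above, assuming the second camera is restricted to a single user-defined layer; the layer name "Overlay" is illustrative, not from the thread:

    Code (CSharp):
    using UnityEngine;

    // Restrict the second camera to one layer and exclude that layer from the first camera.
    public class SecondCameraSetup : MonoBehaviour
    {
        public Camera firstCamera;
        public Camera secondCamera;

        void Start()
        {
            // "Overlay" is an example layer name; any user-defined layer works.
            int overlayMask = 1 << LayerMask.NameToLayer("Overlay");

            // First camera renders everything except the overlay layer.
            firstCamera.cullingMask &= ~overlayMask;

            // Second camera renders only the overlay layer.
            secondCamera.cullingMask = overlayMask;
        }
    }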
     
  2. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343
    Make a render texture, assign it as the second camera's target texture and the material's texture parameter. You can do this in the editor by making a render texture asset, or from script at runtime. Make sure the second camera's "Depth" setting is lower than the first camera's so it renders first, otherwise you'll get the previous frame's rendered image.
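
    A minimal runtime sketch of that hookup; the texture property name "_OverlayTex" is illustrative and should match whatever the shader actually declares:

    Code (CSharp):
    using UnityEngine;

    // Create a RenderTexture at runtime, make it the second camera's target,
    // and pass it to the first camera's effect material.
    public class RenderTextureHookup : MonoBehaviour
    {
        public Camera firstCamera;
        public Camera secondCamera;
        public Material effectMaterial;   // the material whose shader takes the texture parameter

        RenderTexture rt;

        void Start()
        {
            // Screen-sized texture with a 24-bit depth buffer.
            rt = new RenderTexture(Screen.width, Screen.height, 24);

            // Second camera renders into the texture instead of the screen.
            secondCamera.targetTexture = rt;

            // Lower depth renders first, so the texture is ready when the first camera uses it.
            secondCamera.depth = firstCamera.depth - 1;

            // "_OverlayTex" is an illustrative property name.
            effectMaterial.SetTexture("_OverlayTex", rt);
        }

        void OnDestroy()
        {
            if (rt != null)
            {
                rt.Release();
            }
        }
    }

    In the editor, the equivalent is a RenderTexture asset assigned to the second camera's Target Texture slot and to the material's texture slot.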
     
  3. FoxyShadow

    FoxyShadow

    Joined:
    Aug 14, 2016
    Posts:
    41
    Thank you again!
    But, by the way, I've got one other small problem. The second camera renders transparent objects, and I want to get the texture from it with transparency. Setting the second camera's background color to 0,0,0,0 behaves strangely. Is there a way to achieve that?
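
    For reference, a minimal sketch of the transparent-background attempt described above: the second camera clears to a solid color with zero alpha and renders into an ARGB32 texture so the alpha channel is kept.

    Code (CSharp):
    using UnityEngine;

    // Clear the second camera to a fully transparent color and render into a
    // texture format that stores an alpha channel.
    public class TransparentOverlayCamera : MonoBehaviour
    {
        public Camera secondCamera;

        void Start()
        {
            // The "0,0,0,0" background mentioned above: solid clear color with zero alpha.
            secondCamera.clearFlags = CameraClearFlags.SolidColor;
            secondCamera.backgroundColor = new Color(0f, 0f, 0f, 0f);

            // ARGB32 so the render texture actually keeps alpha.
            var rt = new RenderTexture(Screen.width, Screen.height, 24, RenderTextureFormat.ARGB32);
            secondCamera.targetTexture = rt;
        }
    }

    What ends up in that alpha channel also depends on how the transparent shaders blend into it, which may be part of why the result looks strange.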
     
  4. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343