Question How can you encode bit patterns in shader output without having them changed in the TempBuffer?

Discussion in 'General Graphics' started by LostBaggle, May 29, 2020.

  1. LostBaggle

    Joined:
    Jan 26, 2018
    Posts:
    1
    For a while now, I've been experimenting with different ways to use shaders in Unity's built-in render pipeline. My current goal is to store more data in my shader outputs by encoding extra values into the color channels. With this approach, I'm hoping to create a visual style with less color depth that can still produce special effects by reading the encoded bits back in post-processing.
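
    To make that concrete, here's a rough sketch of the kind of encoding I'm attempting (the property names are made up, and it assumes the value ends up in an 8-bit UNorm channel). It's the fragment program of an otherwise standard unlit shader:

    Code (HLSL):
    struct v2f
    {
        float4 vertex : SV_POSITION;
        float2 uv : TEXCOORD0;
    };

    sampler2D _MainTex;
    float _EffectFlag;   // assumed 0 or 1, set from a C# script

    fixed4 frag (v2f i) : SV_Target
    {
        fixed4 col = tex2D(_MainTex, i.uv);

        // Quantize blue to an 8-bit integer, then hide the flag in its
        // least significant bit. Post-processing would recover it with
        //   flag = fmod(floor(b * 255.0 + 0.5), 2.0)
        float b = floor(col.b * 255.0 + 0.5);
        b = floor(b * 0.5) * 2.0 + _EffectFlag;
        col.b = b / 255.0;

        return col;
    }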

    However, I've been having trouble doing this because of how the pipeline handles the data as it travels through the "Drawing" step. Regardless of the shader's output format, the intermediate RenderTarget (the TempBuffer) uses the R16G16B16A16_SFLOAT format. By the time the buffer gets passed on to the camera's image effects, the bit patterns have already been erased by conversions and optimizations along the way.
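
    For reference, a minimal probe script like this (the class name is just a placeholder of mine) shows the format the image effect receives; in my setup it reports the same R16G16B16A16_SFloat format no matter what my shaders output:

    Code (CSharp):
    using UnityEngine;

    [RequireComponent(typeof(Camera))]
    public class FormatProbe : MonoBehaviour
    {
        void OnRenderImage(RenderTexture source, RenderTexture destination)
        {
            // Logs the format of the buffer handed to image effects.
            Debug.Log(source.graphicsFormat);
            Graphics.Blit(source, destination);
        }
    }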

    Is it possible to change the data formats of these internal buffers before everything gets written out to the display? I've looked into using a command buffer, but I haven't figured out how to apply it here.
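
    For context, this is the sort of thing I've been trying with the command buffer (the camera event and target format are just guesses on my part). It copies the camera's current output into a render texture whose format I control, but it doesn't change the format of the internal TempBuffer itself:

    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Rendering;

    [RequireComponent(typeof(Camera))]
    public class GrabOpaqueOutput : MonoBehaviour
    {
        RenderTexture _grab;
        CommandBuffer _cb;

        void OnEnable()
        {
            var cam = GetComponent<Camera>();

            // A target in a format I choose (8-bit UNorm here).
            _grab = new RenderTexture(cam.pixelWidth, cam.pixelHeight, 0,
                                      RenderTextureFormat.ARGB32);

            _cb = new CommandBuffer { name = "Grab opaque output" };
            // Copy whatever the camera has rendered so far into my own texture.
            _cb.Blit(BuiltinRenderTextureType.CurrentActive, _grab);

            cam.AddCommandBuffer(CameraEvent.AfterForwardOpaque, _cb);
        }

        void OnDisable()
        {
            GetComponent<Camera>().RemoveCommandBuffer(CameraEvent.AfterForwardOpaque, _cb);
            _cb.Release();
            _grab.Release();
        }
    }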

    I'm attaching an image of my frame debugger in case my explanation of the render target isn't clear. The relevant info about the RenderTarget is in the red square.
    Attached: upload_2020-5-29_17-9-51.png (Frame Debugger screenshot)

  2. AleksiUnity

    Unity Technologies

    Joined:
    Mar 31, 2020
    Posts:
    6
    Hi.

    Currently it's not possible to change the render target formats.

    Maybe you could try binding a RWTexture2D, writing the extra data to it from your shader, and then reading it back in post-processing?
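
    Something along these lines (just a sketch; the texture format, names and UAV index are up to you):

    Code (CSharp):
    using UnityEngine;

    [RequireComponent(typeof(Camera))]
    public class ExtraDataTarget : MonoBehaviour
    {
        public RenderTexture extraData;

        void OnEnable()
        {
            var cam = GetComponent<Camera>();

            // An integer-format texture the fragment shader can write to directly,
            // so the bits never pass through the camera's float TempBuffer.
            extraData = new RenderTexture(cam.pixelWidth, cam.pixelHeight, 0,
                                          RenderTextureFormat.RInt)
            {
                enableRandomWrite = true
            };
            extraData.Create();

            // Bind it as UAV slot 1 (slot 0 is taken by the color target).
            // In the shader (needs #pragma target 5.0):
            //   RWTexture2D<int> _ExtraData : register(u1);
            //   _ExtraData[uint2(i.vertex.xy)] = packedBits;
            Graphics.SetRandomWriteTarget(1, extraData);
        }

        void OnDisable()
        {
            Graphics.ClearRandomWriteTargets();
            if (extraData != null) extraData.Release();
        }
    }

    Then your post-processing pass can read that texture to recover the exact bits.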