Graphics.Blit() doesn't work on iOS when a Material is used

Discussion in 'Editor & General Support' started by SantosR, Mar 28, 2014.

  1. SantosR

    Joined:
    Dec 18, 2013
    Posts:
    27
  2. Alexey

    Unity Technologies

    Joined:
    May 10, 2010
    Posts:
    1,624
    File a bug report with a repro project and drop the case number here.
     
  3. SantosR

    Joined:
    Dec 18, 2013
    Posts:
    27
    My case number is 599826.

    Thanks,
     
  4. Alexey

    Unity Technologies

    Joined:
    May 10, 2010
    Posts:
    1,624
    The error is so common I'll reply here. You have a RenderTexture *with* depth (you specified a non-zero depth in the Inspector; if you create it from code, it is a constructor parameter, or you can change it through the property).
    Now, you use the "Unlit/Transparent" shader, which, like any normal transparent shader, has "ZWrite Off ZTest LessEqual". So your blit depends on the zbuffer, AND your RT has a zbuffer, but you NEVER initialize it. If you check under Instruments on iOS, it is all 0 (black), and in that case you can never pass the "LessEqual" test (even on desktop I get a noisy result due to garbage in the zbuffer).
    So what you should do: either get rid of the RT depth (set it to 0), or tweak the shader (copy the built-in and rename it) to have "ZWrite Off ZTest Always".
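    A minimal sketch of the first fix, assuming a simple blit setup (source and transparentMat are placeholder names, not from the reporter's project):

    Code (CSharp):
    using UnityEngine;

    public class BlitWithoutDepth : MonoBehaviour
    {
        public Texture source;          // placeholder input texture
        public Material transparentMat; // e.g. a material using "Unlit/Transparent"

        void Start()
        {
            // Depth is the third constructor parameter; 0 means no zbuffer,
            // so the blit can never fail an uninitialized "LessEqual" ZTest.
            var rt = new RenderTexture(512, 512, 0);
            Graphics.Blit(source, rt, transparentMat);
        }
    }

    The alternative keeps the depth buffer and instead uses a copy of the built-in shader with its ZTest changed to "ZWrite Off ZTest Always", so the blit ignores the zbuffer entirely.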
     
    GroovyKoala likes this.
  5. SantosR

    Joined:
    Dec 18, 2013
    Posts:
    27
    Thanks for taking the time to check the mistakes I made!

    I was already using "ZWrite Off ZTest Always" in the actual shader used on the project I'm working on, but that alone didn't fix my problem; setting the depth buffer to 0 nailed it!

    I come from an ActionScript background and all this shader talk is new to me. Could you please leave me a link to a short explanation of how to initialize the depth buffer? I had no success googling for it.

    Thanks again.
     
  6. Alexey

    Unity Technologies

    Joined:
    May 10, 2010
    Posts:
    1,624
  7. SantosR

    Joined:
    Dec 18, 2013
    Posts:
    27
    I'm stuck on another problem right now, which doesn't seem to be shader related.

    Whenever I blit to a RenderTexture that isn't opaque, the alpha values won't change, only the RGB. So if I have a transparent RenderTexture, blitting onto it results in no visible changes, unless the material I apply this texture to uses a shader that doesn't support transparency.

    Is it intended to work like this? If so, how am I supposed to change a RenderTexture's alpha values?

    PS: alpha values are correctly updated if I don't pass a material.
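    A hypothetical repro of the behavior described above (source, rt, and transparentMat are assumed names):

    Code (CSharp):
    // Blitting through a transparent material updates RGB but, per the post
    // above, leaves the destination alpha untouched; blitting with no
    // material copies all four channels.
    var rt = new RenderTexture(256, 256, 0);
    Graphics.Blit(source, rt, transparentMat); // RGB changes, alpha does not
    Graphics.Blit(source, rt);                 // copies RGBA as-is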
     
  8. elhispano

    Joined:
    Jan 23, 2012
    Posts:
    52
    This worked for me. If it is such a common error, it would be great if it triggered some kind of warning or error.
     
  9. GroovyKoala

    Joined:
    Feb 16, 2022
    Posts:
    23
    Setting the RenderTexture depth to 0 after creating it solved my issue!

    Question @Alexey: why isn't a warning generated, or this fixed, in the engine? Is there a specific reason?
    Using Unity 2021.2.12f1.

    Thanks!
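    A minimal sketch of that workaround, assuming the texture has not been used yet (the depth property can only change while the RT is not yet created on the GPU; source and material are placeholder names):

    Code (CSharp):
    var rt = new RenderTexture(512, 512, 24);
    rt.depth = 0;  // drop the zbuffer before first use
    rt.Create();
    Graphics.Blit(source, rt, material);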
     
  10. Alexey

    Unity Technologies

    Joined:
    May 10, 2010
    Posts:
    1,624
    >why isn't a warning generated, or this fixed, in the engine?
    what exactly should be "generated" or "fixed" in the engine?
     
    Kurt-Dekker likes this.
  11. GroovyKoala

    Joined:
    Feb 16, 2022
    Posts:
    23
    Hi @Alexey. The RenderTexture is created using the image descriptor from the source in the OnRenderImage method. The build targets WebGL, and the RT displays a black texture when using a couple of consecutive Blit calls. It only happens when opening the app on mobile (iPhone), not on a PC laptop or Mac.

    So my question is: why should I reset the depth to 0 instead of the engine handling that?

    Thanks!
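    A hypothetical sketch of that setup under the assumptions above (firstPassMat and secondPassMat are placeholder names), stripping the depth bits from the intermediate RT via the descriptor:

    Code (CSharp):
    using UnityEngine;

    public class ConsecutiveBlits : MonoBehaviour
    {
        public Material firstPassMat;  // placeholder materials, not from the post
        public Material secondPassMat;

        void OnRenderImage(RenderTexture src, RenderTexture dest)
        {
            var desc = src.descriptor; // image descriptor taken from the source
            desc.depthBufferBits = 0;  // avoid the uninitialized-zbuffer issue on mobile
            var tmp = RenderTexture.GetTemporary(desc);
            Graphics.Blit(src, tmp, firstPassMat);
            Graphics.Blit(tmp, dest, secondPassMat);
            RenderTexture.ReleaseTemporary(tmp);
        }
    }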