
Post Process Mobile Performance : Alternatives To Graphics.Blit , OnRenderImage ?

Discussion in 'Image Effects' started by TreasureMap, Jul 1, 2016.

  1. TreasureMap

    TreasureMap

    Joined:
    Apr 23, 2014
    Posts:
    14
    I am creating an outline glow (aura) effect for a mobile game (Android) and have noticed that the cost of a Graphics.Blit is quite high. Even doing only a Blit(source, dest) and nothing else is slow (a 5-7 fps drop).

    I wanted to know if there are any methods/techniques that can reduce the frame rate hit.
    Are there any alternatives to using Graphics.Blit? (Render to a full-screen quad? Render the main camera to an RT, do the post work, then composite and blit to the screen?)

    I can manage to get around 40 FPS, but it's not an acceptable frame rate.
    If there really is no way to do post-render work on mobile, I will go a different route.

    About my setup:
    Testing on a Samsung Galaxy S5 with Android 5.
    I have a post effect script attached to the main camera.
    Main Camera does NOT render to an RT. It just renders normally.
    I am doing my post effect work in "OnRenderImage".
    I am using 2 or 3 temporary render textures at runtime (RenderTexture.GetTemporary).
    I have a color RenderTexture and a depth RenderTexture. The size is a quarter of the source RT.
    I have some CommandBuffers set up to render the models that I want to create the outline glow effect for.
    The models all get rendered to my temp RT, then the RT is blurred, then composited with the source RT.
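    For reference, the setup above could be sketched roughly like this (a minimal, untested sketch; the class, material, and renderer names are placeholders, not my actual code):

    Code (CSharp):
    // Rough sketch: draw the glow objects into a quarter-resolution
    // temporary RT via a CommandBuffer, ready to be blurred and
    // composited with the source in the post pass.
    using UnityEngine;
    using UnityEngine.Rendering;

    public class OutlineGlowSketch : MonoBehaviour
    {
        public Renderer[] glowTargets;
        public Material silhouetteMaterial; // flat-colour silhouette pass

        CommandBuffer cb;

        void OnEnable()
        {
            cb = new CommandBuffer { name = "Outline Glow" };
            int id = Shader.PropertyToID("_GlowRT");
            cb.GetTemporaryRT(id, Screen.width / 4, Screen.height / 4, 16);
            cb.SetRenderTarget(id);
            cb.ClearRenderTarget(true, true, Color.clear);
            foreach (var r in glowTargets)
                cb.DrawRenderer(r, silhouetteMaterial);
            // blur passes + composite with the source RT would follow here
            GetComponent<Camera>().AddCommandBuffer(CameraEvent.BeforeImageEffects, cb);
        }

        void OnDisable()
        {
            GetComponent<Camera>().RemoveCommandBuffer(CameraEvent.BeforeImageEffects, cb);
            cb.Release();
        }
    }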
     
    Last edited: Jul 1, 2016
  2. PrisedRabbit

    PrisedRabbit

    Joined:
    Aug 14, 2012
    Posts:
    62
    For mobile, you should render everything to a RenderTexture, draw that texture with a GUITexture (or similar) using another camera, and then do your magic in:

    Code (CSharp):
    void OnPreRender() {
        // "output" is a placeholder for the destination RenderTexture
        Graphics.Blit(renderTexture, output, material);
    }
     
    IgorAherne and UnityLighting like this.
  3. colin299

    colin299

    Joined:
    Sep 2, 2013
    Posts:
    181
    Graphics.Blit is just a convenience call that renders a full-screen quad. It will not be the big problem.
    I don't use OnRenderImage(...)
    Code (CSharp):
    void OnRenderImage(RenderTexture src, RenderTexture dest)
    {
        Graphics.Blit(src, dest);
    }
    because if you did not supply a RenderTexture to the camera's targetTexture, Unity will trigger a CPU ReadPixels (reading data back from the GPU), which stalls the whole GPU until it finishes. Super slow, don't do this.

    What you can do is:
    Code (CSharp):
    RenderTexture myRenderTexture;
    void OnPreRender()
    {
        myRenderTexture = RenderTexture.GetTemporary(width, height, 16);
        camera.targetTexture = myRenderTexture;
    }
    void OnPostRender()
    {
        camera.targetTexture = null; // null means framebuffer
        Graphics.Blit(myRenderTexture, null as RenderTexture, postProcessMaterial, postProcessMaterialPassNum);
        RenderTexture.ReleaseTemporary(myRenderTexture);
    }
    I can build combined bloom image effects running at 60 fps on a Galaxy S2 and Note 2 (built with Unity 5.3.4).

    So I guess your S5 can do better.

    If you are still not reaching 60 fps, try the following:
    - use lower-resolution RenderTextures
    - lower your shader precision
    - when sampling a texture in the fragment shader, use the UV coordinates passed directly from the vertex shader, because a "dependent texture read" is slower.
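    For the first point, a sketch of what the lower-resolution allocation could look like in OnPreRender (assuming the same fields as the snippet above; halving both dimensions is just an example):

    Code (CSharp):
    void OnPreRender()
    {
        // Rendering at half width/height quarters the pixel count,
        // which usually cuts the blit/post-process cost significantly.
        int w = Screen.width / 2;
        int h = Screen.height / 2;
        myRenderTexture = RenderTexture.GetTemporary(w, h, 16);
        camera.targetTexture = myRenderTexture;
    }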
     
    Last edited: Aug 21, 2016
    zhuhaiyia1, Dance_M, kyuskoj and 16 others like this.
  4. kru2z

    kru2z

    Joined:
    Feb 3, 2017
    Posts:
    2
    Could you post the full MonoBehaviour script? Or perhaps you have your project somewhere on GitHub? I was trying your method and it works on PC but fails on Android. Are there any specific deployment settings that I should consider? I have been struggling with this problem for several days now. Please help! Thank you.

     
  5. colin299

    colin299

    Joined:
    Sep 2, 2013
    Posts:
    181
    What's your result on Android? Does "fails on Android" mean a fully black/pink screen?
     
  6. BonyYousuf

    BonyYousuf

    Joined:
    Aug 22, 2013
    Posts:
    110
    @colin299 Could you please share your code or the project so that we could learn from you.
     
  7. kru2z

    kru2z

    Joined:
    Feb 3, 2017
    Posts:
    2
    Hi, sorry - I didn't see the notification of your post.
    The myRenderTexture buffer in the OnPreRender function was empty. In other words, the camera image was black. The only thing that happened was that the shader effect in the OnPostRender() function was applied to black pixels. But then I solved it when I removed the OnRenderImage function completely.

    My biggest problem is that this solution does not work in stereo mode with Vuforia or the GVRViewer from Google :(.
    When using GVRViewer, the image is displayed with no shader applied.
     
  8. Reanimate_L

    Reanimate_L

    Joined:
    Oct 10, 2009
    Posts:
    2,788
    you mean something like i.texcoord?
     
  9. colin299

    colin299

    Joined:
    Sep 2, 2013
    Posts:
    181
    Sorry, I don't have experience in VR (only mobile phones and tablets), so I can't help.
    I guess post-processing is handled specially in VR.
     
  10. colin299

    colin299

    Joined:
    Sep 2, 2013
    Posts:
    181
    Yes, you need to use the texture coordinates exactly as interpolated from the vertex shader.

    Even using i.texcoord.zw will count as a dependent texture read.
     
    zhuhaiyia1 likes this.
  11. Rusfighter

    Rusfighter

    Joined:
    Jan 18, 2014
    Posts:
    60
    Do you know if this works in the deferred rendering path? It only works in the forward path for me (OnRenderImage works in both).

    Any fix?

     
  12. Sparrowfc

    Sparrowfc

    Joined:
    Jan 31, 2013
    Posts:
    100
    Genius! I used to render directly to the framebuffer for post effects, even though I know it's less efficient than using a render target, because it's easier to handle switching post effects on/off on different mobile hardware. This totally solves that!
     
  13. Lex-DRL

    Lex-DRL

    Joined:
    Oct 10, 2011
    Posts:
    139
    @colin299
    Sorry for re-opening the thread, but... are you sure? From my experience, a dependent texture read is any modification of texture coordinates in the pixel shader.

    So tex2D(_Tex, i.texcoord + 0.0h) IS, in fact, a dependent texture read, while tex2D(_Tex, i.texcoord.xy) is not.

    Am I wrong?
     
    Last edited: Oct 31, 2017
    hippocoder likes this.
  14. brianasu

    brianasu

    Joined:
    Mar 9, 2010
    Posts:
    369
    nat42 likes this.
  15. Lex-DRL

    Lex-DRL

    Joined:
    Oct 10, 2011
    Posts:
    139
    Hm... it's odd.
    When I tested that, I simply forced #pragma target 2.0 and performed texture reads with different arguments. Shader Model 2.0 has a hard limit of 4 dependent texture reads at maximum.
    So if I can perform, let's say, 8 texture reads and the shader compiles successfully, it means all of them count as regular texture reads, not as dependent ones. Otherwise you'll get a compilation error saying "The maximum of 4 texture indirections is reached" or something like that.
    I got no error when I performed a swizzle, while I did get the error when I did a type cast (which is bad for a fragment shader anyway).

    But those tests were a long time ago, and Unity has changed the shader compiler a few times since then. So I may need to re-check it on an up-to-date version of Unity.
     
  16. brianasu

    brianasu

    Joined:
    Mar 9, 2010
    Posts:
    369
    Is there a difference between dependent texture reads vs texture indirections? Maybe they are counted differently.
     
  17. Lex-DRL

    Lex-DRL

    Joined:
    Oct 10, 2011
    Posts:
    139
    So far I haven't managed to find a good clarification of the difference between a texture indirection and a dependent read.
    Some people say one thing; others use the two terms as equivalents in their books.
    Since even Apple calls swizzling a "dependent read" (which definitely has nothing to do with sampling texture A by values from texture B), I assume "texture indirection" = "dependent read". Which also makes sense.

    But I may be wrong and would appreciate it if someone corrected me.
     
  18. DominoM

    DominoM

    Joined:
    Nov 24, 2016
    Posts:
    460
    There have been a few different circumstances described as a "dependent read" that I've come across. My understanding is that it's anything that breaks texture prefetching (which applies to both vertex and pixel shaders) on the graphics card. So if you modify the UVs in the pixel shader, it's a dependent read. If you modify them in the vertex shader it's not, unless you also use them there rather than just passing them on to the pixel shader; in that case, you break texture prefetching for the vertex shader. Basically, it's doing anything that means the prefetched texture values aren't correct when you use them, causing another read.
     
  19. 1ht1baron

    1ht1baron

    Joined:
    Aug 15, 2017
    Posts:
    65
    What should I select as targetTexture? I made changes in BlurOptimized.cs. Is everything right in my script? Also, Hidden/Fastblur is chosen as the Blur Shader. Is this right?

    Code (CSharp):
    public class BlurOptimized : PostEffectsBase
    {
        RenderTexture myRenderTexture;
        public RenderTexture targetTexture;
        Camera camera;
        private int width, height, postProcessMaterialPassNum;
        private Material postProcessMaterial;
        [Range(0, 2)]
        public int downsample = 1;
        public enum BlurType {
            StandardGauss = 0,
            SgxGauss = 1,
        }
        [Range(0.0f, 10.0f)]
        public float blurSize;
        [Range(1, 4)]
        public int blurIterations = 2;
        public BlurType blurType = BlurType.StandardGauss;
        public Shader blurShader = null;
        private Material blurMaterial = null;

        public override bool CheckResources () {
            CheckSupport (false);
            blurMaterial = CheckShaderAndCreateMaterial (blurShader, blurMaterial);
            if (!isSupported)
                ReportAutoDisable ();
            return isSupported;
        }
        public void OnDisable () {
            if (blurMaterial)
                DestroyImmediate (blurMaterial);
        }
        void OnPreRender()
        {
            myRenderTexture = RenderTexture.GetTemporary(width, height, 16);
            camera.targetTexture = myRenderTexture;
        }
        void OnPostRender()
        {
            camera.targetTexture = null; // null means framebuffer
            Graphics.Blit(myRenderTexture, null as RenderTexture, postProcessMaterial, postProcessMaterialPassNum);
            RenderTexture.ReleaseTemporary(myRenderTexture);
        }
    }
     
    Morwin25 likes this.
  20. buFFalo94

    buFFalo94

    Joined:
    Sep 14, 2015
    Posts:
    273
    @colin299 what filter are you using for your bloom shader: Box, Gaussian, or Kawase?
     
  21. buFFalo94

    buFFalo94

    Joined:
    Sep 14, 2015
    Posts:
    273
    I've managed to get the optimized bloom to work with OnPreRender and OnPostRender, but I don't like the result given by the optimized bloom shader. I want something more ambitious, more elegant.
     
  22. yasirkula

    yasirkula

    Joined:
    Aug 1, 2011
    Posts:
    2,858
    I can't blit to screen in OnPostRender in Unity 2017.3.0p2. There are no error messages but the post-processing effect just won't work. It works fine while using OnRenderImage. Can anyone verify that @colin299's solution still works in latest Unity versions?
     
  23. buFFalo94

    buFFalo94

    Joined:
    Sep 14, 2015
    Posts:
    273
    It works for me on 2017.3.1p4.
     
  24. yasirkula

    yasirkula

    Joined:
    Aug 1, 2011
    Posts:
    2,858
    Can I see a short snippet from your code, which includes the relevant parts of OnPreRender and OnPostRender?
     
  25. buFFalo94

    buFFalo94

    Joined:
    Sep 14, 2015
    Posts:
    273
    I did not change the original snippet shared by @colin299. All I did was change variable names. If you need help, tell us more about the issue you are encountering.
     
  26. yasirkula

    yasirkula

    Joined:
    Aug 1, 2011
    Posts:
    2,858
    For example, I'm trying to optimize this Black & White effect (source):

    Code (CSharp):
    public float intensity;
    private Material material;

    void Awake () { material = new Material( Shader.Find("Hidden/BWDiffuse") ); }

    void OnRenderImage (RenderTexture source, RenderTexture destination)
    {
        if (intensity == 0)
        {
            Graphics.Blit (source, destination);
            return;
        }

        material.SetFloat("_bwBlend", intensity);
        Graphics.Blit (source, destination, material);
    }
    When I change the code like this, it just stops working:

    Code (CSharp):
    public float intensity;
    private Material material;
    RenderTexture myRenderTexture;
    Camera cam;

    void OnEnable ()
    {
        material = new Material( Shader.Find("Hidden/BWDiffuse") );
        cam = GetComponent<Camera>();
    }

    void OnPreRender()
    {
        myRenderTexture = RenderTexture.GetTemporary( Screen.width, Screen.height, 16 );
        cam.targetTexture = myRenderTexture;
    }

    void OnPostRender()
    {
        cam.targetTexture = null; // null means framebuffer

        if( intensity == 0 )
            Graphics.Blit( myRenderTexture, null as RenderTexture );
        else
        {
            material.SetFloat( "_bwBlend", intensity );
            Graphics.Blit( myRenderTexture, null as RenderTexture, material );
        }

        RenderTexture.ReleaseTemporary( myRenderTexture );
    }
    Am I doing something wrong, maybe? There are no other post-processing effects on my camera or in my scene. I've tried changing the value of 16 in GetTemporary but it didn't change anything. And I can verify that intensity is not 0. There are no error/warning/log messages.
     
  27. yasirkula

    yasirkula

    Joined:
    Aug 1, 2011
    Posts:
    2,858
    I've tried the same modified B&W script on 5.6.2f1 and 2017.2.0f3; it didn't work on those versions either.
     
  28. buFFalo94

    buFFalo94

    Joined:
    Sep 14, 2015
    Posts:
    273
    Everything seems to be correct, so try using the forward rendering path and unchecking HDR and MSAA.
     
  29. yasirkula

    yasirkula

    Joined:
    Aug 1, 2011
    Posts:
    2,858
    I've finally resolved the issue. After adding the image effect to a new camera object in a new scene and disabling HDR and MSAA, it worked!

    So, why did it not work for an existing camera in my scene, even though both HDR and MSAA are disabled? The answer is, if forceIntoRenderTexture is enabled on the camera, it will prevent OnPreRender/OnPostRender from working. I didn't enable this setting myself, so something probably triggered it automatically. To disable this property, I've added this to the OnEnable function:
    cam.forceIntoRenderTexture = false;
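    In context, the whole fix is just one extra line in OnEnable (a sketch based on the modified B&W script earlier in the thread):

    Code (CSharp):
    void OnEnable ()
    {
        material = new Material( Shader.Find("Hidden/BWDiffuse") );
        cam = GetComponent<Camera>();
        cam.forceIntoRenderTexture = false; // when enabled, this prevented OnPreRender/OnPostRender from working
    }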
     
    Last edited: Jun 9, 2018
  30. PrisedRabbit

    PrisedRabbit

    Joined:
    Aug 14, 2012
    Posts:
    62
    DavidSWu likes this.
  31. Kumo-Kairo

    Kumo-Kairo

    Joined:
    Sep 2, 2013
    Posts:
    343
    No, a command buffer is a purely C# thing; it doesn't even know which shaders you use for rendering. It's just a convenient and clean way of defining your graphics pipeline.
     
    Last edited: Jun 15, 2018
  32. Kumo-Kairo

    Kumo-Kairo

    Joined:
    Sep 2, 2013
    Posts:
    343
    Btw, MSAA x4 is almost free on tile-based mobile GPUs, as it's hardware accelerated. Some Mali GPUs can even go as far as MSAA x16 in hardware.
    The "popular" non-tile-based GPUs are all the Tegras up to the K1 (can't say for sure about the X1). And those GPUs are either powerful enough to run "fakey" MSAA (like the K1) or too rare to care about (Tegra 3, Tegra 4).
     
  33. buFFalo94

    buFFalo94

    Joined:
    Sep 14, 2015
    Posts:
    273
    Yep, but even with MSAA unchecked you can use MSAA. I didn't say he can't; we all know MSAA is almost performance-free on tile-based GPUs.
     
    DavidSWu and Kumo-Kairo like this.
  34. DavidSWu

    DavidSWu

    Joined:
    Jun 20, 2016
    Posts:
    183
    With the new Lightweight Render Pipeline, they don't allow you to use MSAA if you have it unchecked. I am not sure why they put that restriction in; you can edit the code and change that if you really want it.
    We do our 3D rendering to a relatively low-res RT with MSAA, then post-process to the frame buffer at a higher resolution with no MSAA, then draw the UI at the higher resolution.
     
  35. DavidSWu

    DavidSWu

    Joined:
    Jun 20, 2016
    Posts:
    183
    How do you use command buffers to make post-processing more efficient?
    I can see using compute shaders for things that require multiple samples that can be reused, like blur, bloom, DOF, etc., but I am not sure where you gain efficiency with command buffers.
     
  36. buFFalo94

    buFFalo94

    Joined:
    Sep 14, 2015
    Posts:
    273
    Okay, but I haven't had time to test the LW pipeline; we haven't upgraded our current project yet, so I will test it ASAP. By the way, we do the same: we render the UI at full resolution and the scene at a lower resolution (if needed to maintain the frame rate).
     
  37. Kumo-Kairo

    Kumo-Kairo

    Joined:
    Sep 2, 2013
    Posts:
    343
    Command buffers used for post-processing are just a way to make the pipeline more concise (you can pass a command buffer around, adding stuff in different places). It doesn't really have any performance benefit in this context, as all you usually need from it only mirrors the Graphics or GL stuff (like Graphics.Blit, Graphics.DrawMeshNow for custom blitting, Material.SetPass, etc. - general OpenGL-ish state machine stuff). It's also a bit easier to convert a custom post-processing pipeline to be usable as a part of the PP Stack V2 (which is required to make it work with LWRP without customizing the LWRP code itself). But other than that, it doesn't matter whether you use a command buffer for your PP pipeline or not.
     
    swanickj and DavidSWu like this.
  38. DavidSWu

    DavidSWu

    Joined:
    Jun 20, 2016
    Posts:
    183
    Thanks, very helpful!
     
  39. bluescrn

    bluescrn

    Joined:
    Feb 25, 2013
    Posts:
    641
    Is anyone still using OnPreRender/OnPostRender for mobile post-processing, and can you get it to work with multisampling?

    Pretty sure that back in 2017.x this all worked fine, but I'm trying to reimplement something similar in 2018.4.

    I can get it to work in the editor (as long as camera.allowMSAA = false, and only the camera's render texture is set to use MSAA), but on iOS (Metal) it's behaving strangely: it looks like the MSAA isn't being resolved before my post-processing code tries to sample the texture, so I'm sampling a pure black texture - yet somehow the non-post-processed image is being blitted to the screen.

    It all works fine with MSAA disabled, though.

    edit:

    After a bit of experimenting, I realised that you can use OnRenderImage to postprocess from a camera-attached RenderTexture straight to the framebuffer if you just do this:

    Code (CSharp):
    void OnPreRender()
    {
        m_camera.targetTexture = m_primaryRT;  // Can be reduced-resolution, and/or multisampled
    }

    void OnPostRender()
    {
        m_camera.targetTexture = null;
    }
    So OnRenderImage can be used while working with a reduced-resolution main render target without extra blitting, something I didn't think was doable. This approach seems to behave on iOS, and the frame debugger doesn't seem to show any non-essential blitting, at least in the editor.
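    For anyone copying this, the matching OnRenderImage is then just a normal blit (sketch; m_material stands in for whatever post-process material you use):

    Code (CSharp):
    void OnRenderImage(RenderTexture src, RenderTexture dest)
    {
        // src is the camera's target (m_primaryRT, already MSAA-resolved);
        // dest ends up being the framebuffer, since targetTexture was
        // cleared in OnPostRender before this callback runs.
        Graphics.Blit(src, dest, m_material);
    }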
     
    Last edited: Dec 9, 2019
  40. eliphat

    eliphat

    Joined:
    Jul 4, 2017
    Posts:
    48
    Hi everyone! I'm using OnPreRender/OnPostRender to do post-processing. However, it seems that there is a 'Grab Render Texture' call in the Frame Debugger which discards all the content that previous cameras have rendered. (I'm using several cameras and I only apply the script to the last camera.)
     
  41. nindim

    nindim

    Joined:
    Jan 22, 2013
    Posts:
    130
    Hey @bluescrn,

    You mention OnRenderImage, but then the code shows OnPostRender; I'm a bit confused. Can you elaborate, please?

    Thanks,

    Niall

     
  42. bluescrn

    bluescrn

    Joined:
    Feb 25, 2013
    Posts:
    641
    That code was in addition to the OnRenderImage code - IIRC it was just a bit of a workaround to get the postprocessing working together with MSAA on iOS. (for a now-abandoned prototype, so I've not done anything with it recently)