Post Process Mobile Performance: Alternatives to Graphics.Blit, OnRenderImage?

Discussion in 'Image Effects' started by TreasureMap, Jul 1, 2016.

  1. TreasureMap

    TreasureMap

    Joined:
    Apr 23, 2014
    Posts:
    14
    I am creating an outline glow (aura) effect for a mobile game (Android) and have noticed that the cost of a Graphics.Blit is quite high. Even doing only a Blit(source, dest) and nothing else is slow (a 5-7 fps drop).

    I wanted to know if there are any methods/techniques that can reduce the frame rate drop.
    Any alternatives to using Graphics.Blit? (Render to a full-screen quad? Render the main camera to an RT, do the post work, then composite and blit to screen?)

    I can manage to get around 40 FPS, but it's not an acceptable frame rate.
    If there really is no way to do post-render work on mobile, I will go a different route.

    About my setup:
    Testing on a Samsung Galaxy S5 with Android 5.
    I have a post effect script attached to the main camera.
    Main Camera does NOT render to an RT. It just renders normally.
    I am doing my post effect work in "OnRenderImage".
    I am using 2 or 3 temporary render textures at runtime (RenderTexture.GetTemporary)
    I have a color RenderTexture and a depth RenderTexture. The size is a quarter of the source RT.
    I have some CommandBuffers set up to render the models that I want to create the outline glow effect for.
    The models all get rendered to my temp RT, then the RT is blurred, then composited with the source RT.
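
    For reference, the CommandBuffer part of a setup like the one described above can be sketched roughly like this (all names here - glowRenderers, glowMaterial, the "_GlowRT" property - are my own guesses, not the actual project code):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Rough sketch: render the glow models into a quarter-resolution temporary RT
// via a CommandBuffer, before image effects run. The blur and composite
// passes are only indicated by a comment.
[RequireComponent(typeof(Camera))]
public class GlowCommandBufferSetup : MonoBehaviour
{
    public Renderer[] glowRenderers;  // models that should get the outline glow (assumption)
    public Material glowMaterial;     // material used to draw them into the RT (assumption)

    CommandBuffer cmd;

    void OnEnable()
    {
        cmd = new CommandBuffer { name = "Outline Glow" };

        int glowRT = Shader.PropertyToID("_GlowRT");
        // Quarter-resolution color target with a 16-bit depth buffer
        cmd.GetTemporaryRT(glowRT, Screen.width / 4, Screen.height / 4, 16,
                           FilterMode.Bilinear, RenderTextureFormat.Default);
        cmd.SetRenderTarget(glowRT);
        cmd.ClearRenderTarget(true, true, Color.clear);

        foreach (var r in glowRenderers)
            cmd.DrawRenderer(r, glowMaterial);

        // Blur + composite passes would go here (e.g. cmd.Blit with a blur material).
        cmd.ReleaseTemporaryRT(glowRT);

        GetComponent<Camera>().AddCommandBuffer(CameraEvent.BeforeImageEffects, cmd);
    }

    void OnDisable()
    {
        GetComponent<Camera>().RemoveCommandBuffer(CameraEvent.BeforeImageEffects, cmd);
        cmd.Release();
    }
}
```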
     
    Last edited: Jul 1, 2016
  2. mrbroshkin

    mrbroshkin

    Joined:
    Aug 14, 2012
    Posts:
    35
    For mobile, you should render everything to a RenderTexture, then draw that texture with a GUITexture (or similar) using another camera, and then do your magic in:

    void OnPreRender() {
        Graphics.Blit(renderTexture, output, material);
    }
     
  3. colin299

    colin299

    Joined:
    Sep 2, 2013
    Posts:
    122
    Graphics.Blit is just a convenience call that renders a full-screen quad. It will not be the big problem.
    I don't use OnRenderImage(...):
    Code (CSharp):

    void OnRenderImage(RenderTexture src, RenderTexture dest)
    {
        Graphics.Blit(src, dest);
    }

    because if you do not supply a RenderTexture to the camera's targetTexture, Unity will trigger a CPU ReadPixels (getting data back from the GPU), which stalls the whole GPU until it finishes. Super slow; don't do this.

    What you can do is:
    Code (CSharp):

    RenderTexture myRenderTexture;

    void OnPreRender()
    {
        myRenderTexture = RenderTexture.GetTemporary(width, height, 16);
        camera.targetTexture = myRenderTexture;
    }

    void OnPostRender()
    {
        camera.targetTexture = null; // null means framebuffer
        Graphics.Blit(myRenderTexture, null as RenderTexture, postProcessMaterial, postProcessMaterialPassNum);
        RenderTexture.ReleaseTemporary(myRenderTexture);
    }
    I can build combined bloom image effects running at 60 fps on a Galaxy S2 & Note 2 (built using Unity 5.3.4).

    So I guess your S5 can do better.

    If you are still not reaching 60 fps, try the following:
    - use lower-resolution RenderTextures
    - use lower shader precision
    - when sampling a texture in the fragment shader, try to use the UV directly from the vertex shader, because a "dependent texture read" is slower.
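
    Putting the two snippets together, a minimal self-contained version of this approach might look like the following (class and field names are mine, and the camera is assumed to render at screen resolution):

```csharp
using UnityEngine;

// Minimal sketch of the OnPreRender/OnPostRender post-process approach
// described above. postProcessMaterial is assumed to hold the effect shader.
[RequireComponent(typeof(Camera))]
public class SimplePostEffect : MonoBehaviour
{
    public Material postProcessMaterial;      // effect material (assumption)
    public int postProcessMaterialPassNum = 0;

    Camera cam;
    RenderTexture myRenderTexture;

    void Awake()
    {
        cam = GetComponent<Camera>();
    }

    void OnPreRender()
    {
        // Redirect the camera into a temporary RT instead of the framebuffer
        myRenderTexture = RenderTexture.GetTemporary(Screen.width, Screen.height, 16);
        cam.targetTexture = myRenderTexture;
    }

    void OnPostRender()
    {
        cam.targetTexture = null; // null means framebuffer
        Graphics.Blit(myRenderTexture, null as RenderTexture,
                      postProcessMaterial, postProcessMaterialPassNum);
        RenderTexture.ReleaseTemporary(myRenderTexture);
    }
}
```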
     
    Last edited: Aug 21, 2016
  4. kru2z

    kru2z

    Joined:
    Feb 3, 2017
    Posts:
    2
    Could you post the full MonoBehaviour script? Or perhaps you have your project somewhere on GitHub? I was trying your method and it works on PC but fails on Android. Are there any specific deployment settings that I should consider? I have been struggling with this problem for several days now. Please help! Thank you.

     
  5. colin299

    colin299

    Joined:
    Sep 2, 2013
    Posts:
    122
    What's your result on Android? Does "fails on Android" mean a fully black/pink screen?
     
  6. BonyYousuf

    BonyYousuf

    Joined:
    Aug 22, 2013
    Posts:
    104
    @colin299 Could you please share your code or the project so that we could learn from you.
     
  7. kru2z

    kru2z

    Joined:
    Feb 3, 2017
    Posts:
    2
    Hi, sorry - I didn't see the notification of your post.
    The myRenderTexture buffer in the OnPreRender function was empty. In other words, the camera image was black. The only thing that happened was that the shader effect in the OnPostRender() function was applied to black pixels. I solved it when I removed the OnRenderImage function completely.

    My biggest problem is that this solution does not work in stereo mode with Vuforia or GVRViewer from Google :(.
    When using GVRViewer, the image is displayed with no shader applied.
     
  8. Reanimate_L

    Reanimate_L

    Joined:
    Oct 10, 2009
    Posts:
    2,052
    you mean something like i.texcoord?
     
  9. colin299

    colin299

    Joined:
    Sep 2, 2013
    Posts:
    122
    Sorry, I don't have experience in VR (only mobile phones and tablets), so I can't help.
    I guess post-processing is specially handled in VR.
     
  10. colin299

    colin299

    Joined:
    Sep 2, 2013
    Posts:
    122
    Yes, you need to use the texture coordinates directly as interpolated from the vertex shader.

    Even i.texcoord.zw will count as a dependent texture read.
     
  11. Rusfighter

    Rusfighter

    Joined:
    Jan 18, 2014
    Posts:
    60
    Do you know if this works in the deferred rendering path? It works only in the forward path for me (OnRenderImage works in both).

    Any fix?

     
  12. Sparrowfc

    Sparrowfc

    Joined:
    Jan 31, 2013
    Posts:
    99
    Genius! I used to use the framebuffer directly to do the PE, even though I know it's less efficient than using a render target, because it's easier to handle switching PE on/off on different mobile hardware. This totally solved that!
     
  13. Lex-DRL

    Lex-DRL

    Joined:
    Oct 10, 2011
    Posts:
    130
    @colin299
    Sorry for re-opening the thread, but... are you sure? From my experience, a dependent texture read is any modification of texture coordinates in the pixel shader.

    So tex2D(_Tex, i.texcoord + 0.0h) IS, in fact, a dependent texture read, while tex2D(_Tex, i.texcoord.xy) is not.

    Am I wrong?
     
    Last edited: Oct 31, 2017
  14. brianasu

    brianasu

    Joined:
    Mar 9, 2010
    Posts:
    367
  15. Lex-DRL

    Lex-DRL

    Joined:
    Oct 10, 2011
    Posts:
    130
    Hm... it's odd.
    When I tested that, I simply forced #pragma target 2.0 and performed texture reads with different arguments. Shader model 2.0 has a hard limit: 4 dependent texture reads maximum.
    So if I can perform, let's say, 8 texture reads and the shader compiles successfully, it means all of them count as regular texture reads, not as dependent ones. Otherwise you'll get a compilation error saying "The maximum of 4 texture indirections is reached" or something like that.
    I got no error when I performed a swizzle, while I did get the error when I did a type cast (which is bad in a fragment shader anyway).

    But these tests were a long time ago, and UT has changed the shader compiler a few times since then. So I may need to re-check it on an up-to-date version of Unity.
     
  16. brianasu

    brianasu

    Joined:
    Mar 9, 2010
    Posts:
    367
    Is there a difference between dependent texture reads vs texture indirections? Maybe they are counted differently.
     
  17. Lex-DRL

    Lex-DRL

    Joined:
    Oct 10, 2011
    Posts:
    130
    So far I haven't managed to find a good clarification of the differences between texture indirection and dependent read.
    Some people say one thing; others use these two terms as equivalents in their books.
    Since even Apple calls swizzling a "dependent read" (which definitely has nothing to do with sampling texture A by values from B), I assume "texture indirection" = "dependent read". Which also makes sense.

    But I may be wrong and would appreciate it if someone corrected me.
     
  18. DominoM

    DominoM

    Joined:
    Nov 24, 2016
    Posts:
    412
    There have been a few different circumstances described as a "dependent read" that I've come across. My understanding is that it's anything that breaks texture prefetching (going into vertex & pixel shaders) on the graphics card. So if you modify the UVs in the pixel shader, it's a dependent read. If you modify them in the vertex shader it's not, unless you use them there rather than just passing them to the pixel shader; in that case, you break texture prefetching for the vertex shader. Basically, it's doing anything that means the prefetched texture values aren't correct when you use them, causing another read.
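
    To illustrate the two cases being discussed, a minimal fragment-shader sketch (Cg/HLSL; _Offset is a hypothetical material property, and v2f/_MainTex are assumed to be declared as usual):

```hlsl
// Direct read: the UV comes straight from the vertex-shader interpolator,
// so the hardware can prefetch the texel.
fixed4 fragDirect (v2f i) : SV_Target
{
    return tex2D(_MainTex, i.uv);              // not a dependent read
}

// Dependent read: the UV is modified in the fragment shader before sampling,
// so the texel address is only known at fragment time.
fixed4 fragDependent (v2f i) : SV_Target
{
    return tex2D(_MainTex, i.uv + _Offset.xy); // dependent read
}
```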
     
  19. 1ht1baron

    1ht1baron

    Joined:
    Aug 15, 2017
    Posts:
    65
    What should I select as targetTexture? I made changes in BlurOptimized.cs. Is everything right in my script? Also, Hidden/FastBlur is chosen as the Blur Shader. Is this right?

    Code (CSharp):
    public class BlurOptimized : PostEffectsBase
    {
        RenderTexture myRenderTexture;
        public RenderTexture targetTexture;
        Camera camera;
        private int width, height, postProcessMaterialPassNum;
        private Material postProcessMaterial;

        [Range(0, 2)]
        public int downsample = 1;

        public enum BlurType {
            StandardGauss = 0,
            SgxGauss = 1,
        }

        [Range(0.0f, 10.0f)]
        public float blurSize;

        [Range(1, 4)]
        public int blurIterations = 2;

        public BlurType blurType = BlurType.StandardGauss;
        public Shader blurShader = null;
        private Material blurMaterial = null;

        public override bool CheckResources () {
            CheckSupport (false);
            blurMaterial = CheckShaderAndCreateMaterial (blurShader, blurMaterial);
            if (!isSupported)
                ReportAutoDisable ();
            return isSupported;
        }

        public void OnDisable () {
            if (blurMaterial)
                DestroyImmediate (blurMaterial);
        }

        void OnPreRender()
        {
            myRenderTexture = RenderTexture.GetTemporary(width, height, 16);
            camera.targetTexture = myRenderTexture;
        }

        void OnPostRender()
        {
            camera.targetTexture = null; // null means framebuffer
            Graphics.Blit(myRenderTexture, null as RenderTexture, postProcessMaterial, postProcessMaterialPassNum);
            RenderTexture.ReleaseTemporary(myRenderTexture);
        }
    }
     
  20. buFFalo94

    buFFalo94

    Joined:
    Sep 14, 2015
    Posts:
    145
    @colin299 What filter are you using for your bloom shader: box filter, Gaussian, or Kawase?
     
  21. buFFalo94

    buFFalo94

    Joined:
    Sep 14, 2015
    Posts:
    145
    I've managed to get the optimized bloom to work with OnPreRender and OnPostRender, but I don't like the result given by the optimized bloom shader. I want something more ambitious, more elegant.
     
  22. yasirkula

    yasirkula

    Joined:
    Aug 1, 2011
    Posts:
    394
    I can't blit to screen in OnPostRender in Unity 2017.3.0p2. There are no error messages but the post-processing effect just won't work. It works fine while using OnRenderImage. Can anyone verify that @colin299's solution still works in latest Unity versions?
     
  23. buFFalo94

    buFFalo94

    Joined:
    Sep 14, 2015
    Posts:
    145
    It works for me on 2017.3.1p4.
     
  24. yasirkula

    yasirkula

    Joined:
    Aug 1, 2011
    Posts:
    394
    Can I see a short snippet from your code, which includes the relevant parts of OnPreRender and OnPostRender?
     
  25. buFFalo94

    buFFalo94

    Joined:
    Sep 14, 2015
    Posts:
    145
    I did not change the original snippet shared by @colin299. All I did was change variable names. If you need help, tell us more about the issue you are encountering.
     
  26. yasirkula

    yasirkula

    Joined:
    Aug 1, 2011
    Posts:
    394
    For example, I'm trying to optimize this Black & White effect (source):

    Code (CSharp):
    public float intensity;
    private Material material;

    void Awake () { material = new Material( Shader.Find("Hidden/BWDiffuse") ); }

    void OnRenderImage (RenderTexture source, RenderTexture destination)
    {
        if (intensity == 0)
        {
            Graphics.Blit (source, destination);
            return;
        }

        material.SetFloat("_bwBlend", intensity);
        Graphics.Blit (source, destination, material);
    }
    When I change the code like this, it just stops working:

    Code (CSharp):
    public float intensity;
    private Material material;
    RenderTexture myRenderTexture;
    Camera cam;

    void OnEnable ()
    {
        material = new Material( Shader.Find("Hidden/BWDiffuse") );
        cam = GetComponent<Camera>();
    }

    void OnPreRender ()
    {
        myRenderTexture = RenderTexture.GetTemporary( Screen.width, Screen.height, 16 );
        cam.targetTexture = myRenderTexture;
    }

    void OnPostRender ()
    {
        cam.targetTexture = null; // null means framebuffer

        if( intensity == 0 )
            Graphics.Blit( myRenderTexture, null as RenderTexture );
        else
        {
            material.SetFloat( "_bwBlend", intensity );
            Graphics.Blit( myRenderTexture, null as RenderTexture, material );
        }

        RenderTexture.ReleaseTemporary( myRenderTexture );
    }
    Am I doing something wrong, maybe? There are no other post-processing effects on my camera or in my scene. I've tried changing the value of 16 in GetTemporary but it didn't change anything. And I can verify that intensity is not 0. There are no error/warning/log messages.
     
  27. yasirkula

    yasirkula

    Joined:
    Aug 1, 2011
    Posts:
    394
    I've tried the same modified B&W script on 5.6.2f1 and 2017.2.0f3; it didn't work on those versions either.
     
  28. buFFalo94

    buFFalo94

    Joined:
    Sep 14, 2015
    Posts:
    145
    Everything seems to be correct, so try using the forward rendering path and unchecking HDR and MSAA.
     
  29. yasirkula

    yasirkula

    Joined:
    Aug 1, 2011
    Posts:
    394
    I've finally resolved the issue. After adding the image effect to a new camera object in a new scene and disabling HDR and MSAA, it worked!

    So, why did it not work for an existing camera in my scene, even though both HDR and MSAA are disabled? The answer is, if forceIntoRenderTexture is enabled on the camera, it will prevent OnPreRender/OnPostRender from working. I didn't enable this setting myself, so something probably triggered it automatically. To disable this property, I've added this to the OnEnable function:
    cam.forceIntoRenderTexture = false;
     
    Last edited: Jun 9, 2018
  30. mrbroshkin

    mrbroshkin

    Joined:
    Aug 14, 2012
    Posts:
    35
  31. Kumo-Kairo

    Kumo-Kairo

    Joined:
    Sep 2, 2013
    Posts:
    267
    No, command buffer is a purely C# thing, it doesn't even know which shaders you use for rendering. It's just a convenient and clean way of defining your graphics pipeline.
     
    Last edited: Jun 15, 2018 at 2:00 PM
  32. Kumo-Kairo

    Kumo-Kairo

    Joined:
    Sep 2, 2013
    Posts:
    267
    Btw, MSAA x4 is almost free on tile-based mobile GPUs, as it's hardware accelerated. Some Mali GPUs can even go as far as MSAA x16 in hardware.
    The "popular" non-tile-based GPUs are all of the Tegras up to the K1 (can't say for the X1 for sure). And these GPUs are either powerful enough to run "fakey" MSAA (like the K1) or too rare to care about (Tegra 3, Tegra 4).
     
  33. buFFalo94

    buFFalo94

    Joined:
    Sep 14, 2015
    Posts:
    145
    Yep, but even with MSAA unchecked you can use MSAA. I didn't say he can't; we all know MSAA is almost performance-free on tile-based GPUs.
     