Hello! Does any experienced user know the best way to achieve simple image effects for a mobile game? Currently I'm using a white square sprite with a custom shader like this:

Code (CSharp):
    // Upgrade NOTE: replaced 'mul(UNITY_MATRIX_MVP,*)' with 'UnityObjectToClipPos(*)'
    Shader "Custom/UI/Invert"
    {
        Properties
        {
            [PerRendererData] _MainTex ("Sprite Texture (A)", 2D) = "white" {}
            _Color ("Tint", Color) = (1,1,1,1)

            _StencilComp ("Stencil Comparison", Float) = 8
            _Stencil ("Stencil ID", Float) = 0
            _StencilOp ("Stencil Operation", Float) = 0
            _StencilWriteMask ("Stencil Write Mask", Float) = 255
            _StencilReadMask ("Stencil Read Mask", Float) = 255

            _ColorMask ("Color Mask", Float) = 15
        }

        SubShader
        {
            Tags
            {
                "Queue" = "Transparent"
                "IgnoreProjector" = "True"
                "RenderType" = "Transparent"
                "PreviewType" = "Plane"
                "CanUseSpriteAtlas" = "True"
            }

            Stencil
            {
                Ref [_Stencil]
                Comp [_StencilComp]
                Pass [_StencilOp]
                ReadMask [_StencilReadMask]
                WriteMask [_StencilWriteMask]
            }

            Cull Off
            Lighting Off
            ZWrite Off
            ZTest [unity_GUIZTestMode]
            Fog { Mode Off }
            ColorMask [_ColorMask]

            // invert color blend mode
            Blend OneMinusDstColor OneMinusSrcAlpha
            BlendOp Add

            Pass
            {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                struct appdata_t
                {
                    float4 vertex : POSITION;
                    fixed4 color : COLOR;
                    half2 texcoord : TEXCOORD0;
                };

                struct v2f
                {
                    float4 vertex : SV_POSITION;
                    fixed4 color : COLOR;
                    half2 texcoord : TEXCOORD0;
                };

                fixed4 _Color;
                sampler2D _MainTex;

                v2f vert (appdata_t IN)
                {
                    v2f OUT;
                    OUT.vertex = UnityObjectToClipPos(IN.vertex);
                    OUT.texcoord = IN.texcoord;
                    OUT.color = IN.color * _Color;
                    return OUT;
                }

                fixed4 frag (v2f IN) : SV_Target
                {
                    fixed4 color = tex2D(_MainTex, IN.texcoord) * IN.color;
                    return color;
                }
                ENDCG
            }
        }
    }

This one inverts color (I haven't succeeded in making a greyscale version, by the way, if someone can help with that). Is this method the best for performance on mobile?
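Not from the original post, but it may help to see why a plain white sprite with that blend mode inverts the framebuffer. Here is a quick Python sketch of the fixed-function blend arithmetic (the function name is mine):

```python
def blend(src, dst):
    # fixed-function blending as configured by
    # 'Blend OneMinusDstColor OneMinusSrcAlpha' with 'BlendOp Add':
    #   out = src * (1 - dst) + dst * (1 - src.alpha), per channel
    src_a = src[3]
    return tuple(s * (1.0 - d) + d * (1.0 - src_a)
                 for s, d in zip(src[:3], dst[:3]))

# an opaque white sprite (the tinted white square) reduces this to
# out = 1 - dst, i.e. a full colour invert of whatever is behind it
dst = (0.2, 0.5, 0.9)
out = blend((1.0, 1.0, 1.0, 1.0), dst)
print(tuple(round(c, 3) for c in out))  # (0.8, 0.5, 0.1)
```

Tinting the sprite or lowering its alpha scales the effect, since the second term keeps a fraction of the destination colour.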
I'm interested in this too. My need is actually to convert the image to grayscale on Quest, and obviously it needs to do so without a huge drag on performance.
For me it's only 2D, but I still can't figure out the most performant approach for mobile. I'm a bit surprised this isn't discussed more. Do people just not use any image effects when they build for mobile?
I found a hint of a suggestion here: https://forums.oculusvr.com/develop...ocessing-or-fullscreen-image-effects-on-quest Still looking for a code example of this approach.
OK, the following mostly works (applied to a quad placed in front of the camera). But it doesn't affect the skybox; still need to figure that out.

Code (CSharp):
    Shader "Color FX/Green"
    {
        Properties
        {
            // _Color ("Color", Color) = (1,1,1,1)
        }
        SubShader
        {
            Tags { "RenderType" = "Overlay" "PreviewType" = "Plane" }

            Pass
            {
                ZWrite Off

                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                // only compile shader for platforms that can potentially
                // do it (currently gles, gles3, metal)
                #pragma only_renderers framebufferfetch

                struct appdata
                {
                    float4 vertex : POSITION;
                    float2 uv : TEXCOORD0;
                };

                struct v2f
                {
                    float4 vertex : SV_POSITION;
                };

                v2f vert (appdata v)
                {
                    v2f o;
                    o.vertex = UnityObjectToClipPos(v.vertex);
                    return o;
                }

                void frag (v2f v, inout half4 ocol : SV_Target)
                {
                    // ocol can be read (current framebuffer color)
                    // and written into (will change color to that one)
                    ocol.g = (ocol.r + ocol.g + ocol.b) * 0.33f;
                    ocol.r = ocol.b = 0;
                }
                ENDCG
            }
        }
        Fallback Off
    }
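Not part of the original post, but to show what that framebuffer-fetch frag computes per pixel, here is a Python mirror of the math (function names are mine), plus the small change that would give the greyscale this thread is after:

```python
def green_only(pixel):
    # mirror of the frag above: average RGB into green, zero red and blue
    r, g, b, a = pixel
    avg = (r + g + b) * 0.33
    return (0.0, avg, 0.0, a)

def greyscale(pixel):
    # the change for true greyscale: write the average to all three
    # channels (in the frag, something like:
    #   ocol.rgb = (ocol.r + ocol.g + ocol.b) * 0.33; )
    r, g, b, a = pixel
    avg = (r + g + b) * 0.33
    return (avg, avg, avg, a)

print(green_only((0.6, 0.3, 0.9, 1.0)))
print(greyscale((0.6, 0.3, 0.9, 1.0)))
```

The 0.33 factor is a plain average; perceptual luminance weights (as in the Luminance() snippet later in this thread) give a more natural-looking greyscale.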
Hi! Interesting. What kind of shader do you create to make it work? Do you then create a material that you put on the quad? Still, it would be very helpful to have an answer from Unity about this topic. Is it better to completely avoid image effects on mobile? Even if the answer is yes, it's better to have a clear answer about this.
Unity has explained this many times in its documentation, on the blog, and so on. You are being a bit lazy. I don't mind that, as it's a smart way to get good information; however, this topic has been beaten to death. The forums are stuffed with mobile post-FX advice, as is the Asset Store.

For Quest you want to just modify the shaders; this will be fastest. You don't need a whole full-screen blit if you're changing the colour of all the meshes being rendered.

Code (CSharp):
    float4 colour = tex2D(_colour, i.uv);
    colour.rgb = lerp(Luminance(colour.rgb), colour.rgb, _colourSaturation);

_colourSaturation would be a number you pass into the shader between 0 and 1.

Code (CSharp):
    inline float Luminance(float3 c)
    {
        return dot(c, float3(0.22, 0.707, 0.071));
    }

The Luminance function is extremely fast. This method is essentially close to zero cost on most hardware, while letting you choose how much saturation you have (you can overboost by pushing _colourSaturation past 1, or go greyscale at 0).

So basically: modify the original shader if you care about performance. Otherwise you pay a fairly heavy price, since you would have to ask URP (or built-in) to provide a colour buffer, which involves blitting the backbuffer to a texture, then blitting that texture with a shader. This is how all post FX is usually done. Some effects can't be done in-shader, and if you are using Shader Graph then lighting will still colour the image, as there is no access to the final pixel colour from graphs at the time of writing.
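To illustrate the snippet above numerically (not from the post; function names are mine), here is the same lerp-to-luminance in Python:

```python
def luminance(rgb):
    # perceptual weights from the post's Luminance(): dot(c, (0.22, 0.707, 0.071))
    r, g, b = rgb
    return 0.22 * r + 0.707 * g + 0.071 * b

def saturate_rgb(rgb, saturation):
    # lerp(Luminance(c), c, saturation):
    #   0 -> greyscale, 1 -> unchanged, >1 -> overboosted saturation
    lum = luminance(rgb)
    return tuple(lum + (c - lum) * saturation for c in rgb)

print(saturate_rgb((1.0, 0.0, 0.0), 0.0))  # (0.22, 0.22, 0.22): red collapses to its luminance
print(saturate_rgb((1.0, 0.0, 0.0), 1.5))  # overboost pushes channels further apart
```

Note that green dominates the weights, so a pure green pixel desaturates to a much brighter grey than a pure blue one, which matches how the eye perceives brightness.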
Well, modifying every shader that might be on screen is great if you can manage it. In my case that feels impractical — we have terrains rendered with MicroSplat, a dynamic sky with its own shaders, UI (including TextMeshPro) of course doing its own thing, and then the half-dozen different shaders we use for various purposes. The framebuffer fetch approach seems to work great though (and I was able to get it to affect the skybox, too, after correcting the rendering queue), and it doesn't require touching everything else that renders to the screen. And for desktop (or other platforms that don't support framebuffer fetch), I just made a second subshader that uses GrabPass. Life is good.
Seems to do fine in its minimal configuration. The docs include some helpful notes on performance optimization. We're not using any add-ons except procedural texturing, and even that we will probably bake (thus becoming non-procedural) when we get a little further along.