
Texture Painting performance

Discussion in 'General Graphics' started by ciwolsey, Jun 13, 2016.

  1. ciwolsey

    ciwolsey

    Joined:
    Jun 13, 2016
    Posts:
    29
    I'm developing a virtual whiteboard in VR. You can see an example of what I'm trying to do here:


    It works great, but if I draw a line too quickly I end up with a dotted line rather than a solid line. I'm using SetPixels and Apply(), which I'm told are very slow; however, the framerate seems buttery smooth. If Apply() really were the issue, wouldn't I see low framerates? I'm using raycasting to get the coordinates the pen should draw to on the whiteboard.

    An alternative I heard of is using a RenderTexture and shaders. I have a very basic understanding of shaders and Render Textures but I can't seem to work out how to avoid using Apply(). I've gone through so many posts and tutorials that say using Render Textures and a shader is the way but none of them actually explain how to do it. A detailed explanation or even a minimal demo project would be extremely appreciated.
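
    For reference, the current approach is roughly this (a simplified sketch, not my exact code; hit.textureCoord needs a MeshCollider on the board):

    Code (CSharp):
    using UnityEngine;

    public class WhiteboardSetPixels : MonoBehaviour
    {
        public Texture2D boardTexture;   // assigned as the board material's main texture
        public int penSize = 8;
        public Color drawColor = Color.black;

        // Called with a ray from the pen tip towards the board
        public void Draw(Ray penRay)
        {
            RaycastHit hit;
            if (Physics.Raycast(penRay, out hit, 0.05f))
            {
                // textureCoord gives the UV of the hit point (MeshCollider only)
                int x = (int)(hit.textureCoord.x * boardTexture.width) - penSize / 2;
                int y = (int)(hit.textureCoord.y * boardTexture.height) - penSize / 2;

                Color[] block = new Color[penSize * penSize];
                for (int i = 0; i < block.Length; i++) block[i] = drawColor;

                boardTexture.SetPixels(x, y, penSize, penSize, block);
                boardTexture.Apply(); // re-uploads the whole texture to the GPU every call
            }
        }
    }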
     
  2. ciwolsey

    ciwolsey

    Joined:
    Jun 13, 2016
    Posts:
    29
    Nobody knows? Or is my explanation bad?
     
  3. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343
    The key bit of understanding for using render textures is the blit function.

    In short, you create a render texture, set it as the texture on your surface, then "draw" to it with a Graphics.Blit on your render texture using a custom shader, and you're done.

    Obviously that's greatly simplifying it, but that should get you looking in the right direction.

    The reason why people say to avoid using SetPixels and Apply is that Apply has to copy the entire texture from the CPU side to the GPU, whereas with a render texture and blit you can send only the minimal data necessary. In your case that's a position on the texture, the color, and the radius to draw. You would then have to write a shader that can draw a circle with that data.
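
    For example, per brush stamp you'd only be setting a few material parameters and doing one Blit, something like this (a rough sketch; the property and field names here are just placeholders):

    Code (CSharp):
    // penBlitMaterial uses a custom "draw a circle" shader; boardRT is the render
    // texture shown on the whiteboard. Only these few values cross to the GPU.
    void StampAt(Vector2 uv, Color color, float radius)
    {
        penBlitMaterial.SetColor("_penColor", color);
        penBlitMaterial.SetFloat("_penX", uv.x);
        penBlitMaterial.SetFloat("_penY", uv.y);
        penBlitMaterial.SetFloat("_penRadius", radius);
        Graphics.Blit(Texture2D.whiteTexture, boardRT, penBlitMaterial);
    }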

    There are a number of examples on using blit to draw stuff, as well as circle shaders, on this forum and elsewhere on the internet, so I suggest you keep looking. If you're having problems getting something working once you've made a pass at it, come back.

    Honestly if for your use you're not seeing any issues with SetPixels and Apply you're probably fine continuing to use that.
     
    Last edited: Jun 16, 2016
    Martin_H likes this.
  4. ciwolsey

    ciwolsey

    Joined:
    Jun 13, 2016
    Posts:
    29
    Thanks for the reply.

    The reason I want to improve the performance is that I need to use a much higher resolution than I am, but when I step up the resolution with the SetPixels/Apply method the frame rate becomes poor.

    I've already been experimenting with what you suggested, and I have the shader drawing the blitted texture at the pen's location. The problem is the brush texture just moves around wherever I point the pen; the previous marks are cleared out.

    I'll try to get a video to make it clearer, but essentially I need to keep the previous frame and keep stamping the new texture onto it in the current frame. Right now I draw the blitted texture, then it appears to be erased, and I draw another, which results in a sort of laser-pointer behaviour rather than an actual pen that would leave a trail of ink.
     
    Last edited: Jun 17, 2016
  5. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343
    You should be creating a single render texture and reusing it over and over. If you're creating a render texture each frame or using a temporary render texture those will be cleared each frame.

    If you're doing that, it might be a problem with your shader. You'll want your shader to be using an alpha blend (Blend SrcAlpha OneMinusSrcAlpha). Also, depending on your shader, you probably want the source texture used when drawing the line to be a dummy texture; the shader should ignore _MainTex, or maybe it can be the brush shape. It should not be the render texture itself. I suggest using Texture2D.whiteTexture as the source.
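
    For the setup itself, something along these lines works (just a sketch; the size and names are arbitrary):

    Code (CSharp):
    using UnityEngine;

    public class BoardSetup : MonoBehaviour
    {
        public RenderTexture boardRT; // create once and reuse it every frame

        void Start()
        {
            boardRT = new RenderTexture(1024, 1024, 0);
            boardRT.Create();

            // Clear it once to the board's background color
            RenderTexture previous = RenderTexture.active;
            RenderTexture.active = boardRT;
            GL.Clear(false, true, Color.white);
            RenderTexture.active = previous;

            // The board's display material just samples the render texture
            GetComponent<Renderer>().material.mainTexture = boardRT;
        }
    }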

    If you can post some snippets of the code you're using we can help more.
     
  6. ciwolsey

    ciwolsey

    Joined:
    Jun 13, 2016
    Posts:
    29
    I'll try that and if I have no luck I'll paste some code. Thanks... and I am only using a single render texture.
     
  7. ciwolsey

    ciwolsey

    Joined:
    Jun 13, 2016
    Posts:
    29
    Last edited: Jun 17, 2016
  8. ciwolsey

    ciwolsey

    Joined:
    Jun 13, 2016
    Posts:
    29
    If I blit a checker texture to the render texture and switch to a standard shader I can see the checkerboard, but when I switch to my shader I don't see any part of the render texture's contents at all.
     
  9. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343
    Were you using a surface shader when you were using blit before? You cannot use a surface shader with blit and get expected results; you want to use as "dumb" a shader as possible, so start with an unlit shader instead of a surface shader. A surface shader can be used for rendering the resulting board in the game view (and use the render texture as its albedo), but not for the blit.

    In the case of using a render texture as the main texture of a surface shader and not seeing anything get stored from frame to frame, that is entirely expected, because you're never writing to that texture. It is not possible to read from and write to the same texture on GPUs*, and a shader that's rendering something to the screen is rendering to the screen's render buffer, not to the render texture being passed to it.

    You have to use blit, or you can go a greatly more circuitous path by setting a render texture as a camera's target and drawing things with that camera, or you can use SetPixels.
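
    For completeness, the camera route is mostly just this (a sketch; drawCamera would be a dedicated camera that only sees your stroke geometry):

    Code (CSharp):
    using UnityEngine;

    // Sketch of the camera alternative: a dedicated camera renders stroke geometry
    // (e.g. small quads placed at the pen position) straight into the board's render texture.
    public class DrawCameraSetup : MonoBehaviour
    {
        public Camera drawCamera;      // should only see the stroke geometry (via its culling mask)
        public RenderTexture boardRT;  // the same texture the board material displays

        void Start()
        {
            drawCamera.targetTexture = boardRT;
            // Don't clear between frames, so strokes accumulate on the texture
            drawCamera.clearFlags = CameraClearFlags.Nothing;
        }
    }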
     
  10. ciwolsey

    ciwolsey

    Joined:
    Jun 13, 2016
    Posts:
    29
    So I'd need to convert this to a fragment shader right?
     
  11. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343
    If you're using blit, yes.
     
  12. ciwolsey

    ciwolsey

    Joined:
    Jun 13, 2016
    Posts:
    29
    And I have to make sure my rendertexture isn't being displayed anywhere in my scene? So it should not appear on a single material? Except as an input to the shader?
     
  13. ciwolsey

    ciwolsey

    Joined:
    Jun 13, 2016
    Posts:
    29
    Is there any chance you could produce a minimal project for me? I feel like if I miss the tiniest thing it won't work and I could go on trying to describe what I have here but not very effectively.

    I've converted to a fragment shader and I'm getting the exact same results as before so there must be something I'm doing fundamentally wrong. It just has the laser pointer rather than pen behaviour.
     
  14. ciwolsey

    ciwolsey

    Joined:
    Jun 13, 2016
    Posts:
    29
    Here's the fragment shader:
    Code (CSharp):
    Shader "fragshader"
    {
        Properties
        {
            _penColor("Color", Color) = (1,1,1,1)
            _MainTex ("Texture", 2D) = "white" {}
            _penX("Pen X", Range(0,1)) = 0.0
            _penY("Pen Y", Range(0,1)) = 0.0
        }
        SubShader
        {
            Tags { "RenderType"="Opaque" }
            LOD 100
            Blend SrcAlpha OneMinusSrcAlpha
            Pass
            {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                // make fog work
                #pragma multi_compile_fog

                #include "UnityCG.cginc"

                struct appdata
                {
                    float4 vertex : POSITION;
                    float2 uv : TEXCOORD0;
                };

                struct v2f
                {
                    float2 uv : TEXCOORD0;
                    UNITY_FOG_COORDS(1)
                    float4 vertex : SV_POSITION;
                };

                sampler2D _MainTex;
                float4 _MainTex_ST;

                fixed4 _penColor;
                float _penX;
                float _penY;

                v2f vert (appdata v)
                {
                    v2f o;
                    o.vertex = mul(UNITY_MATRIX_MVP, v.vertex);
                    o.uv = TRANSFORM_TEX(v.uv, _MainTex);
                    UNITY_TRANSFER_FOG(o, o.vertex);
                    return o;
                }

                fixed4 frag (v2f i) : SV_Target
                {
                    float x = i.uv.x;
                    float y = i.uv.y;

                    float penSize = 0.01f;
                    float halfPen = penSize / 2;

                    if (x > (_penX - halfPen) && x < (_penX + halfPen) && y > (_penY - halfPen) && y < (_penY + halfPen)) {
                        return _penColor;
                    }
                    else {
                        return tex2D(_MainTex, i.uv);
                    }

                    UNITY_APPLY_FOG(i.fogCoord, col);
                }
                ENDCG
            }
        }
    }
     
  15. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343
    No. Rendering is many, many hundreds, sometimes hundreds of thousands, of steps. Running the fragment portion of a shader for a single pixel is one of those steps, and during that step the same render texture cannot be both read from and written to as the output of the shader.

    The fragment shader part of your blit shader should never return the main texture; it should just return a color with alpha set to zero. The blit is a very direct way to render something to a render buffer; in this case it's more like drawing a transparent texture into the scene. The main difference between this and, say, a particle effect is that the pen point's shape is generated in the shader rather than coming from a texture, and the target isn't a camera's view but a render texture. Also, because it's transparent and the render texture is kept from frame to frame rather than cleared (which is what cameras do by default), it should build up over time.

    Code (CSharp):
    Shader "Pen Blit"
    {
        Properties
        {
            _penColor("Color", Color) = (1,1,1,1)
            _penX("Pen X", Range(0,1)) = 0.0
            _penY("Pen Y", Range(0,1)) = 0.0
        }
        SubShader
        {
            Blend SrcAlpha OneMinusSrcAlpha
            // No culling or depth
            Cull Off ZWrite Off ZTest Always

            Pass
            {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag

                #include "UnityCG.cginc"

                struct appdata
                {
                    float4 vertex : POSITION;
                    float2 uv : TEXCOORD0;
                };

                struct v2f
                {
                    float2 uv : TEXCOORD0;
                    float4 vertex : SV_POSITION;
                };

                v2f vert (appdata v)
                {
                    v2f o;
                    o.vertex = mul(UNITY_MATRIX_MVP, v.vertex);
                    o.uv = v.uv;
                    return o;
                }

                fixed4 _penColor;
                float _penX;
                float _penY;

                fixed4 frag (v2f i) : SV_Target
                {
                    float x = i.uv.x;
                    float y = i.uv.y;

                    float penSize = 0.01f;
                    float halfPen = penSize / 2;

                    if (x > (_penX - halfPen) && x < (_penX + halfPen) && y > (_penY - halfPen) && y < (_penY + halfPen)) {
                        return _penColor;
                    }

                    return fixed4(_penColor.rgb, 0.0);
                }
                ENDCG
            }
        }
    }
    Note this shader doesn't even care if _MainTex exists; it's completely ignored.
     
    henners999 likes this.
  16. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343
    Going back to another issue from your original post, the dotted-line problem. I assume you're sampling the "pen" position in Update. For desktop VR this is usually 90 Hz, so if you move your hand more than the radius of the pen "point" within ~11 ms you'll see the gaps. You can work around that either by sampling in FixedUpdate and setting your fixed update rate to something ridiculously high, or by detecting when the movement of the pen since the previous frame is large enough to show gaps and drawing more points along the line in between, as sketched below. You could even write a shader that takes two points and draws a straight line between them.
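
    Something like the following for the second option (just a sketch; penBlitMaterial and boardRT are placeholders, and penSize should match the pen size used in the shader):

    Code (CSharp):
    using UnityEngine;

    public class PenInterpolation : MonoBehaviour
    {
        public Material penBlitMaterial;  // the material using the pen blit shader
        public RenderTexture boardRT;
        Vector2 lastUV;

        // Call each frame with the pen's current UV position on the board
        public void DrawTo(Vector2 currentUV)
        {
            float penSize = 0.01f; // keep in sync with the shader
            float dist = Vector2.Distance(lastUV, currentUV);
            int steps = Mathf.Max(1, Mathf.CeilToInt(dist / (penSize * 0.5f)));

            // Stamp extra points along the line so fast strokes stay solid
            for (int i = 1; i <= steps; i++)
            {
                Vector2 uv = Vector2.Lerp(lastUV, currentUV, i / (float)steps);
                penBlitMaterial.SetFloat("_penX", uv.x);
                penBlitMaterial.SetFloat("_penY", uv.y);
                Graphics.Blit(Texture2D.whiteTexture, boardRT, penBlitMaterial);
            }

            lastUV = currentUV;
        }
    }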
     
  17. ciwolsey

    ciwolsey

    Joined:
    Jun 13, 2016
    Posts:
    29
    Yeah, eventually I fixed the issue with linear interpolation between frames, but now I want to improve the frame rates. I applied the shader from your post to a plane, and I can see a white square on it, but none of the cameras seem to be able to see it and it still doesn't leave a trail when I change X/Y.

    I'm guessing this is supposed to be wired up to a rendertexture somehow but I can't figure it out.
     
    Last edited: Jun 18, 2016
  18. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343
    That shader shouldn't be used on a plane, it should be used with a blit command.
     
  19. ciwolsey

    ciwolsey

    Joined:
    Jun 13, 2016
    Posts:
    29
    Graphics.Blit takes a texture and a render texture; I'm not sure how I would blit the output of this shader anywhere. The shader isn't using a texture or a render texture right now, so how do I blit the output somewhere?
     
  20. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343
    One form of blit takes a texture and a render texture. Another form takes a texture, a render texture, and a material. That's the one you need to use; make a material using that shader and use that with blit along with your render texture.
     
  21. ciwolsey

    ciwolsey

    Joined:
    Jun 13, 2016
    Posts:
    29
    And what about the texture param? I'm using the render texture for the rendertexture param and the material with the shader you gave me for the material param, but what do I set the texture param to?
     
  22. ciwolsey

    ciwolsey

    Joined:
    Jun 13, 2016
    Posts:
    29
    Like this?

    Code (CSharp):
    using UnityEngine;
    using System.Collections;

    public class blit : MonoBehaviour {
        public RenderTexture rt;  // I'm setting this through UI

        // Use this for initialization
        void Start () {
            Graphics.Blit(Texture2D.whiteTexture, rt, gameObject.GetComponent<Renderer>().material);
        }

        // Update is called once per frame
        void Update () {

        }
    }
     
  23. ciwolsey

    ciwolsey

    Joined:
    Jun 13, 2016
    Posts:
    29
    I'm sure you'll be glad to know I finally did it. I will leave you in peace, thank you very much bgolus, I've been at this almost a week :)
     
  24. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343
    For people following along:

    The material shouldn't be the one used by the renderer component; it should be an entirely different material used only by blit. The only connection between your game-view drawing board mesh material and this stuff is the render texture itself.

    And the Blit function should be called as part of Update. It's effectively the replacement for both SetPixels and Apply, so you'd be setting the parameters on the material and then calling Blit. In Start you could create your material, or you could assign it from the UI like you have the render texture.

    In your original setup I assume you had something like this:
    A script that tracks the motion controller and tests if it's close to the virtual surface.
    If it is, it calculates the relative position and updates a texture using calls to SetPixels and Apply.
    A "drawing board" object with a material using the standard shader has that same texture applied as its albedo map.

    Using Blit, everything above is the same, except instead of "updates a texture using calls to SetPixels and Apply" it's "updates a render texture using Blit", and that render texture is applied to the standard shader as its albedo.
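
    Put together, the per-frame flow looks roughly like this (just a sketch; hit.textureCoord requires a MeshCollider on the board, and the field names are placeholders):

    Code (CSharp):
    using UnityEngine;

    public class PenBlitPainter : MonoBehaviour
    {
        public RenderTexture boardRT;     // also assigned as the board material's albedo
        public Material penBlitMaterial;  // a separate material using the "Pen Blit" shader
        public Transform penTip;

        void Update()
        {
            RaycastHit hit;
            // Test whether the pen tip is close to the board surface
            if (Physics.Raycast(penTip.position, penTip.forward, out hit, 0.05f))
            {
                // Replacement for SetPixels + Apply: set the parameters, then blit
                penBlitMaterial.SetColor("_penColor", Color.black);
                penBlitMaterial.SetFloat("_penX", hit.textureCoord.x);
                penBlitMaterial.SetFloat("_penY", hit.textureCoord.y);
                Graphics.Blit(Texture2D.whiteTexture, boardRT, penBlitMaterial);
            }
        }
    }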
     
    Sluggy, JamesThornton and taguados like this.
  25. ciwolsey

    ciwolsey

    Joined:
    Jun 13, 2016
    Posts:
    29
    Yeah, I was doing it right; what fixed it was using sharedMaterials.
     
  26. LuckyStreak

    LuckyStreak

    Joined:
    Apr 1, 2015
    Posts:
    9
    Hello, I was following along with the discussion. It wasn't clear whether, after going through all this, you were actually able to make the drawing work faster. If you were, which part of your code made it so that writing quickly didn't leave a dotted line? That's currently my issue.

    Do you have a sample project, code, or tutorial for this, if you did get it working as you originally were asking?
     
  27. clarencedadson

    clarencedadson

    Joined:
    Mar 23, 2013
    Posts:
    7
    Hi, I know this topic is kind of old, but I am working on a similar project at the moment and can't get it to work.
    I am trying to make a collaborative whiteboard in VR. The whiteboard worked when I did it with just a Texture2D for single-player. When I tried to synchronize it with Photon it didn't work, so I thought synchronizing the Texture2D was the big problem.

    So I found this thread with another implementation of a whiteboard, and it sounds like it would be easier to synchronize the render texture then. Just let both parties draw on the same render texture, or something like that ^^

    But first of all I need to get this implementation working!

    This is my Whiteboard class currently:
    mat is my material with the shader you posted and rt is the render texture. Both are set in the UI.
    Code (CSharp):
    using UnityEngine;
    using UnityEngine.Networking; // for NetworkBehaviour
    using System.Linq;            // for Enumerable.Repeat

    public class Whiteboard : NetworkBehaviour
    {
        public RenderTexture rt;
        public Material mat;
        private int textureSize = 2048;
        private int penSize = 10;
        //private Texture2D texture;
        private Color[] color;
        //Renderer rend;

        private PhotonView photonViewComp;

        private bool touching, touchingLast;
        private float posX, posY;
        private float lastX, lastY;

        // Use this for initialization
        void Start()
        {
            //rend = GetComponent<Renderer>();
            //texture = new Texture2D(textureSize, textureSize);
            //rend.material.mainTexture = texture;
            photonViewComp = GetComponent<PhotonView>();

            GetComponent<Renderer>().material.SetTexture("_MainTex", rt);
        }

        // Update is called once per frame
        void Update()
        {
            //Paint();

            if (PhotonNetwork.connected)
            {
                photonViewComp.RPC("Paint", PhotonTargets.All);
            }
        }

        public void ToggleTouch(bool touching)
        {
            this.touching = touching;
        }

        public void SetTouchPosition(float x, float y) // SET BY THE PEN CASTING A RAYCAST TO THE WHITEBOARD
        {
            this.posX = x;
            this.posY = y;
        }

        public void SetColor(Color color)
        {
            this.color = Enumerable.Repeat<Color>(color, penSize * penSize).ToArray<Color>();
        }

        [PunRPC]
        void Paint()
        {
            float x = (float)(posX * textureSize - (penSize / 2));
            float y = (float)(posY * textureSize - (penSize / 2));

            if (touchingLast)
            {
                //texture.SetPixels(x, y, penSize, penSize, color);
                foreach (Color c in color)
                {
                    mat.SetColor("_penColor", c);
                }
                mat.SetFloat("_penX", x); // HERE I SET THE FLOATS OF THE SHADER... But they are not between 0 and 1 currently, could this be a problem? I don't understand shaders just yet, sorry.
                mat.SetFloat("_penY", y);

                for (float t = 0.01f; t < 1.00f; t += 0.01f)
                {
                    int lerpX = (int)Mathf.Lerp(lastX, (int)x, t);
                    int lerpY = (int)Mathf.Lerp(lastY, (int)y, t);

                    foreach (Color c in color)
                    {
                        mat.SetColor("_penColor", c);
                    }
                    mat.SetFloat("_penX", lerpX);
                    mat.SetFloat("_penY", lerpY);
                    //texture.SetPixels(lerpX, lerpY, penSize, penSize, color);
                }

                //texture.Apply();
                Graphics.Blit(Texture2D.whiteTexture, rt, mat);
            }

            this.lastX = (float)x;
            this.lastY = (float)y;
            this.touchingLast = this.touching;
        }
    }
    Would be great if you can help me out! Thank you!
     
  28. Marazyt

    Marazyt

    Joined:
    Jan 23, 2014
    Posts:
    3
    So close to understanding! What is sharedMaterials? How did this fix it?
     
  29. Marazyt

    Marazyt

    Joined:
    Jan 23, 2014
    Posts:
    3
    clarencedadson, did you figure this out?
     
  30. diego_ger

    diego_ger

    Joined:
    Dec 17, 2017
    Posts:
    8
    So close, yet so far. Has anyone had success with this?

    As far as I can tell from that shader, it changes only the color in the spot of the pencil and replaces the rest of the texture with another color. If you move the pencil (X and Y) the spot changes its position, but it doesn't keep "painting" the texture.
    For it to work, I think it needs a way to use the last frame's render texture as the new source render texture of the Blit in the next frame. Does anyone know how to do that "flip" of the source render texture each frame in Update()?
     
    Last edited: Sep 13, 2018
  31. nareshbishtasus

    nareshbishtasus

    Joined:
    Jun 11, 2018
    Posts:
    36
    If your problem is that you are not getting a continuous line but a dotted line, I think you should use some kind of interpolation function which checks the last and current mouse positions and fills the gap in between.
     
  32. Archi_16

    Archi_16

    Joined:
    Apr 7, 2017
    Posts:
    87
    How did you do this? I want to create something similar. Is it only a shader?
     
  33. helios

    helios

    Joined:
    Oct 5, 2009
    Posts:
    308
    I know this is an extremely old thread, but it is the most relevant to what I'm trying to achieve. I was essentially already attempting a similar "drawing" tool, for which I was originally using SetPixels to copy a brush texture to the target texture. This is actually pretty slow when moving quickly, so I attempted to Blit before reading this thread. My setup is more or less the same; however, I'm using a UI Canvas RawImage with its texture pointing to my RenderTexture.

    I have a brush (which has some texture) that I'm trying to Blit() onto the RenderTexture. What I'm not really understanding is how to properly position the source texture onto the RenderTexture. My pointer coordinates are the same as the RenderTexture's, but it's greatly lagging behind where it should be. Additionally, I'm having the same problem as the original poster where the RenderTexture is just overriding the previous brush stroke. I'm passing a new material that uses the URP/Unlit shader to the Blit() function, but this also raises another question: how do I tell it where to position the source texture on the destination, since there are no function overloads that allow me to pass a position along with the material?

    Any help is greatly appreciated.
     
    henners999 likes this.