Custom Mesh Rendering under UI Canvas

Discussion in 'UGUI & TextMesh Pro' started by etoreo, Aug 31, 2014.

  1. etoreo

    etoreo

    Joined:
    Apr 11, 2012
    Posts:
    104
    I am trying to render a custom mesh in the new GUI system. I have a UI Canvas, and a few levels down in the hierarchy under it is a game object with a custom mesh. I can't seem to get it to render...

    I have made sure that everything in my hierarchy is on the "UI" layer, and I tried adding Canvas Renderers at each level, including on the custom mesh.

    Any advice?
     
  2. Tim-C

    Tim-C

    Unity Technologies

    Joined:
    Feb 6, 2010
    Posts:
    2,160
    Hi, you can add a CanvasRenderer and manually set the vertices. We will add support for making any 'renderer' work with the UI in a future release.
     
    shrinath-kopare likes this.
  3. etoreo

    etoreo

    Joined:
    Apr 11, 2012
    Posts:
    104
    Thanks Tim,
    I have been doing that and it's been working great.
     
  4. kennyd-itp

    kennyd-itp

    Joined:
    Jan 13, 2014
    Posts:
    30
    Hey etoreo, would it be possible to post a small code example of how you're doing this ("manually set the vertices")? I'm not quite sure what Tim means. Thanks!
     
  5. etoreo

    etoreo

    Joined:
    Apr 11, 2012
    Posts:
    104
    Here is some code I am using in my project (pGo is the Game Object somewhere under a canvas):
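    A minimal sketch of the idea, reconstructed from the later posts in this thread (this is not the original project code; pGo, DrawArea, and pMaterial are illustrative names):

    ```csharp
    // Sketch only: build one quad and hand it to the CanvasRenderer on pGo.
    // DrawArea is assumed to be a Rect in the canvas' local space, and
    // pMaterial any UI-compatible material. Runs inside a MonoBehaviour.
    CanvasRenderer pCanvasRenderer = pGo.GetComponent<CanvasRenderer>();
    List<UIVertex> pVertexList = new List<UIVertex>();

    Rect pDrawArea = DrawArea;
    UIVertex pVertex = new UIVertex();

    // CanvasRenderer.SetVertices expects quads, so add the four corners in order.
    pVertex.position = new Vector3(pDrawArea.xMin, pDrawArea.yMin, 0);
    pVertex.uv0 = new Vector2(0, 0);
    pVertexList.Add(pVertex);

    pVertex.position = new Vector3(pDrawArea.xMin, pDrawArea.yMax, 0);
    pVertex.uv0 = new Vector2(0, 1);
    pVertexList.Add(pVertex);

    pVertex.position = new Vector3(pDrawArea.xMax, pDrawArea.yMax, 0);
    pVertex.uv0 = new Vector2(1, 1);
    pVertexList.Add(pVertex);

    pVertex.position = new Vector3(pDrawArea.xMax, pDrawArea.yMin, 0);
    pVertex.uv0 = new Vector2(1, 0);
    pVertexList.Add(pVertex);

    pCanvasRenderer.SetMaterial(pMaterial, null);
    pCanvasRenderer.SetVertices(pVertexList);
    ```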
     
  6. kennyd-itp

    kennyd-itp

    Joined:
    Jan 13, 2014
    Posts:
    30
    Thanks!
     
  7. Wahooney

    Wahooney

    Joined:
    Mar 8, 2010
    Posts:
    275
    @Tim C Is there a way to pass triangles through to the canvas renderer?

    Because CanvasRenderer seems to want quads, and any imported mesh data that comes through is going to be in triangle form.

    I know we can pass each triangle as [0] [0] [1] [2], but doesn't that waste a vertex per triangle?

    Thanks!
     
  8. etoreo

    etoreo

    Joined:
    Apr 11, 2012
    Posts:
    104
    @Wahooney It is my understanding that the underlying code optimizes the mesh and will eliminate that extra vertex.
     
  9. Willkuerlich

    Willkuerlich

    Joined:
    Sep 23, 2014
    Posts:
    13
    @etoreo in your script you do pCanvasRenderer.SetVertices(pVertexList); what does your pVertexList look like? And where are you setting the UVs? Thanks!
     
  10. phil-Unity

    phil-Unity

    Unity UI Lead Developer Unity Technologies

    Joined:
    Nov 23, 2012
    Posts:
    1,177
    No, currently we don't support triangles; we always assume quads. In the future we plan to add support for this.
     
  11. BrainAndBrain

    BrainAndBrain

    Joined:
    Nov 27, 2014
    Posts:
    102
    Bump @Willkuerlich's question, I'm wondering the same thing! Not sure how to get a UIVertex array from a mesh.
     
  12. mtalbott

    mtalbott

    Joined:
    Dec 21, 2011
    Posts:
    123
    @phil-Unity & @Tim C Thanks for sharing some insight into Unity's plans. I too would love to have the CanvasRenderer accept triangles, or to be able to just use a MeshRenderer instead.

    Maybe you could answer a question I have: why does "CanvasRenderer" appear to inherit from "Component" and not "Renderer"? I was working on a custom UI Graphic type and wanted to set two textures for two [PreRenderData] properties in my custom UI shader. I was surprised to find that I could not use SetPropertyBlock, and that SetMaterial(Material, Texture) will only change the _MainTex.

    @BrainAndBrain & @Willkuerlich:
    Code (CSharp):
    UIVertex vert;
    Mesh mesh; // assigned elsewhere
    List<UIVertex> pVertexList = new List<UIVertex>();
    int[] indices = mesh.GetIndices(0); // submesh 0
    for (int i=0; i<indices.Length; i+=3) {
        for (int p=0; p<3; p++) {
            vert = new UIVertex();
            vert.position = mesh.vertices[indices[i+p]];
            vert.normal = mesh.normals[indices[i+p]];
            vert.uv0 = mesh.uv[indices[i+p]];
            //do the same for tangent, uv1, and color if you need to.
            pVertexList.Add(vert);
        }
        //This just adds the last vertex of each triangle twice to fit the quad format.
        pVertexList.Add(vert);
    }
     
  13. Liszt

    Liszt

    Joined:
    Jan 12, 2015
    Posts:
    29
    @phil-Unity & @Tim C I work on a custom system too, and it's quite painful that CanvasRenderer doesn't inherit from Renderer.
    My system was using the enabled/disabled flag from Renderer; with CanvasRenderer that's impossible?

    Some people say to SetAlpha to 0 with a CanvasGroup, but that's not really a good solution because the meshes are still rendered even if the user can't see them.
    Other people say to SetActive(false), but that's not a viable solution for me either, because if the object has a script that needs to keep running, its Update will no longer be called (unless I'm wrong about that part?).

    Why doesn't this "CanvasRenderer", which you call a Renderer, follow the Renderer functionality?
     
  14. phil-Unity

    phil-Unity

    Unity UI Lead Developer Unity Technologies

    Joined:
    Nov 23, 2012
    Posts:
    1,177
    It's not derived from Renderer because the CanvasRenderers are all batched together, and then we create a MeshRenderer from that. It itself is not renderable.
     
  15. mtalbott

    mtalbott

    Joined:
    Dec 21, 2011
    Posts:
    123
    @phil-Unity Ah ha! That would explain it. Thanks for the info.

    Although maybe it shouldn't be called a "Renderer" then. Maybe "CanvasBatcher" or something. Probably too late for that; that ship has sailed. Alternatively, Unity should work to pass through or emulate as much functionality from the mother MeshRenderer to the CanvasRenderer as possible, i.e. setting triangle indices or just accepting a mesh, enabling/disabling individual Graphics, setting property blocks beyond just the _MainTex, etc.

    Also, just a quick side note: UIVertex doesn't seem to be marked as serializable. That might work for you all, but it would be great if I could save and reuse my List<UIVertex>.
     
  16. Liszt

    Liszt

    Joined:
    Jan 12, 2015
    Posts:
    29
    Thanks for your answer @phil-Unity ! That explains a lot of things :)

    Oh, just one more thing:
    Using a CanvasGroup to hide all the elements with alpha would change nothing, because all the UI is rendered as one mesh?

    But using SetActive forces your batching to regenerate that mesh? Or do you just hide its visuals and leave it in the global MeshRenderer?
     
  17. mtalbott

    mtalbott

    Joined:
    Dec 21, 2011
    Posts:
    123
    @Liszt I just saw in the 4.6.3 release notes:

    "UI: Optimized adding/removing Graphics to a Canvas. It should be much cheaper to enable and disable Graphics now."

    Maybe that will help with your issues.
     
  18. etoreo

    etoreo

    Joined:
    Apr 11, 2012
    Posts:
    104
    @Willkuerlich
    Sorry it has taken so long to respond. My vertex list looks like this... but I should note that something about my code is not working in 4.6.3. Trying to work that out now.

    Rect pDrawArea = DrawArea;
    Vector3 a = new Vector3(pDrawArea.xMin, pDrawArea.yMax, 0);
    Vector3 b = new Vector3(pDrawArea.xMax, pDrawArea.yMax, 0);
    Vector3 c = new Vector3(pDrawArea.xMin, pDrawArea.yMin, 0);
    Vector3 d = new Vector3(pDrawArea.xMax, pDrawArea.yMin, 0);

    UIVertex pUIVertex = new UIVertex();

    pUIVertex.position = c;
    pUIVertex.uv0 = new Vector2(1, 1);
    pList.Add(pUIVertex);

    pUIVertex.position = a;
    pUIVertex.uv0 = new Vector2(0, 0);
    pList.Add(pUIVertex);

    pUIVertex.position = b;
    pUIVertex.uv0 = new Vector2(0, 1);
    pList.Add(pUIVertex);


    pUIVertex.position = d;
    pUIVertex.uv0 = new Vector2(1, 0);
    pList.Add(pUIVertex);
     
  19. Billy4184

    Billy4184

    Joined:
    Jul 7, 2014
    Posts:
    5,397
    @etoreo thanks for posting your code. I can render the mesh but it is all black no matter what material I use. Any ideas what might be causing this? Thanks!

    EDIT: never mind, forgot to set the normals
     
    Last edited: Jul 12, 2015
  20. CatPhone

    CatPhone

    Joined:
    Jun 11, 2013
    Posts:
    1
    Glad I could help ;)
     
  21. RazorCut

    RazorCut

    Joined:
    May 7, 2009
    Posts:
    394
    I'm also working on setting up custom vertices for a CanvasRenderer. I've got scripts to define the vertices manually and also one that I can attach a mesh to use. However, I'm not certain that either is actually working.

    1. I don't see any visual change in my UI with or without using either of my scripts (even if I intentionally try to do wonky stuff). It's almost as if the canvasRenderer.SetVertices() call is doing nothing.

    The manual script:

    Code (CSharp):
    public void SetRegionVertices()
    {
        RectTransform rectTransform = GetComponent<RectTransform> ();
        CanvasRenderer canvasRenderer = GetComponent<CanvasRenderer> ();
        Rect drawArea = rectTransform.rect;
        vertices.Clear();

        UIVertex vertex;

        Debug.Log(string.Format("xMin={0}, xMax={1}, yMin={2}, yMax={3}, w={4}, h={5}", drawArea.xMin, drawArea.xMax, drawArea.yMin, drawArea.yMax, drawArea.width, drawArea.height));

        // default 4-point quad (bottom left -> top left -> top right -> bottom right)

        for (int x=0; x<numQuadsX; x++)
        {
            for (int y=0; y<numQuadsY; y++)
            {
                float xMin = drawArea.xMin + x*(drawArea.width/numQuadsX);
                float xMax = drawArea.xMin + (x+1)*(drawArea.width/numQuadsX);
                float yMin = drawArea.yMin + y*(drawArea.height/numQuadsY);
                float yMax = drawArea.yMin + (y+1)*(drawArea.height/numQuadsY);
                //Debug.Log(string.Format("quad ({0}, {1}) xMin={2}, xMax={3}, yMin={4}, yMax={5}", x, y, xMin, xMax, yMin, yMax));

                vertex = new UIVertex();
                vertex.position = new Vector3(xMin, yMin);
                vertices.Add(vertex);

                vertex = new UIVertex();
                vertex.position = new Vector3(xMin, yMax);
                vertices.Add(vertex);

                vertex = new UIVertex();
                vertex.position = new Vector3(xMax, yMax);
                vertices.Add(vertex);

                vertex = new UIVertex();
                vertex.position = new Vector3(xMax, yMin);
                vertices.Add(vertex);
            }
        }

        canvasRenderer.Clear();
        canvasRenderer.SetVertices(vertices);
    }
    Does this look right? numQuadsX and numQuadsY let me specify the segmentation. The default values are 1; the debug output to the console doesn't seem out of whack.

    The mesh script:

    Code (CSharp):
    public void SetRegionVertices()
    {
        RectTransform rectTransform = GetComponent<RectTransform> ();
        CanvasRenderer canvasRenderer = GetComponent<CanvasRenderer> ();
        Rect drawArea = rectTransform.rect;
        vertices.Clear();

        UIVertex vertex;

        Debug.Log(string.Format("xMin={0}, xMax={1}, yMin={2}, yMax={3}, w={4}, h={5}", drawArea.xMin, drawArea.xMax, drawArea.yMin, drawArea.yMax, drawArea.width, drawArea.height));

        int[] indices = mesh.GetIndices(0);
        Debug.Log(string.Format("# indices: {0}", indices.Length));
        for (int i=0; i<indices.Length; i+=3)
        {
            vertex = new UIVertex();
            for (int p=0; p<3; p++)
            {
                vertex = new UIVertex();
                vertex.position = mesh.vertices[indices[i+p]];
                vertex.normal = mesh.normals[indices[i+p]];
                vertex.uv0 = mesh.uv[indices[i+p]];
                //do the same for tangent, uv1, and color if necessary.
                vertices.Add(vertex);
            }
            //This just adds the last vertex twice to fit the quad format.
            vertices.Add(vertex);
        }
        Debug.Log(string.Format("# vertices: {0}", vertices.Count));

        canvasRenderer.Clear();
        canvasRenderer.SetVertices(vertices);
    }
    Does this look right?

    2. I am using the "wireframe" shading mode in the scene to see if the wireframe changes for the UI. I see absolutely no change. Should the scene view be accurately reflecting the wireframes even for UI elements?
     
    Last edited: Jul 18, 2015
    kromenak likes this.
  22. Billy4184

    Billy4184

    Joined:
    Jul 7, 2014
    Posts:
    5,397
    Your code seems right. Have you set the material on the CanvasRenderer (CanvasRenderer.SetMaterial)? Also, I noticed many shaders don't seem to work, including the Standard shader, although I didn't try very hard. Try using Toon/Lit Outline, which is what I got mine running with, and let us know if it works.
     
    Last edited: Jul 18, 2015
  23. RazorCut

    RazorCut

    Joined:
    May 7, 2009
    Posts:
    394
    No, I wasn't setting a material. I'm toying around with that now, thanks! But so far absolutely nothing is making a difference, not even materials with any of the standard UI shaders, nor the Toon/Lit Outline shader (though with that one I get warnings about not being able to use a sub-shader).

    I'll post something more definitive shortly, but it's not looking promising so far. :(
     
    Last edited: Jul 18, 2015
  24. Billy4184

    Billy4184

    Joined:
    Jul 7, 2014
    Posts:
    5,397
    Here's the script I use to render my mesh. The gameobject contains a mesh filter with the mesh assigned so that it can be easily accessed by the script, and the material is Toon/Lit Outline.

    By the way, is your Canvas in world space or screen space? If it is in screen space you might have some problems rendering 3d stuff.

    Hope this helps.



    Code (CSharp):
    public class RenderMeshOnCanvas2 : MonoBehaviour {

        public Material material;

        // Use this for initialization
        void Start () {

            Mesh mesh = GetComponent<MeshFilter>().sharedMesh;

            CanvasRenderer canvasRenderer = GetComponent<CanvasRenderer>();

            Vector3[] vertices = mesh.vertices;
            int[] triangles = mesh.triangles;
            Vector3[] normals = mesh.normals;

            Vector2[] UVs = mesh.uv;

            List<UIVertex> uiVertices = new List<UIVertex>();

            for (int i = 0; i < triangles.Length; ++i){
                UIVertex temp = new UIVertex();
                temp.position = vertices[triangles[i]];
                temp.uv0 = UVs[triangles[i]];
                temp.normal = normals[triangles[i]];
                uiVertices.Add (temp);
                if (i%3 == 0)
                    uiVertices.Add (temp);
            }

            canvasRenderer.Clear ();
            canvasRenderer.SetMaterial(material, null);
            canvasRenderer.SetVertices(uiVertices);
        }
    }
     
    icyaway likes this.
  25. RazorCut

    RazorCut

    Joined:
    May 7, 2009
    Posts:
    394
    Thanks for sharing that!

    I put my code in an Update() loop and it's starting to work (though I think I have a UV issue to work out, because it's invisible now). I was calling the function just once in the Start() handler (i.e. like yours); it's odd that calling it in an update loop works, but that's going to be expensive if I have to keep it like that. [Correction: setting the material and vertices to precomputed values every frame won't be expensive, so I can live with that.]

    Oh, and I definitely needed a material.
     
    Last edited: Jul 18, 2015
  26. Billy4184

    Billy4184

    Joined:
    Jul 7, 2014
    Posts:
    5,397
    Glad to hear you got something going, but what do you mean by 'starting to work'?

    In any case, if something works in Update() but not in Start(), it suggests that either 1) something that is needed by the script is not available at the start or 2) more stuff is being added to your mesh each update, meaning you'll probably overflow something and crash eventually.

    It would be helpful if you posted your original Start() code showing how you set up your data and in what context you called the SetRegionVertices method.
     
  27. MrDude

    MrDude

    Joined:
    Sep 21, 2006
    Posts:
    2,569
    Just a quick question, please... I've been working on a 2D project where I have to overlay 3D stuff on top of Unity UI stuff, and Unity UI stuff on top of the 3D stuff as well, so basically the 3D stuff had to be nested between the Unity UI layers.

    I must admit that most of what you guys are discussing above is well above my head, but from what I can see, you are adding a CanvasRenderer component to each of your 3D models, then doing... something... with the meshes, and basically copying the rendered graphic into a material that is then drawn by the Unity UI.

    My approach is quite different. I simply created a bunch of game objects, added a Canvas to each one, assigned a different sorting layer to each canvas, and then set the sorting layer on my 3D objects to match one of the canvases in the middle. Now I compose my scene in world space with my 2D and 3D objects where they need to be and voila, everything works as expected... and by re-ordering the sorting layers I can adjust what goes in front of what.

    I am now wondering: to sort game objects within a specific layer I would adjust the z value instead of the sibling index, but apart from that I would say I get the same results by simply setting the sorting layer on my 3D objects and creating a Canvas for each sorting layer.

    So my question is simply this:
    Are there any significant benefits to one approach over the other? Enough to warrant a change from one to the other?

    I resorted to my method because I didn't know about yours, but mine works just dandy, so is there any substantial benefit to be gained from your method that might make me want to change over?

    Thanks
     
  28. ReneKok

    ReneKok

    Joined:
    Mar 6, 2015
    Posts:
    2
    Here's a whole script based on the previous examples in the thread, which works in Unity 5+. I figured I'd add it to the thread as it might help someone else.

    My main pitfall was scale: it was rendering a mesh whose vertices were in the range of 0 to 3, which came out so small I couldn't see it!

    I left in some code in the #if blocks that can be removed, but it helps to see the changes in the editor straight away.

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEngine;

    [ExecuteInEditMode, RequireComponent(typeof(CanvasRenderer))]
    public class UIMeshRenderer : MonoBehaviour
    {
        public Mesh mesh;
        public Material material;
        public float scale = 1f;

        private CanvasRenderer canvasRenderer;

    #if UNITY_EDITOR // only compile in editor
        private Mesh currentMesh;
        private Material currentMaterial;
        private float currentScale;
    #endif

        public void Awake()
        {
            canvasRenderer = GetComponent<CanvasRenderer>();
        }

        public void OnEnable()
        {
            SetMesh();
        }

        public void OnDisable()
        {
            canvasRenderer.Clear();
        }

    #if UNITY_EDITOR // only compile in editor
        public void Update()
        {
            if (mesh != currentMesh || material != currentMaterial || !Mathf.Approximately(scale, currentScale))
            {
                SetMesh();
            }
        }
    #endif

        public void SetMesh()
        {
            // clear the canvas renderer every time
            canvasRenderer.Clear();

    #if UNITY_EDITOR // only compile in editor
            currentMesh = mesh;
            currentMaterial = material;
            currentScale = scale;
    #endif

            if (mesh == null)
            {
                Debug.LogWarning("Mesh is null.");
                return;
            }
            else if (material == null)
            {
                Debug.LogWarning("Material is null.");
                return;
            }

            List<UIVertex> list = ConvertMesh();

            canvasRenderer.SetMaterial(material, null);
            canvasRenderer.SetVertices(list);
        }

        public List<UIVertex> ConvertMesh()
        {
            Vector3[] vertices = mesh.vertices;
            int[] triangles = mesh.triangles;
            Vector3[] normals = mesh.normals;
            Vector2[] uv = mesh.uv;

            List<UIVertex> vertexList = new List<UIVertex>(triangles.Length);

            UIVertex vertex;
            for (int i = 0; i < triangles.Length; i++)
            {
                vertex = new UIVertex();
                int triangle = triangles[i];

                vertex.position = ((vertices[triangle] - mesh.bounds.center) * scale);
                vertex.uv0 = uv[triangle];
                vertex.normal = normals[triangle];

                vertexList.Add(vertex);

                if (i % 3 == 0)
                    vertexList.Add(vertex);
            }

            return vertexList;
        }
    }
     
    Last edited: Sep 3, 2015
    MrDude likes this.
  29. warance

    warance

    Joined:
    May 4, 2014
    Posts:
    11
    What's the behavior of CanvasRenderer when you supply non-quad vertices to it? I am trying to render thick lines on the canvas. Previously I just created a mesh with the start and end vertex of each line and let a geometry shader generate the quad for a thick line. But now it seems that this method can't work anymore, since I have to pass in at least 4 vertices (a quad) to get the CanvasRenderer working?
     
  30. ReneKok

    ReneKok

    Joined:
    Mar 6, 2015
    Posts:
    2
    You can only supply quads to it; if you don't, you'll get log errors and some graphical corruption. So you have to convert from triangles to quads before you can pass the data to the canvas renderer. You can convert them easily enough, see the couple of examples above.
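    For the thick-line case asked about above, one possible sketch (illustrative, not from this thread) is to expand each line segment into a single quad on the CPU, doing what the geometry shader used to do:

    ```csharp
    // Sketch: expand a line segment (a -> b) into one CanvasRenderer quad of the
    // given thickness, since the CanvasRenderer itself only accepts quads.
    List<UIVertex> MakeLineQuad(Vector2 a, Vector2 b, float thickness)
    {
        // Perpendicular to the segment, scaled to half the thickness.
        Vector2 dir = (b - a).normalized;
        Vector2 offset = new Vector2(-dir.y, dir.x) * (thickness * 0.5f);

        List<UIVertex> quad = new List<UIVertex>(4);
        foreach (Vector2 corner in new[] { a - offset, a + offset, b + offset, b - offset })
        {
            UIVertex v = new UIVertex();
            v.position = corner;
            v.normal = Vector3.back; // face the camera so lit shaders don't go black
            quad.Add(v);
        }
        return quad;
    }
    ```

    Concatenate the quads for all segments into one list and pass it to SetVertices.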
     
  31. mwk888

    mwk888

    Joined:
    Jan 26, 2014
    Posts:
    16
    @ReneKok are you using Screen Space - Overlay or Screen Space - Camera? I have not been able to get anything to draw via quads through CanvasRenderer using Unity 5.1 in Screen Space - Overlay mode. Or maybe I haven't tried the right Material -- anyone know a material that actually draws with Canvas Renderer and -Overlay?
     
  32. Billy4184

    Billy4184

    Joined:
    Jul 7, 2014
    Posts:
    5,397
    AFAIK it has to be world space if you want it to render as a 3D object; Overlay is only for 2-dimensional stuff, in which case it would probably be better to use an Image. If you have stuff you want to render in Overlay mode, I suggest using two canvases in different render modes, which is what I did.
     
    mwk888 likes this.
  33. magnusfox

    magnusfox

    Joined:
    Nov 7, 2014
    Posts:
    18
    We tried this, and it worked like a charm in Unity up until at least Unity 5.3.0f4. However, updating to the latest 5.3.4p4 it suddenly stopped working completely. Would you by any chance have any idea on why this could have changed? I've tried just about anything I can think of, but no luck so far. Any help much appreciated!
     
  34. SimonDarksideJ

    SimonDarksideJ

    Joined:
    Jul 3, 2012
    Posts:
    1,650
    Is this still the case? In the UI Extensions project we use custom line rendering via the AddVertex routines and draw to a RectTransform on all canvases.

    Rendering meshes doesn't seem to work, though (will need to try the above); @phil-Unity and @Tim C indicated it would need a small shader change for it to actually render.
     
  35. Aohajin

    Aohajin

    Joined:
    Sep 18, 2014
    Posts:
    10
    Now CanvasRenderer.SetVertices is marked as obsolete...
    Plus, I found a lot of GC alloc generated from its internal List<> manipulation.
    According to the documentation, we should use CanvasRenderer.SetMesh now.
    Unfortunately I can't find any documentation or guideline about how, and I tried it myself without any success :(

    Need some help here.
     
  36. Dan-MacDonald

    Dan-MacDonald

    Joined:
    Oct 25, 2013
    Posts:
    17
    I'm also trying to use CanvasRenderer.SetMesh. I've created a new Mesh() programmatically and set its verts/tris/UVs. I've also set a material on the CanvasRenderer, but nothing shows up in the Scene view when my customized CanvasRenderer is present.
     
  37. Billy4184

    Billy4184

    Joined:
    Jul 7, 2014
    Posts:
    5,397
    Try the toon outline shader from standard assets. I haven't used CanvasRenderer.SetMesh yet though because SetVertices still works in 5.3.3.
     
  38. Dan-MacDonald

    Dan-MacDonald

    Joined:
    Oct 25, 2013
    Posts:
    17
    The following worked for me :)



    Code (CSharp):
    public class CanvasMesh : MonoBehaviour
    {
        public CanvasRenderer CanvasRenderer;
        public Material Material;
        private Mesh mesh;
        public Vector3[] newVertices;
        public Vector2[] newUV;
        public int[] newTriangles;

        // Use this for initialization
        void Start()
        {
            this.mesh = new Mesh();
            this.CanvasRenderer.SetMaterial(this.Material, null);
        }

        // Update is called once per frame
        void Update()
        {
            this.mesh.Clear();
            this.mesh.vertices = newVertices;
            this.mesh.uv = newUV;
            this.mesh.triangles = newTriangles;
            this.CanvasRenderer.SetMesh(this.mesh);
        }
    }
    The material just uses the default UI shader...
     
    Last edited: Jul 20, 2016
    sonofbryce likes this.
  39. HugoStudica

    HugoStudica

    Joined:
    Aug 25, 2016
    Posts:
    5

    Dan-MacDonald, is there any way to apply anti-aliasing to this only?
     
  40. Dan-MacDonald

    Dan-MacDonald

    Joined:
    Oct 25, 2013
    Posts:
    17
    It depends on how you want to use it. Anti-aliasing is an image effect, meaning that you need to know the pixels around what you are rendering, not just the pixels you are rendering. This could be done with a custom shader with a "Grab Pass" that performs anti-aliasing. That's a little beyond my shader-fu, so that's about all I can help with there.

    If your canvas mesh is always drawn on top of your scene, you could draw it with a separate camera whose clear flags are set to overlay, and apply anti-aliasing image effects to that camera.

    A quick and dirty way would be to create a texture with a triangle of the right shape, make the background transparent, and use Photoshop (or some art tool) to give it anti-aliased edges. Then use the UVs to map that texture onto your canvas mesh. When it draws, it will draw the anti-aliased triangle from the texture :)
     
  41. singhh90

    singhh90

    Joined:
    Jun 1, 2017
    Posts:
    1
    Does this mean that I cannot update the mesh of a CanvasRenderer natively, i.e. by getting a native buffer pointer to the mesh using mesh.GetNativeVertexBufferPtr(0) and updating the VBO data using OpenGL?

    Because I am trying to update a CanvasRenderer's mesh natively from my plugin, and it doesn't seem to work. But when I try the same thing with a MeshRenderer, it works.

    I am following the approach suggested by Unity for updating a mesh from native code:
    https://docs.unity3d.com/Manual/NativePluginInterface.html
    https://bitbucket.org/Unity-Technol...3011994ae74/NativeRenderingPlugin/?at=default
     
  42. MVS_

    MVS_

    Joined:
    Jul 2, 2017
    Posts:
    4
    Hi there,

    I'm fighting a bit trying to come up with a proper way to create a hexagon-shaped graph that shows the player's skills/attributes and changes depending on their values. Let me show you a quick mockup of what I'm after:

    (Just made up those stats, but you get what I mean.) Now, how could I achieve something like that? The way this works is simple: if the player tweaks a given skill, its 'vertex' of the green 'star' goes higher or lower depending on the skill's value.

    Initially I tried to create something like this in the form of some sort of progress bar, but to no avail. Is there a way to set up that kind of menu using Unity's UI tools with some of the approaches discussed in this thread? Would manually creating a mesh in code work here?


    Thanks in advance.
     
  43. SimonDarksideJ

    SimonDarksideJ

    Joined:
    Jul 3, 2012
    Posts:
    1,650
    You can achieve this with the UIPolygon control in the Unity UI Extensions project.
    upload_2017-7-3_13-4-16.png
    Doesn't have the inside lines, but you could add those with the UI Line Renderer in the project, or enhance the control.
    (Link in sig)

    Hope this helps.
     
    MVS_ likes this.
  44. MVS_

    MVS_

    Joined:
    Jul 2, 2017
    Posts:
    4
    Thanks a lot Simon! I'm sure it'll help, as soon as I have some time to work on this I'll keep you posted.
     
    SimonDarksideJ likes this.
  45. M-P-F

    M-P-F

    Joined:
    Apr 22, 2015
    Posts:
    11
    Is there any way to get meshes created with CanvasRenderer to have decent edges? It actually looks very nice in the editor Scene view, but the play window and (worse) the actual build look ugly and jagged around the edges (example):
    upload_2018-1-23_9-14-5.png

    This is using Unity 5.6.3f1, and the build is run with "Fantastic" quality. Since it looks fairly decent in the editor, one would think it's got to be possible?

    I tried the UILineRenderer from the UIExtensions, but it has the same issue.

    Camera settings, in case they are the problem:
    upload_2018-1-23_9-18-20.png
     
    Coredumping and Circool like this.
  46. Umresh

    Umresh

    Joined:
    Oct 14, 2013
    Posts:
    55
    Hi,
    I have a mesh in a MeshFilter and I want to convert it to a UI component (like the Image component). I tried the above methods and it works, but the EventSystem's raycast (pointer click, etc.) is not working. Can anyone point me in the right direction?
     
  47. atr0phy

    atr0phy

    Joined:
    Nov 5, 2014
    Posts:
    13
    You'd probably have better luck posting a new thread, especially considering how old this one is. But off the top of my head, everything you need to know about dispatching and responding to UI events is pretty well documented here: https://docs.unity3d.com/Manual/UIE-Events.html
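    One hedged direction for the raycast question above (a sketch, not code from this thread; it assumes Unity 5.2+, where Graphic.OnPopulateMesh takes a VertexHelper): derive from Graphic instead of driving the CanvasRenderer directly, so the element participates in normal Graphic raycasting and receives pointer events.

    ```csharp
    using UnityEngine;
    using UnityEngine.UI;

    // Sketch: a Graphic subclass that feeds an arbitrary Mesh into the UI system.
    // Because it derives from Graphic, its "Raycast Target" flag makes it visible
    // to the GraphicRaycaster, so pointer clicks etc. can hit it.
    [RequireComponent(typeof(CanvasRenderer))]
    public class MeshGraphic : Graphic
    {
        public Mesh mesh;

        protected override void OnPopulateMesh(VertexHelper vh)
        {
            vh.Clear();
            if (mesh == null)
                return;

            Vector3[] verts = mesh.vertices;
            Vector2[] uvs = mesh.uv;
            for (int i = 0; i < verts.Length; i++)
                vh.AddVert(verts[i], color, i < uvs.Length ? uvs[i] : Vector2.zero);

            int[] tris = mesh.triangles;
            for (int i = 0; i < tris.Length; i += 3)
                vh.AddTriangle(tris[i], tris[i + 1], tris[i + 2]);
        }
    }
    ```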
     