
Feedback Wanted: Mesh scripting API improvements

Discussion in 'Graphics Experimental Previews' started by Aras, May 26, 2019.

  1. KillHour

    KillHour

    Joined:
    Oct 25, 2015
    Posts:
    49
    I'm working on a custom fork of the current HDRP package that will let me do this. I'll report back when (if) I get it working...
     
    deus0 and Mr-Mechanical like this.
  2. KillHour

    KillHour

    Joined:
    Oct 25, 2015
    Posts:
    49
    The good news - I was able to modify the HDRP shaders to optionally take a compute buffer for vertex data without breaking anything else (as far as I could tell).

    The bad news - shader graph is totally independent of that and has some crazy C# shader string builder that I'll need to figure out.
     
    BanJaxe and Mr-Mechanical like this.
  3. caioc2

    caioc2

    Joined:
    May 11, 2018
    Posts:
    8
    I've been testing with the new mesh slice set/get and the results are not good. (2019.3.0a5)

My case is very simple: every frame I generate the "skeleton" of the mesh (variable number of vertices) and expand it in the geometry shader.

Using the Mesh class it takes ~16ms per frame to set up the data. Skipping the Mesh class entirely and passing the same data through compute buffers with DrawProcedural takes ~0.7ms per frame.

I have no idea how much copying/reallocating and checking there is inside the Mesh class, but in its current form it is far from the performance it should have.

Looking forward to SetIndexBufferData & SetVertexBufferData.
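
For reference, the compute-buffer path described above can be sketched roughly like this (buffer/material names and the upload step are hypothetical, not the poster's actual code; it assumes the shader reads vertices from a StructuredBuffer via SV_VertexID):

Code (CSharp):
using UnityEngine;

// Sketch: bypass the Mesh class and feed vertex data straight to a shader.
public class ProceduralMeshDraw : MonoBehaviour
{
    public Material material; // shader declares StructuredBuffer<float3> _Vertices (hypothetical name)
    ComputeBuffer vertexBuffer;
    int vertexCount;

    public void UploadVertices(Vector3[] vertices)
    {
        vertexCount = vertices.Length;
        vertexBuffer?.Release();
        vertexBuffer = new ComputeBuffer(vertexCount, sizeof(float) * 3);
        vertexBuffer.SetData(vertices); // straight copy, no Mesh validation
        material.SetBuffer("_Vertices", vertexBuffer);
    }

    void Update()
    {
        // No Mesh object involved; bounds and culling are your responsibility.
        Graphics.DrawProcedural(material, new Bounds(Vector3.zero, Vector3.one * 100f),
            MeshTopology.Triangles, vertexCount);
    }

    void OnDestroy() => vertexBuffer?.Release();
}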
     
  4. Dingoblue

    Dingoblue

    Joined:
    Aug 23, 2018
    Posts:
    2
    Is there no version of SetTriangles that takes a NativeArray of ushort or int?
I'm using:
filter.sharedMesh.SetIndices(Triangles, MeshTopology.Triangles, 0);
to work around this for now, but it seems like an oversight.
     
    Lynxed likes this.
  5. KillHour

    KillHour

    Joined:
    Oct 25, 2015
    Posts:
    49
I modified the Shadergraph package to inject my custom vertex code into the generated shaders (right now, it's just a hardcoded triangle for testing). It works in the preview windows, but the main preview doesn't reflect it, and neither does using the shader in a scene. Does the master node get its vertex information from somewhere else? What is going on here? (Screenshot attached: shadergraphh.PNG)

    Edit:

    Oh crap, I know EXACTLY what's going on here. That "position" input is just completely replacing anything I do with vertex shaders. I'm a dummy.

    Double Edit:

It should be possible to do all of this with just a custom function node if I could get SV_VertexID into it. I don't think that's possible right now, but I'm going to keep digging through the code. Might be a fairly easy request for the shadergraph team.
     
    Last edited: Jul 19, 2019
    LooperVFX likes this.
  6. KillHour

    KillHour

    Joined:
    Oct 25, 2015
    Posts:
    49
    I figured it out - Shader Graph does use the HDRP shaders, but ONLY for the actual output, not the preview nodes. I didn't actually have to modify the Shader Graph package at all. I just had to turn on the define I used to turn the procedural vertex piece on. I created a dummy custom function that didn't do anything except sneak a #define in there.

Code (csharp):
#define PROCEDURAL_MESH_ON
Out = In;
(Screenshot attached: blob.PNG)

    That's an unlit base shader - I'm using shader graph to lower the brightness based on the normal angle from the camera.
     
    Last edited: Jul 19, 2019
  7. alexnown

    alexnown

    Joined:
    Oct 25, 2018
    Posts:
    22
This API is very useful for me. In my game I serialize flat mesh data (Vector2[] vertices and ushort[] triangles) to a BlobAsset, then at runtime cast BlobReference<T> -> NativeArray<T> and set it on the mesh. Is it already possible to set the vertex dimension to 2? Right now I always get this error:
Code (CSharp):
ArgumentException: SetVertices with NativeArray should use struct type that is 12 bytes (3x float) in size
Will it be possible in the future? It would be wrong to serialize flat mesh vertices as Vector3 with z always 0.
     
  8. Lynxed

    Lynxed

    Joined:
    Dec 9, 2012
    Posts:
    121
As soon as we get SetFoo, can we please also have GetFoo(NativeArray<T>)? Including GetTriangles(NativeArray<int>) and GetColors(NativeArray<float4>), pretty please?
Having SetTriangles(NativeArray<int>) and SetColors(NativeArray<float4>) would also be awesome.
It's a bit hard to count bytes and pointers.
     
    Minchuilla likes this.
  9. PetrS

    PetrS

    Joined:
    Feb 10, 2014
    Posts:
    2
    +1 for the above mention of unmanaged getters - GetXXX(NativeArray<T>).

    The copy-free internal data getters would be ideal (mentioned as "under consideration" in the doc), but if it turns out they are not feasible, a way to still access data without going through managed memory (i.e. just native->native copy) would be highly appreciated.
     
    Minchuilla likes this.
  10. Carpet_Head

    Carpet_Head

    Joined:
    Nov 27, 2014
    Posts:
    258
Is there no way to get the vertex buffer as a NativeArray<float3>? Even with a memcpy, it would be fine.
     
  11. alexnown

    alexnown

    Joined:
    Oct 25, 2018
    Posts:
    22
Are there any examples of using the SetIndexBufferData API?
I tried it for setting triangles, but only SetIndices works correctly.
Code (CSharp):
// Not correct: mesh.triangles array still has zero length
mesh.SetIndexBufferParams(triangles.Length, IndexFormat.UInt16);
mesh.SetIndexBufferData(triangles, 0, 0, triangles.Length);

// Correct way to set triangles from a NativeArray<ushort>
mesh.SetIndices(triangles, MeshTopology.Triangles, 0);
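
A likely explanation, going by the Mesh scripting docs: the raw index buffer isn't referenced by anything until SetSubMesh ties a range of it to a submesh, so mesh.triangles stays empty. A sketch of the missing step, continuing the snippet above:

Code (CSharp):
mesh.SetIndexBufferParams(triangles.Length, IndexFormat.UInt16);
mesh.SetIndexBufferData(triangles, 0, 0, triangles.Length);
// Without this, no submesh references the uploaded indices:
mesh.SetSubMesh(0, new SubMeshDescriptor(0, triangles.Length, MeshTopology.Triangles));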
     
    MUGIK and owen_proto like this.
  12. owen_proto

    owen_proto

    Joined:
    Mar 18, 2018
    Posts:
    120
    Seconding this. Documentation is a bit lacking. Verts seem to update fine but triangle array is empty.
     
    Last edited: Sep 6, 2019
    MUGIK likes this.
  13. RoughSpaghetti3211

    RoughSpaghetti3211

    Joined:
    Aug 11, 2015
    Posts:
    1,708
My wish for the mesh API is a more complete function set. If I compare it to something like Maya's MFnMesh you can see the difference. I know Unity is not a DCC, but I want it to be one :)

Mesh function set
https://download.autodesk.com/us/maya/2011help/API/class_m_fn_mesh.html

It would also be awesome to have some sort of Jobs iterators for meshes:
IJob...For<MeshVertex>
I'll post Maya's API below as an example of what I mean. Not necessarily the best, but one I'm familiar with. Just my 2c.

    Mesh Iterators

    https://download.autodesk.com/us/maya/2011help/API/class_m_it_mesh_vertex.html

    https://download.autodesk.com/us/maya/2011help/API/class_m_it_mesh_polygon.html

    https://download.autodesk.com/us/maya/2011help/API/class_m_it_mesh_edge.html
     
    Last edited: Sep 17, 2019
    awesomedata likes this.
  14. Nothke

    Nothke

    Joined:
    Dec 2, 2012
    Posts:
    112
    I tried using a custom vertex struct with Mathematics' half3 and VertexAttributeFormat.Float16, but I get

    ArgumentException: Invalid vertex attribute format+dimension value (Float16 x 3, data size must be multiple of 4)

...But it works with float3, which is a multiple of 3... so why wouldn't it work with 3 Float16s?

Also, if I put anything but 3 dimensions in VertexAttributeDescriptor(), Unity crashes. I guess this is very bad :|

Code (CSharp):
using UnityEngine.Rendering;
using Unity.Mathematics;
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential)]
public struct Vertex
{
    public half3 pos;
    public half3 normal;

    public static VertexAttributeDescriptor[] GetVertexAttributes()
    {
        return new[]
        {
            new VertexAttributeDescriptor(VertexAttribute.Position, VertexAttributeFormat.Float16, 3),
            new VertexAttributeDescriptor(VertexAttribute.Normal, VertexAttributeFormat.Float16, 3),
        };
    }
}
     
  15. JesOb

    JesOb

    Joined:
    Sep 3, 2012
    Posts:
    1,109
half3 = 6 bytes, which is not a multiple of 4.
float3 = 12 bytes, which is a multiple of 4.

That is what the message says.

Try using half4 = 8 bytes and ignore the 4th element, or use it for something else.

Maybe this still won't allow you to use half4 :(
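
A sketch of that suggestion: pad each attribute to a 4-byte multiple with half4 (struct and field names are made up; whether the runtime of that era actually accepts half4 is another matter, see the crash reports below):

Code (CSharp):
using Unity.Mathematics;
using UnityEngine.Rendering;
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential)]
public struct VertexPadded
{
    public half4 pos;    // 8 bytes; w unused, or repurposed for something else
    public half4 normal; // 8 bytes

    public static VertexAttributeDescriptor[] GetVertexAttributes()
    {
        return new[]
        {
            new VertexAttributeDescriptor(VertexAttribute.Position, VertexAttributeFormat.Float16, 4),
            new VertexAttributeDescriptor(VertexAttribute.Normal, VertexAttributeFormat.Float16, 4),
        };
    }
}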
     
    Nothke likes this.
  16. Nothke

    Nothke

    Joined:
    Dec 2, 2012
    Posts:
    112
Ahh, bytes! The error message should say "bytes" instead of just "data size", which is confusing.

Yup, I changed everything to half4 and Unity just crashes.

Strange, since their example uses Vector3 (as 3x Float32), 2x ushort (as 2x Float16) and Color32 (as 4x UNorm8), so the conversion obviously should be smooth.
     
  17. JesOb

    JesOb

    Joined:
    Sep 3, 2012
    Posts:
    1,109
I think you need to file a bug about the half4 crash.
I've even thought about using UNorm8 (x4) for position someday, because I don't need precision greater than 256 per component.
     
  18. Nothke

    Nothke

    Joined:
    Dec 2, 2012
    Posts:
    112
    Filed it!
     
    JesOb likes this.
  19. OlegPavlov

    OlegPavlov

    Joined:
    Jun 22, 2014
    Posts:
    13
This update is FIRE!
My mesh setup time was reduced almost 16x!

When I call SetBuffer<T>, does it just pass the buffer to the graphics API without storing it? If so, that would be great! It would mean the user holds the data and can do anything with it.

But my main thread is still filled with mesh setups.

We'd also like to see these fine vertex precision controls in the Model Importer, so we could optimize meshes from the editor without writing any code.
     
    JesOb likes this.
  20. OlegPavlov

    OlegPavlov

    Joined:
    Jun 22, 2014
    Posts:
    13
Bugs I found in 2019.3.0b3:

SetIndices(calculateBounds: true) doesn't calculate bounds (doesn't really bother me).

SetVertexBufferParams - the position of an attribute inside the layout array doesn't define the struct layout; it looks like the order is predefined based on attribute type.

If using Format.SInt8 for VertexAttribute.Position, Mesh.RecalculateBounds() crashes the editor.
     
    Last edited: Sep 23, 2019
  21. rsodre

    rsodre

    Joined:
    May 9, 2012
    Posts:
    229
Not sure if this is the place, but it would be nice to have PLA (Point Level Animation) support in Unity.
I use Cinema 4D for modeling and animation, and the lack of PLA severely limits the animation tools I can use.
     
  22. DrSeltsam

    DrSeltsam

    Joined:
    Jul 24, 2019
    Posts:
    101
    I love the new mesh API so far, this is really a major improvement! But it would be great if we could also call SetVertexBufferData() with a pointer to unmanaged memory (with an option to set the element size manually), same for SetIndexBufferData(). Something like SetVertexBufferData(void* ptr, int elemSize, int length, ...). Or direct access to InternalSetVertexBufferData() alternatively :D

Currently my "workaround" is to create a NativeArray from my pointer temporarily (NativeArrayUnsafeUtility), assign an AtomicSafetyHandle (otherwise the mesh API doesn't like my NativeArray), then set the vertex buffer data and release the safety handle again. It works like a charm, but it would be nice if I didn't have to take these extra steps. Apparently SetVertexBufferData() just gets the underlying pointer from the NativeArray anyway...
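
The workaround described reads roughly like this (a sketch: Vertex generic parameter and method names are placeholders, and the ENABLE_UNITY_COLLECTIONS_CHECKS guards are an assumption about how the safety checks are compiled in):

Code (CSharp):
using Unity.Collections;
using Unity.Collections.LowLevel.Unsafe;
using UnityEngine;

public static class MeshPointerUpload
{
    public static unsafe void SetVertices<T>(Mesh mesh, void* ptr, int count) where T : struct
    {
        // Wrap the raw pointer in a NativeArray without copying.
        var array = NativeArrayUnsafeUtility.ConvertExistingDataToNativeArray<T>(
            ptr, count, Allocator.None);
#if ENABLE_UNITY_COLLECTIONS_CHECKS
        // The mesh API validates the safety handle, so attach a temporary one.
        var safety = AtomicSafetyHandle.Create();
        NativeArrayUnsafeUtility.SetAtomicSafetyHandle(ref array, safety);
#endif
        mesh.SetVertexBufferData(array, 0, 0, count);
#if ENABLE_UNITY_COLLECTIONS_CHECKS
        AtomicSafetyHandle.Release(safety);
#endif
    }
}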
     
    OlegPavlov likes this.
  23. TheZombieKiller

    TheZombieKiller

    Joined:
    Feb 8, 2013
    Posts:
    266
    Could we also get List<T> and T[]-based overloads for SetBoneWeights? Currently there's only a NativeArray<T> version, meaning I need to do this:

Code (CSharp):
using (var bonesPerVertex = new NativeArray<byte>(BonesPerVertex.Count, Allocator.Persistent))
using (var boneWeights = new NativeArray<BoneWeight1>(BoneWeights.Count, Allocator.Persistent))
{
    MemoryUtility.CopyToNativeArray(BonesPerVertex, bonesPerVertex);
    MemoryUtility.CopyToNativeArray(BoneWeights, boneWeights);
    mesh.SetBoneWeights(bonesPerVertex, boneWeights);
}
    Which isn't super ideal. It'd also be nice to have a SetBindPoses method, as currently only the array-based property setter exists.
     
  24. iamarugin

    iamarugin

    Joined:
    Dec 17, 2014
    Posts:
    883
    I think you can also use Allocator.Temp here.
     
    TheZombieKiller likes this.
  25. o1o1o1o1o2

    o1o1o1o1o2

    Joined:
    May 22, 2014
    Posts:
    34

Can you please look at this thread https://forum.unity.com/threads/how-to-cache-mesh-getnativevertexbufferptr.762197/#post-5075690; I have a question about what GetNativeVertexBufferPtr() does.
     
  26. Carpet_Head

    Carpet_Head

    Joined:
    Nov 27, 2014
    Posts:
    258
We are having some trouble with the new buffer APIs and mesh bounds:

Code (CSharp):
VertexAttributeDescriptor[] vertexLayout = new[]
{
    new VertexAttributeDescriptor(VertexAttribute.Position),
    new VertexAttributeDescriptor(VertexAttribute.Normal)
};

MeshUpdateFlags flags = MeshUpdateFlags.DontValidateIndices;

meshToPopulate.SetVertexBufferParams(finalNumVertices, vertexLayout);
meshToPopulate.SetVertexBufferData(vertexBuffer, 0, 0, finalNumVertices, 0, flags);

meshToPopulate.SetIndexBufferParams(3 * finalNumTriangles, IndexFormat.UInt32);
meshToPopulate.SetIndexBufferData(triangles, 0, 0, 3 * finalNumTriangles, flags);
meshToPopulate.SetSubMesh(0, new SubMeshDescriptor(0, 3 * finalNumTriangles), flags);

    Take this code snippet for example. I would expect that, because we do not set MeshUpdateFlags.DontRecalculateBounds, the mesh bounds would be calculated. However, they aren't. Is there a reason for this?
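
In practice, the buffer-level setters don't seem to touch Mesh.bounds on their own; an explicit recalculation after SetSubMesh appears to be required (a hedged workaround, not an official answer):

Code (CSharp):
meshToPopulate.SetSubMesh(0, new SubMeshDescriptor(0, 3 * finalNumTriangles), flags);
// Either recompute bounds from the uploaded vertices...
meshToPopulate.RecalculateBounds();
// ...or assign them directly if you already know them:
// meshToPopulate.bounds = precomputedBounds;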
     
    sayangel likes this.
  27. MUGIK

    MUGIK

    Joined:
    Jul 2, 2015
    Posts:
    481
Guys, did you find a solution to this?
My triangles array is also empty, while the vertex buffer data is set correctly.

UPD:
Fixed!
To make it work you need to do the setup in this order:
Code (CSharp):
_mesh.SetVertexBufferParams(vertexCount, MeshLayout);
_mesh.SetIndexBufferParams(indexCount, IndexFormat.UInt32);
_mesh.SetIndexBufferData(indexBuffer, 0, 0, indexCount, MeshUpdateFlags);
_mesh.SetSubMesh(0, new SubMeshDescriptor(0, indexCount, MeshTopology.Triangles), MeshUpdateFlags);
     
    Last edited: Oct 24, 2019
  28. Kronnect

    Kronnect

    Joined:
    Nov 16, 2014
    Posts:
    2,906
    Any chance to set the SortingOrder when using DrawMesh?
     
  29. Kronnect

    Kronnect

    Joined:
    Nov 16, 2014
    Posts:
    2,906
These new APIs are a super useful addition and a boost to any procedural generation. Very happy to finally see improvements coming in this area!
     
  30. vx4

    vx4

    Joined:
    Dec 11, 2012
    Posts:
    181
  31. TheZombieKiller

    TheZombieKiller

    Joined:
    Feb 8, 2013
    Posts:
    266
    Is there an example anywhere explaining how to do partial vertex buffer updates? I can't seem to get it to work. Am I correct in assuming that "meshBufferStart" should be set to the index of the first vertex I want to update?

    EDIT: Looks like we'll be getting even more new APIs in 2020.1: https://twitter.com/aras_p/status/1199595539425374208
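
Assuming that reading is right (it matches the parameter order in the docs), a partial update would look something like this (verts, start and count are hypothetical):

Code (CSharp):
// Overwrite `count` vertices starting at index `start` of the mesh's own
// vertex buffer, reading from the beginning of `verts`.
mesh.SetVertexBufferData(verts,
    dataStart: 0,           // first element to read from `verts`
    meshBufferStart: start, // destination vertex index in the mesh buffer
    count: count,
    stream: 0,
    flags: MeshUpdateFlags.DontRecalculateBounds);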
     
    Last edited: Nov 27, 2019
  32. Kichang-Kim

    Kichang-Kim

    Joined:
    Oct 19, 2010
    Posts:
    1,011
Hi. What about BlendShapes? I checked the new Mesh API in the 2019.3 documentation, but there is no native-array version of the API for manipulating BlendShapes.
     
    TheZombieKiller likes this.
  33. SLGSimon

    SLGSimon

    Joined:
    Jul 23, 2019
    Posts:
    80
It would be very helpful if all input List<T> parameters could be replaced with IList<T>, which would allow custom lists (e.g. for pooling) and, as a bonus, System.Array.
     
  34. Kichang-Kim

    Kichang-Kim

    Joined:
    Oct 19, 2010
    Posts:
    1,011
    @Aras

Hi. What happens in the following situations?

1. Set Float16, dimension 4 (half4) for the position attribute: in the shader, is the 4th element ignored?
2. Set dimension 2 for normal: what value is set for the 3rd (z) element?
3. UNorm8 for tangent: is it valid for the [0,1] range only? What about signed values [-1,1]? Also, how do you set a UNorm8 value from a C# script? It seems that neither C# nor Unity.Mathematics has a UNorm8 struct.

Cases 2 and 3 appear in Unity's official sample (https://docs.unity3d.com/ScriptReference/Rendering.VertexAttributeDescriptor.html).
     
  35. Aurelinator

    Aurelinator

    Joined:
    Feb 27, 2012
    Posts:
    19
    Since you guys are still taking feedback, I've got a suggestion or thought about something in a completely different direction. There's a lot of editor tooling that relies on making changes when things change. For example, I have a CSG tool that intersects meshes together dynamically in the editor - but if I go into my modeler and change the mesh, I'd like to know that the mesh itself was changed.

    So in my current case: I have a scene file that has a mesh reference and a "cache" that tells me how the mesh was earlier.

Up until now, in order to decide if an "old mesh" has changed, I compare the number of verts and submeshes, as that's the least expensive cache to maintain. If a user just moves a vertex, my tool can't tell. I have no way of caching some information about a given mesh, then later comparing it with the current instance of the mesh to see if something has changed. We've toyed around with using MD5 hashes of the mesh file, but that can get expensive.

    It would be really interesting to have a way of quickly comparing if two meshes are basically the same by having some sort of "mesh hash". That is, I can store the mesh hash, then later quickly recompute it to see if my mesh has changed and I need to do editor time recomputing.

    Basically mesh assets can change in a 3D modeler and there's no current way of knowing if the data you've computed from your mesh needs to be updated if the mesh has changed. A simple mesh hash would be great for "equality" comparison.
     
    awesomedata likes this.
  36. tteneder

    tteneder

    Unity Technologies

    Joined:
    Feb 22, 2011
    Posts:
    175
Is it possible (with the legacy or new Mesh API) to re-use vertex buffers/attributes amongst multiple Meshes without implicitly creating copies of them?

    If not, consider this a feature suggestion :)

    My use case in detail:
    I'm writing a glTF loader and some glTF files have just one giant vertex attribute array per file/scene that is referenced from hundreds of objects/meshes. With the legacy mesh API it seems that the giant buffer is uploaded to the GPU once for every Mesh instance, which has terrible performance and wastes a lot of RAM.

    Thanks!
     
    TheZombieKiller likes this.
  37. Kichang-Kim

    Kichang-Kim

    Joined:
    Oct 19, 2010
    Posts:
    1,011
The current version (2019.3) of the Mesh API lacks native read methods. There are only APIs for getting Vector3[] or List<Vector3>. Is there any plan to add methods for reading values natively?

It seems that GetNativeVertexBufferPtr() does this, but its layout does not match Mesh.GetVertexAttributes().
     
    Vanamerax likes this.
  38. TheZombieKiller

    TheZombieKiller

    Joined:
    Feb 8, 2013
    Posts:
    266
    GetNativeVertexBufferPtr() gives you a pointer to the underlying graphics API's representation of the data, rather than direct access.

    However, the Unity 2020 alpha offers the functionality you're looking for: https://docs.unity3d.com/2020.1/Documentation/ScriptReference/Mesh.AcquireReadOnlyMeshData.html
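
Going by the linked 2020.1 docs, usage looks roughly like this (a sketch; method and class names other than the Mesh API calls are made up):

Code (CSharp):
using Unity.Collections;
using UnityEngine;

static class MeshReadback
{
    // Read vertex positions without going through managed arrays.
    public static void ReadPositions(Mesh mesh)
    {
        using (Mesh.MeshDataArray dataArray = Mesh.AcquireReadOnlyMeshData(mesh))
        {
            Mesh.MeshData data = dataArray[0];
            var positions = new NativeArray<Vector3>(data.vertexCount, Allocator.Temp);
            data.GetVertices(positions); // native -> native copy
            // ... use positions ...
            positions.Dispose();
        }
    }
}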
     
    Vanamerax and Kichang-Kim like this.
  39. Aras

    Aras

    Unity Technologies

    Joined:
    Nov 7, 2005
    Posts:
    4,770
    Depends how you read it in the shader. If you read it as a 3-dimensional float, then yes the 4th component is just ignored. If you read it as a 4-dimensional float, then you get all 4 numbers. What you do with them, is up to you.

    Kinda similar. I think on all graphics APIs, the defaults for any vertex shader inputs are (0,0,0,1) -- meaning if the vertex shader tries to read a float4 but the data is only float2, then .zw get (0,1) values. In your example, if the data is only 2 numbers but you read it as float3 in the shader, then z is zero.

    UNorm8 is 0..1 range, yes (https://docs.unity3d.com/2019.3/Doc...e/Rendering.VertexAttributeFormat.UNorm8.html). SNorm8 is 8 bits with range of -1 to 1 (https://docs.unity3d.com/2019.3/Doc...e/Rendering.VertexAttributeFormat.SNorm8.html).

    Any C# struct that matches the data layout will do. For example, for 4-dimensional UNorm8 values, a Color32 struct matches the layout. Or your own struct with 4 bytes.
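
A sketch of that last point: pairing Color32 in the vertex struct with a UNorm8 x 4 attribute in the layout (struct and field names are made up):

Code (CSharp):
using UnityEngine;
using UnityEngine.Rendering;
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential)]
struct VertexWithColor
{
    public Vector3 pos;   // Float32 x 3 = 12 bytes
    public Color32 color; // UNorm8 x 4 = 4 bytes; byte 0..255 maps to 0.0..1.0 in the shader
}

static class VertexWithColorLayout
{
    public static readonly VertexAttributeDescriptor[] Attributes =
    {
        new VertexAttributeDescriptor(VertexAttribute.Position, VertexAttributeFormat.Float32, 3),
        new VertexAttributeDescriptor(VertexAttribute.Color, VertexAttributeFormat.UNorm8, 4),
    };
}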
     
    a436t4ataf and Kichang-Kim like this.
  40. Aras

    Aras

    Unity Technologies

    Joined:
    Nov 7, 2005
    Posts:
    4,770
    You basically would want one vertex buffer to be shared between several individual Meshes in Unity? Today that does not work indeed. You can have one Mesh with multiple sub-meshes, but that's the only workaround I can think of.
     
    tteneder likes this.
  41. Kichang-Kim

    Kichang-Kim

    Joined:
    Oct 19, 2010
    Posts:
    1,011
@Aras Hi. I found that a non-Float32 format for position, normal and tangent makes the editor crash when the mesh is used with a SkinnedMeshRenderer. Is skinning for non-Float32 not supported yet?

A minimal reproducible project was sent as Case 1215575.
     
  42. Aras

    Aras

    Unity Technologies

    Joined:
    Nov 7, 2005
    Posts:
    4,770
    If the crash is in CalculateBoneBindPoseBounds, then yes -- an oversight, already fixed in 2020.1, will get backported to 2019.3 soon (link to existing issue).

    That said, a bunch of CPU-based things in Unity explicitly don't support non-Float32 vertices, e.g. CPU skinning, dynamic batching etc. GPU skinning does work though.
     
    Kichang-Kim likes this.
  43. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    This concerns me.

I'm working on building a world in Houdini using their new 64-bit vertex positioning information. I would like to bring my mesh into Unity with 1:1 positions, with the correct 64-bit world tile positions intact, so I can use them as scene and prefab loading positions.

    In practice, I would need a floating world origin to stream these correctly in-game, but I would like to know their actual in-world positions ahead of time (in-editor) to shift the meshes around easily so I can use tools like Probuilder and others to modify the gameobjects on particular tiles without losing information.

    Is something like this in the works?

    @Joachim_Ante mentioned a floating point streaming system is in the works, but I'm not sure if he meant that it applies to the in-editor workflows. Something like the fundamental mesh system would be involved at some point for workflows like mine.

    For example, I use a lot of procedural mesh generation tools in my work, but I want to move most of these to Unity. These procedural tools would have to be compatible (in some capacity) with incoming 64-bit mesh data in order for my workflow to be practical. What is the suggested process for integrating this kind of workflow into Unity? Would you be able to point me at a potential mesh workflow solution, @Aras? Right now, this seems impossible in vanilla Unity.
    Oh, did I mention those 64-bit mesh tiles would have to change shape, both in-editor AND at runtime? -- This is the foundation of my game's design, so I cannot fudge things on the technology side here...

    -- After all, all of those nice new procedural meshes I make in the editor using my procedural tools would have to be shifted back waaaay off in the 64 bit world space (even if they aren't displayed).
     
    Last edited: Feb 9, 2020
  44. Aras

    Aras

    Unity Technologies

    Joined:
    Nov 7, 2005
    Posts:
    4,770
    Unity does not support 64-bit positions natively anywhere. You could encode 64-bit data somehow, and then reconstruct that in a shader or whatever, but also keep in mind that like 99% of all GPUs don't support 64-bit numbers either.

    (not sure why each mesh would need 64-bit positions though... most if not all "large world" content that I've seen only needs "larger than 32 bit" world space positions of the objects, not for data within each individual mesh)
     
  45. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    Right. This is kinda a core issue. On top of the mesh issue, I also cannot author 64-bit positioned content without being able to also position it in 64 bits.

Since Unity is never going to support 64-bit positioning at runtime, I at least need the editor mesh data to support 64-bit positioning at authoring time. That way I can correctly (and easily) position even the largest and most distant meshes and submeshes in my world straight from (and back to) my modeling package (i.e. Houdini), and iterate on the artwork without writing a separate, awkward tool whose sole purpose is to convert positioning (and mesh!) data back and forth between 32 and 64 bits just for positioning verts!

    You have no idea how huge of a headache it is to deal with a 64-bit mesh conversion process when almost none of your standard tools work. I don't care if there is an entirely different editor-only mesh API that ONLY works in the Unity Editor for this purpose alone, but you have no idea the headaches it will save artists like me. Without support on the mesh level for 64-bit conversion (usually through proprietary tools and native Unity Editor hacks that vary from game studio to game studio and project to project), it's a total nightmare to do this kind of back-and-forth on open-world art without help (again, on the mesh level.) Not to get personal, but aren't you tired of seeing procedural open-world art that looks TOO procedural? Yeah? That's because back-and-forth mesh iteration is a BEAST without a custom mesh handling situation. Not every developer or small studio can afford somebody to write this kind of tool, but Unity can eliminate the need for it by simply offering a little helping hand via an (editor only) 64-bit data type for mesh data for back-and-forth mesh iteration and in-engine visualization purposes.

    The good news is that, again, artists really only need that 64-bit mesh data during authoring time -- Beyond visualization and iteration purposes, we can use mesh conversion (i.e. 64-bit global positions to 16-bit local positions chunked to disk for streaming) to handle the display portion of it (for both runtime AND editor navigation). However, the mesh itself should be able to be positioned 1:1 in a world during the back-and-forth authoring process in order to make it easy to transition between 64-bit modeling software and Unity without a whole other software to handle the mesh conversion when all you really need is the mesh to not lose its vertex data precision when its verts are positioned very far away from the origin. Again, visualization and iteration are key to the execution of a great design -- so help me out, yeah?


    See above. Again, I need to position the verts in world space in my modeling package for them to show up in the very same (correct) place in Unity so that I can both iterate on and visualize my world without losing any vertex or positioning data. I need the mesh data to survive the authoring process so that back-and-forth iteration is possible in the editor. Then, at runtime, I simply convert the 64-bit to standard Unity positions (mixed with the floating-origin system @Joachim_Ante mentioned, and stream them however I need to so they can be displayed properly. Then, when I jump out of play mode, the meshes are converted back to their 64-bit positioning situation for further iteration.

    Yeah?
     
  46. JiangBaiShi

    JiangBaiShi

    Joined:
    Aug 3, 2019
    Posts:
    27
Hi, I think exposing mesh APIs for compute shaders & buffers would be a good idea. I've been watching the progress of GraphicsBuffer in the newest versions of Unity, and I really hope for an interface to bind custom mesh data from Compute/Graphics buffers as native mesh index & vertex buffers.
     
    Minchuilla, snacktime and awesomedata like this.
  47. tteneder

    tteneder

    Unity Technologies

    Joined:
    Feb 22, 2011
    Posts:
    175
Jobified/async/threaded versions of RecalculateNormals and RecalculateTangents would be nice!
Calculating tangents for a 4-million-triangle mesh takes ~1 second on my laptop, blocking the main thread, unfortunately.
I'm seriously considering re-implementing MikkTSpace in a C# job, but why not at engine level?
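
Until then, a hand-rolled, area-weighted normal pass can at least be Bursted off the main thread. A sketch (not MikkTSpace, and tangents are the harder part):

Code (CSharp):
using Unity.Burst;
using Unity.Collections;
using Unity.Jobs;
using Unity.Mathematics;

[BurstCompile]
struct RecalculateNormalsJob : IJob
{
    [ReadOnly] public NativeArray<float3> vertices;
    [ReadOnly] public NativeArray<int> indices;
    public NativeArray<float3> normals; // must be zero-initialized, one per vertex

    public void Execute()
    {
        // Accumulate unnormalized face normals; the cross product's magnitude
        // is twice the triangle area, so this is implicitly area-weighted.
        for (int i = 0; i < indices.Length; i += 3)
        {
            int a = indices[i], b = indices[i + 1], c = indices[i + 2];
            float3 n = math.cross(vertices[b] - vertices[a], vertices[c] - vertices[a]);
            normals[a] += n; normals[b] += n; normals[c] += n;
        }
        for (int v = 0; v < normals.Length; v++)
            normals[v] = math.normalizesafe(normals[v]);
    }
}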
     
    awesomedata likes this.
  48. tteneder

    tteneder

    Unity Technologies

    Joined:
    Feb 22, 2011
    Posts:
    175
    Might be good to share:
I haven't ported my code to the new API yet, but I noticed that setting vertices, UVs and normals the old way became twice as slow:

2019.3.3: upload_2020-2-27_12-11-36.png (profiler screenshot)
2018.2: upload_2020-2-27_12-11-54.png (profiler screenshot)

    Probably one more reason to port it and re-profile.
     
  49. cultureulterior

    cultureulterior

    Joined:
    Mar 15, 2015
    Posts:
    68
A few suggestions around the new mesh API:
1. Right now there's a disconnect between the new mesh API, which prefers to give you `ushort`s and wants to convert to int, and the new DOTS physics API, which prefers to consume `int3`. Perhaps a `ushort3` standard would be in order?
2. It would be great to have Resources.Load support.
3. SetVertexBufferParams isn't Burst-compatible.
     
    Last edited: Mar 22, 2020
  50. Minchuilla

    Minchuilla

    Joined:
    Feb 28, 2014
    Posts:
    16
It would be nice if we could set the default 'SetVertexBufferParams' (i.e. the layout of the buffer) for when meshes are imported.