Probuilder Runtime Help

Discussion in 'World Building' started by Rickmc3280, Aug 27, 2020.

  1. Rickmc3280

    Rickmc3280

    Joined:
    Jun 28, 2014
    Posts:
    189
    Hello,

    I have been messing with Probuilder scripts for a couple of weeks trying to figure out how to get similar functionality to Probuilder Editor in VR at runtime.

    For instance, if you click the Object Selection button, it enables object selection in the scene. If I have a 3D gizmo/handle attached to an object for movement/scaling/rotation, I would hope that the object's gizmos would appear when I press a button that simulates clicking Object Selection, and that I could then manipulate the object. Such manipulations would be similar to the DragMovements functionality. (Drag is more difficult in VR because of cameras and rays, so I'd want something like a Drag_ColliderMovement behavior: if the handle collides with the controller and the controller button is held, the object follows the collider's transform, with updates visualized frame by frame instead of the system waiting for the button release.)

    The same goes for Vertex Selection, Edge Selection, and Face Selection.

    What would be the process using these scripts?

    Not to be offensive, but upfront: I've read several posts with people asking similar questions, and the Unity moderators' responses are ultimately unhelpful, going no further than "try the example scenes in the Runtime section." I have, and they lack major content. Given this lack of information, I wonder if Unity is purposefully withholding information due to agreements with major CAD companies? Not entirely sure, but I don't want to rewrite the whole system if the API is there, and if it's just a day, two, or even a week of learning/programming to get it set up. If the content is there, why not support one of your greatest potential assets? (Unity did buy it for a reason, correct?)

    Can Unity staff create a VR ProBuilder example package (a non-VR one would be great too) that:
    (Also, please note that I am aware of the VR edit-time plugins as well; they were not particularly helpful either.)
    1) Allows the user to select a tool: Hand Tool, Move Tool, Rotate Tool, Scale Tool. (The Rect Tool is probably useless for VR, so many of its associated functions are too, unless they can be redesigned around XR rays and XR gestures/movements.)

    2) Probuilder Primary Functions - Object Selection, Vertex Selection, Edge Selection, Face Selection.

    3) Extended scripts, such as a PB Object Manipulator, that store the selected object and give it all of the major components needed to perform operations (find verts, add handles for manipulation, add the primary gizmos for movement and transforms, etc.).

    I've created menus that let me select such options, but they only work with the Unity side of things and not with ProBuilder (I can't figure out how to emulate those options in PB either).

    I know most of these functions are there and available, and I am happy to work with the current system (although I would love to see some simplification and improvements), but I can't even figure out where to begin, even using the examples. FYI, I have handwritten the scripts while working with them to try to further understand their functionality/usage, and it still doesn't help (plus it seems the examples only cover about 2% of PB functionality).
     
  2. kaarrrllll

    kaarrrllll

    Unity Technologies

    Joined:
    Aug 24, 2017
    Posts:
    552
    Hey, I'll try to address everything here but please let me know if I miss something.

    Are you referring to the EditorXR project here? If so, I'm not sure what their state of support for editor plugins is. I would imagine that it would require us to write VR compatible handles and versions of the editor tools.

    ProBuilder as a project is split into two distinct parts: runtime and editor (corresponding to the assemblies "Unity.ProBuilder" and "Unity.ProBuilder.Editor"). The public part of the API is for the most part limited to the runtime.

    The runtime API doesn't include much in the way of tool authoring. It is focused on mesh creation and editing.
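    As a rough illustration (a minimal sketch, not official sample code), this is what runtime-only mesh creation and editing can look like using just the Unity.ProBuilder assembly; the `positions` round-trip shown here assumes the ProBuilder 4.x runtime API:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.ProBuilder;

// Sketch: create a shape and edit its vertex positions at runtime, with no
// UnityEditor dependency. Works in a player build, not just the Editor.
public class RuntimeMeshEdit : MonoBehaviour
{
    void Start()
    {
        // ShapeGenerator lives in the runtime assembly.
        ProBuilderMesh shape = ShapeGenerator.GenerateCube(PivotLocation.Center, Vector3.one);

        // positions is the authoritative vertex data: copy, modify, write back.
        var positions = new List<Vector3>(shape.positions);
        for (int i = 0; i < positions.Count; i++)
            positions[i] += Vector3.up * 0.1f;
        shape.positions = positions;

        // Rebuild the UnityEngine.Mesh from the ProBuilder representation.
        shape.ToMesh();
        shape.Refresh();
    }
}
```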

    The editor assembly is what contains the tooling. These are things like handles, tools, and wrappers around mesh operations called menu actions. There are some public endpoints here, but for the most part we are supporting the runtime mesh editing API.

    The reason that these are separate is because nearly all of the "tools" in Unity are specific to Editor code, and hence unavailable at runtime. I made efforts to keep as much code as I could in the runtime, but in most cases relating to handles and scene view interaction it was simply not possible due to the use of `UnityEditor` API.

    Are you familiar with Editor XR? https://unity.com/editorxr

    We did a prototype of ProBuilder in Editor XR a few years ago. At this point it's going to require some updating, but it is at the least a starting point.

    https://github.com/Unity-Technologies/probuilder-vr

    Unfortunately there are no plans at the moment to start work again on ProBuilder in VR. We're a small team, and our focus for now is almost entirely stabilization and bug fixes.

    Apropos to my comments above, the functions that drive these features are for the most part in the runtime API, but the UI is not.

    The Runtime Editing example shows this in the Utility script - https://github.com/Unity-Technologi...s~/Runtime/Runtime Editing/Scripts/Utility.cs

    There are functions to pick vertices, faces, and edges. Now, once you have that selection it is up to your game or software to decide how to store and render that information. Seeing as the UI is something that varies greatly from game to game, there is no standard implementation here. You can of course always reference the editor drawing code for inspiration https://github.com/Unity-Technologi...ster/Editor/EditorCore/EditorHandleDrawing.cs

    The runtime editing sample linked above demonstrates selection and manipulation, as well as a simple example of element rendering. You can round it out by referencing the `EditorHandleDrawing` to see how we are drawing elements in the Editor.
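    A hedged sketch of what runtime face picking might look like, modeled loosely on the sample's approach; the rect size and helper names (`pickRadius`, `PickFaceAtScreenPoint`) are illustrative, not part of the API:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.ProBuilder;

// Sketch: pick a face under a screen point using the runtime SelectionPicker,
// then store it on the mesh. Rendering the selection is up to your project.
public static class RuntimePicking
{
    public static void PickFaceAtScreenPoint(Camera cam, Vector2 screenPoint, ProBuilderMesh mesh)
    {
        const float pickRadius = 4f; // hypothetical pixel tolerance
        var rect = new Rect(screenPoint.x - pickRadius, screenPoint.y - pickRadius,
                            pickRadius * 2f, pickRadius * 2f);

        Dictionary<ProBuilderMesh, HashSet<Face>> picked =
            SelectionPicker.PickFacesInRect(cam, rect,
                new List<ProBuilderMesh> { mesh },
                new PickerOptions { depthTest = true });

        // ProBuilderMesh stores element selection itself (selectedFaceCount etc.).
        if (picked.TryGetValue(mesh, out HashSet<Face> faces))
            mesh.SetSelectedFaces(faces);
    }
}
```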

    Let me know if there's anything I can clarify, or if there's something in the Samples that is not clear. Although unfortunately we don't have the bandwidth to implement a full featured mesh editing sample scene, I would definitely be happy to make some improvements to the existing sample code.
     
  3. Rickmc3280

    Rickmc3280

    Joined:
    Jun 28, 2014
    Posts:
    189
    I downloaded and tried EditorXR, but it crashed a lot and froze every 10 seconds or so. I had mentioned it solely to show that I had looked into it as well while trying to figure this out (based on forum discussions, it seemed like it was designed around runtime).

    Some of my questions are answered, in that the runtime seems to lack manager-type systems? Mesh operations are all individual and not tied to a systematic approach, e.g. Design, Parameters, Create/Build, Edit/Modify, Export/Save?

    In this case, more questions arise, considering edge, face, and handle tools exist. (It led me to believe that there was some management behind the scenes: edit mode activated, wait for object selection, populate based on tool selection.)

    With that said, how can I use the generator, in the case of cubes, to draw the verts so that on the fourth vert the bottom face is created, and then activate extrude? I understand the need to implement functions in the scripts, but what is available in terms of "wait for next click, add GameObject, wait for next click, add..." and so on, so that the part is pieced together gradually instead of all at once? Or do I have to rewrite the code?
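    Something like the following is the flow I have in mind (a rough sketch; `PointByPointBuilder` and the accumulation logic are my own invention, only `ProBuilderMesh.Create` and the `Extrude` extension come from the API, assuming I am reading it correctly):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.ProBuilder;
using UnityEngine.ProBuilder.MeshOperations;

// Sketch of "place four points, then extrude": accumulate clicked positions,
// build a single quad once four exist, then extrude it along its normal.
// There is no built-in step-by-step builder; the accumulation is user code.
public class PointByPointBuilder : MonoBehaviour
{
    readonly List<Vector3> m_Points = new List<Vector3>();

    public void AddPoint(Vector3 worldPoint)
    {
        m_Points.Add(worldPoint);
        if (m_Points.Count < 4)
            return;

        // Winding order determines the face normal; adjust to your point layout.
        var face = new Face(new[] { 0, 1, 2, 1, 3, 2 });
        ProBuilderMesh mesh = ProBuilderMesh.Create(m_Points, new List<Face> { face });

        // Extrude the base face to give the shape height.
        mesh.Extrude(new[] { mesh.faces[0] }, ExtrudeMethod.FaceNormal, 1f);
        mesh.ToMesh();
        mesh.Refresh();
        m_Points.Clear();
    }
}
```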

    Also, the dragging functions all operate on raycasting with the camera and mouse. How can I do the same with a controller collider instead? Does any available script do this?
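    To clarify what I mean, roughly (the `ColliderDrag` class and its input fields are hypothetical; only `TranslateVertices` and `GetSelectedFaces` are ProBuilder API, and I am assuming the offset is taken in local space):

```csharp
using UnityEngine;
using UnityEngine.ProBuilder;
using UnityEngine.ProBuilder.MeshOperations;

// Sketch: drag the selected faces with a tracked controller instead of a
// mouse ray. The mesh API just takes offsets; it doesn't assume a camera.
public class ColliderDrag : MonoBehaviour
{
    public Transform controller;   // XR controller transform (placeholder)
    public ProBuilderMesh target;  // mesh whose selected faces we drag
    public bool triggerHeld;       // set this from your input system

    Vector3 m_LastPosition;

    void Update()
    {
        if (triggerHeld && target != null)
        {
            // Apply the controller's frame-to-frame delta to the selection,
            // converted into the mesh's local space, so it updates every frame
            // rather than waiting for button release.
            Vector3 delta = controller.position - m_LastPosition;
            Vector3 localDelta = target.transform.InverseTransformDirection(delta);
            target.TranslateVertices(target.GetSelectedFaces(), localDelta);
            target.ToMesh();
            target.Refresh();
        }
        m_LastPosition = controller.position;
    }
}
```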

    Also, for most of the features, which scripts should be in the scene or on the GameObject? If I create my own object/edge/vert/face tool-activation event system, how does detection work? Another personal implementation? I ask because I have several scripts that already do the same thing in terms of procedural shapes at runtime. Does PB runtime offer much else?



    Regarding optimization, many of the scripts are repetitive, which suggests inefficiency. It seems like structs and interfaces could reduce the number of files by at least 50% and consolidate most of the utilities. The tools could be made more generic and operate on more classes (maybe more polymorphism). Also, systemizing the processes could make it vastly more understandable (e.g. creation, modification, subsystem modification, push/update, render).

    I think my major hurdle moving forward will be editing objects with colliders instead of rays. What are my limitations with the provided scripts?

    Lastly, where in the scripts can I modify the insertion vector (transform position)? It always puts them at (0, 0, 0) for me.
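    (For anyone else hitting this: setting the transform after generation seems to be the expected pattern, roughly like this sketch; the helper name is mine.)

```csharp
using UnityEngine;
using UnityEngine.ProBuilder;

// Sketch: the shape generators create the GameObject at the origin, so
// position it yourself after creation rather than looking for a parameter.
public static class SpawnAtPoint
{
    public static ProBuilderMesh SpawnCube(Vector3 worldPosition)
    {
        ProBuilderMesh cube = ShapeGenerator.GenerateCube(PivotLocation.Center, Vector3.one);
        cube.transform.position = worldPosition; // instead of the default (0,0,0)
        return cube;
    }
}
```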
     
  4. Rickmc3280

    Rickmc3280

    Joined:
    Jun 28, 2014
    Posts:
    189
    Also I should note...

    I purchased this asset for the runtime functionality, and the author reverse engineered ALL (well, maybe I should say most) of ProBuilder into a runtime asset. However, due to his renaming scheme and a serious lack of comments/summaries, it is even harder to follow.

    Maybe y'all could buy a few seats from him and have a go at it for a redesign xD

    It is called "Builder" and is a tool within this toolset - https://assetstore.unity.com/packages/tools/modeling/runtime-editor-64806

     
  5. kaarrrllll

    kaarrrllll

    Unity Technologies

    Joined:
    Aug 24, 2017
    Posts:
    552
    Yes, exactly.

    There is, but for the most part it is tied to the Editor namespace. In most cases you should be able to extract it without too much trouble, however we have not tested this and therefore it is not made available in the public or runtime API.

    This "glue" code is really dependent on your game or project setup. You can copy what we have done in the Editor, but there are still going to be some gaps in the run loop that you will need to author to fit your needs.

    Yes, you can reference the ProBuilder VR project I linked earlier. The demo uses VR controller input to select and move faces.

    It sounds like you're asking about Selection, which ProBuilder runtime actually does have built-in. See the documentation for ProBuilderMesh. https://docs.unity3d.com/Packages/c...ngine_ProBuilder_ProBuilderMesh_selectedEdges

    To select objects, use the SelectionPicker class https://docs.unity3d.com/Packages/c...i/UnityEngine.ProBuilder.SelectionPicker.html

    I'm not sure what you're referring to here?

    There are no limitations, as the mesh modification API does not make any assumptions about how the input is handled. We have an Editor implementation that works with a mouse and cursor, as well as some API like the SelectionPicker to simplify the creation of alternative implementations.
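    For whole-object selection at runtime, a plain physics raycast is often enough as a starting point. This is a sketch of that alternative approach (not the SelectionPicker); it assumes your ProBuilder objects have colliders, and the ray can come from a mouse or an XR controller:

```csharp
using UnityEngine;
using UnityEngine.ProBuilder;

// Sketch: object-level picking via Physics.Raycast. The API doesn't care
// where the ray originates, so Camera.ScreenPointToRay and an XR controller's
// forward ray both work the same way here.
public static class ObjectPicker
{
    public static ProBuilderMesh Pick(Ray ray)
    {
        // Requires a collider (e.g. MeshCollider) on the target GameObject.
        if (Physics.Raycast(ray, out RaycastHit hit))
            return hit.collider.GetComponent<ProBuilderMesh>();
        return null;
    }
}
```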
     
  6. Rickmc3280

    Rickmc3280

    Joined:
    Jun 28, 2014
    Posts:
    189
    Quoting kaarrrllll's reply above:
    1. "Yes, you can reference the ProBuilder VR project I linked earlier. The demo uses VR controller input to select and move faces."
    2. "It sounds like you're asking about Selection, which ProBuilder runtime actually does have built-in. See the documentation for ProBuilderMesh. https://docs.unity3d.com/Packages/c...ngine_ProBuilder_ProBuilderMesh_selectedEdges"
    3. "To select objects, use the SelectionPicker class https://docs.unity3d.com/Packages/c...i/UnityEngine.ProBuilder.SelectionPicker.html"
    4. "I'm not sure what you're referring to here?"
    5. "There are no limitations, as the mesh modification API does not make any assumptions about how the input is handled. We have an Editor implementation that works with a mouse and cursor, as well as some API like the SelectionPicker to simplify the creation of alternative implementations."

    First of all... sorry for the long response...

    1) Thanks, I will look more into this and the editor stuff. I looked through them but was hoping it would be a bit more straightforward.

    2) Yes I am, thank you for explaining. More on #4

    3) --

    4) This is specifically what I am referring to from #2: selected edges / SelectionPicker. There are several scripts related to edges, faces, etc. It is not immediately clear which type of process to use for what, so I have to figure out which script to use, and even then, in my case, it's not in that script; it is somewhere else, like the handles, and related to another type of process. It would be more practical with an implementation such as:

    Selection_System

    1. SS_Faces
    2. SS_Edges
    3. SS_Vertices
    4. SS_Handles

    etc etc etc. OR


    PB_System
    1. Editing (Mesh based)
      1. Selection
        1. Faces
        2. Edges
        3. Verts
    2. Creation
      1. Primitive Shapes
        1. (Generator) Templates
      2. Generator - (the function receives a template) - basically just like the ShapeGenerator
      3. Advanced Shapes
        1. PolyLine
        2. Custom 3D Shapes
      4. Advanced Shape Generator (users such as myself have to create their own custom templates/rules, but it is entirely separate from the generic class).
    • Implementations
      1. Materials (Rendering/Changing Mats/ and all functions related)
      2. UVs similar to above probably
      3. Place @ Location (Local/World/Coordinates)
      4. Unity based stuff
    The way ProBuilder is written is like someone took a bunch of great ideas, was holding them all in their arms, and then tripped and spilled them everywhere (the creation of it got a little out of control). (Ex: Handle Constraint 2D, Handle Orientation, Handle Utility.)
    These types of things should fall into other categories and be better organized, e.g.:
    Handle.cs, Handle_Processes.cs,
    Selection.cs, Select_Faces.cs, Select_Edges.cs, Select_Vertices.cs


    There's ProBuilderMesh/ProBuilderMeshFunction/ProBuilderMeshSelection, then MeshHandle/MeshHandles/MeshUtility (how is MeshUtility different from the mesh functions?), plus the differences between the tools, and then MeshOperations? It's way too spread out. Better organization would allow quicker, easier implementation, be easier for users to understand, and make it easier to maintain later.

    Or maybe change the folder structure?

    Core:

    ProBuilder_CoreProcesses
    1. ProBuilder_Tools
    2. ProBuilder_Processes
    Object_PBTools


      • Object_Tools
      • Object_Processes
    Vertices_PBTools
    1. Vertices_Tools
    2. Vertices_Processes
    Face_PBTools


      • Face_Tools
      • Face_Processes
    Edge_PBTools


      • Edge_Tools
      • Edge_Processes
    CreationSystem_PBTools (shape templates, generator)


      • Creation_Tools
      • Creation_Processes
    GeneralFunctionality (rendering/interfacing etc)


      • General_Tools
      • General_Processes

    The core concepts behind the App seem to be

    Create Objects,
    Edit Objects - Modify/Update, Move/Rotate/Scale, ApplyBooleanFeatures
    Implement PB Objects into Unity-

    You could easily combine scripts and add more /// summary comments or distinct separations.

    Better yet, structure the files so that they flow and make sense. Considering that the major components are faces/edges/verts, which are subsets of objects/meshes, they could all share a similar design pattern for the file structure.


    For Example:
    Verts are constructed from floats and have handles (in edit mode) as physical representations,


    Edges have verts and are represented by lines between a set of verts,

    Faces are a set of verts AND Edges (well you know) with sets of lines to represent a mesh (not always, but generally for most use cases).

    Objects are composed of all of the above.
    Think pyramid: verts are the base, objects are the tip. Also think of the pyramid as verts and edges below the surface, while faces and objects are above the surface. Top-down/bottom-up processes. Designing it this way would more than likely make it easier on the processor to create in the Editor and at runtime.

    (My point here is that verts are verts; they cannot be edges, but edges can be made of verts, and faces can be made from them, etc. So when designing the file system, build and design it around verts, then edges, then faces, and finally objects. You could combine a lot of the scripts and have the functions build up from other scripts instead of being multiple individual pieces.)


    My apologies for a really long and lengthy reply. I do appreciate you helping me understand how PB works so that I can use it. My point, though, is that it shouldn't be so hard to understand, and if it weren't:
    1) There would be much better runtime asset tools as people create their own assets that work with PB.
    2) A whole other genre of developers would use Unity (there already are some, but it could take off much faster).


    Plus, there seems to be a lot of extra overhead. You could most likely have just one core system and have the Editor use it in one capacity and the runtime in another (adding a few limited scripts for Editor functionality). Designing a system around its core concepts also makes development more intuitive and easier to pick up. (Well-written scripts don't need explanations and require fewer tutorials for needy people such as myself.)

    - Also note that the documentation is limited for runtime functionality, and I have yet to find a single runtime implementation (video or decent written documentation). Video tutorials are how I learn best, and how others do too; the human mind typically encodes memories better when physical, visual, and auditory sensory input are combined, which is why I loathe written-only documentation. I admit the samples helped a little, but they didn't work for my purposes, or maybe I just couldn't wrap my mind around them fast enough to make it stick.

    The only decent thing out there is the asset I mentioned above, and it is a COMPLETE redesign of how PB works, so not exactly relevant. I would prefer to make his version work, but I am still running into issues because of the VR/collider problem (no use of rays). And even that version hits the frame rate pretty hard.

    Most of the features are specific to editing verts/edges/faces rather than PB objects.

    https://imgur.com/a/MouubyG
     
    Last edited: Sep 2, 2020
  7. kaarrrllll

    kaarrrllll

    Unity Technologies

    Joined:
    Aug 24, 2017
    Posts:
    552
    Thanks for your thoughtful reply.

    As this is a project that has grown and evolved over several years, it's not too surprising that there is some sprawl. Yes, there are certainly many parts of the codebase that could be better organized. The spattering of handle classes in the runtime is a good example of that. This is part of the reason why so much of the code is not in the public API: we want to make sure that the foundation will not significantly change after it is already in the hands of developers.

    However, the general approach of a Core, Mesh Operations, and Editor Integration namespace has been consistent and is unlikely to change.

    With regards to organizing by mesh element, many of the mesh operations are written in such a way as to be agnostic of high-level selection. They accept lists of indices or edges, regardless of whether the user has faces, edges, or vertices selected. Additionally, there is some complexity in that these systems need to be not only flexible, but also performant. There are places in the codebase where code is repeated with different parameters or implementations because the current state may allow or disallow certain shortcuts. I'm not claiming that there is no room for improvement (there certainly is, and we regularly refactor when addressing bugs or implementing new features), but rather pointing out that often there were valid reasons for these design decisions.

    In terms of directory organization, I tend to avoid many directories with few files each, because personally I find that more cumbersome. However, I do see the opposing point of view :)

    It seems like we could really do a better job of providing an entry point for developers new to the project. In the [readme](https://github.com/Unity-Technologies/com.unity.probuilder/blob/master/README.md) we have just a cursory snapshot of the architecture. Expanding on this guide, as well as increasing the scope of some of our runtime sample projects, would go a long way towards improving this situation.
     
  8. Rickmc3280

    Rickmc3280

    Joined:
    Jun 28, 2014
    Posts:
    189
    At the very least, this would be great. I do understand the complexity of it all (for y'all having to rework stuff).

    I'm just really big on the concept of "simplicity perpetuation": the simpler it is, the easier it is for the idea to come to fruition. I am working on a project that would strongly benefit from ProBuilder, but I am finding myself having to create an entirely new version of it, and I think a proper implementation would open up the possibilities for a new variety of tools and a new revenue stream for Unity. Although... I don't want a crazy amount of competition when I'm done, so maybe it's best that it stays this way xD
     
  9. jh4483

    jh4483

    Joined:
    Oct 25, 2021
    Posts:
    20
    Hi!

    I'm trying to use ProBuilder for VR. Since you are discussing this, do either of you know why my shape generator script is not generating PB meshes in VR? I'm able to see the meshes in the Game view, but not on my Quest. I've tried disabling script stripping too. Any suggestions would be really helpful!
     
  10. Rickmc3280

    Rickmc3280

    Joined:
    Jun 28, 2014
    Posts:
    189
    Are you using URP? If so, I've had issues with other, unrelated Unity functionality there. The workaround was going to one of the URP assets and enabling certain features.
     
  11. jh4483

    jh4483

    Joined:
    Oct 25, 2021
    Posts:
    20
    Thank you so much!! Let me try that!!