
UI Builder plans

Discussion in 'UI Toolkit' started by hacman, Jan 30, 2020.

  1. hacman

    hacman

    Joined:
    May 1, 2014
    Posts:
    1
    I have played with ui builder and I was wondering what the plan for it was going forward.

    As an artist, HTML and CSS are not an intuitive framework. I have had to learn how to work with them and their quirks, as we use them in our studio. I think the framework has a lot of flexibility, but it can be tricky to get where you want, as there is often more than one way to do the same thing, and editing a small property can destroy your layout (position relative to position absolute).


    It seems to me like the UI Builder is a possible solution to this. As I played with it, though, there seemed to be a lack of interactive ways to control things in the view, with more emphasis on the property editor. This feels clunky and slow at the moment. There are programs like Webflow and Framer that let you build things a bit more easily. (Framer is a bit of a stretch.)

    Ideally, I can imagine the UI Builder being a bit smarter: keeping an element in the same position when switching between position absolute and relative by adding margin or position values, visualizing the box model in the view, being able to convert from margin to position, drawing a rectangle on screen to create a new element like in Photoshop or Sketch, etc.

    There are very simple things an artist might want to do that are tricky in HTML/CSS. For example, as an artist I want to align multiple elements with a click, always be able to move an element, group two elements together, etc.

    And thus my overall question is this: how far is the UI Builder planned to be developed and updated for an artist workflow?
     
  2. benoitd_unity

    benoitd_unity

    Unity Technologies

    Joined:
    Jan 2, 2018
    Posts:
    328
    Hi hacman,

    The plan for the UI Builder is to be a tool that UI artists can use to produce production-ready game or application UI.

    We're very early in the development of the tool, but our focus until 2020.1 is to improve the user experience and make it more user-friendly. Just like you said, it's currently missing a lot of the workflows and commands expected of this kind of tool, and many of them will be addressed when we release the next major update.

    After that, we'll keep adding more and more quality of life and productivity improvements so that building UI in Unity is efficient and enjoyable.

    Cheers,
     
  3. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    What is the ETA on the next major update? -- In particular, the Editor Tooling side of things.

    I am hoping EditorWindow-style tool-building with the UI Builder is being considered as heavily as game-ready UI is in the design. I mean, I'm not saying I wouldn't _enjoy_ animation-capable, property-bound EditorWindows, but I do prefer an easier experience in making nice-looking panels and interactive scene gizmos/tools.
     
    Last edited: Jan 31, 2020
    uDamian likes this.
  4. benoitd_unity

    benoitd_unity

    Unity Technologies

    Joined:
    Jan 2, 2018
    Posts:
    328
    When 2020.1 releases, which is around May if I remember correctly.
     
    awesomedata likes this.
  5. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    Thanks for that.
    Would you mind answering the other part of my question in terms of scope of the UI Builder for EditorWindow features?

    This part:


    For example, is it possible to use DOTS Visual Scripting to build EditorWindow logic in tandem with e.g. the UI Builder, Scene Gizmos, and Shortcuts?
    Ultimately, I really want an easier/smoother way to build my game development tools than having to rely on Houdini for it all. I would think the UI Builder should be able to accommodate that. Is this the case?
     
  6. benoitd_unity

    benoitd_unity

    Unity Technologies

    Joined:
    Jan 2, 2018
    Posts:
    328
    Oops, my bad. Well, at its core, the UI Builder is a UXML/USS authoring tool with little concern for how those assets are used, whether at runtime or in the Editor. We're still investing in optimizing each scenario, though, whether it's via workflows, templates, etc.

    As for the UI framework itself, the goal is to be compatible with other Unity features. So you can imagine creating a piece of UI with a custom material made in ShaderGraph, responding to the new Input System, and triggering logic in Visual Scripting, all working nicely together. We're not there yet, but we're progressing in that direction.
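
    For example, loading a Builder-authored UXML/USS pair into an EditorWindow already works today. A minimal sketch (the window name and asset paths are made up):

    Code (CSharp):
    using UnityEditor;
    using UnityEngine.UIElements;

    // Minimal sketch: an EditorWindow whose UI comes from Builder-authored assets.
    // "MyToolWindow" and the asset paths are hypothetical.
    public class MyToolWindow : EditorWindow
    {
        [MenuItem("Window/My Tool")]
        static void Open() => GetWindow<MyToolWindow>("My Tool");

        void OnEnable()
        {
            // Instantiate the UXML hierarchy under the window's root element.
            var tree = AssetDatabase.LoadAssetAtPath<VisualTreeAsset>("Assets/UI/MyTool.uxml");
            tree.CloneTree(rootVisualElement);

            // Apply the USS stylesheet to the whole window.
            var sheet = AssetDatabase.LoadAssetAtPath<StyleSheet>("Assets/UI/MyTool.uss");
            rootVisualElement.styleSheets.Add(sheet);
        }
    }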
     
    awesomedata likes this.
  7. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    Thanks for that! -- It alleviates a lot of my concerns!

    I know you're probably not the guy to ask about this, but if you don't mind, I'm curious about the Shortcut Manager features and its interaction with UI / Gizmos in the EditorWindow and Scene spaces.

    Since it is somewhat related to UI, and the UI Builder (or its output) would have to know about it at some point, I was wondering if the new Input System is going to be able to work in all those UI-related contexts, in terms of interacting with user-defined Shortcuts, etc. Is this kind of interaction going to be built into the core, and who would be responsible for making sure these scenarios get integrated? I know at some point you guys were talking about a better system for handling workspaces (i.e. like Blender has different UI for modeling, texturing, and sculpting), but I'm wondering whether we're going to have to wait a long time for that to come first, or whether anyone is in the process of making better shortcut/gizmo/input interaction happen in the near term for the scene/window views.
     
  8. ilyaryzhenkov

    ilyaryzhenkov

    Joined:
    Mar 4, 2017
    Posts:
    6
    Just a small note: the inspector in the UI Builder being separate from the main Unity Inspector is very uncomfortable on a small screen (laptop). Is there a reason to have it separate, or is it just work to be done?
     
    awesomedata likes this.
  9. benoitd_unity

    benoitd_unity

    Unity Technologies

    Joined:
    Jan 2, 2018
    Posts:
    328
    I'm actually the guy! What you're saying is interesting.

    So the Shortcut Manager is designed to respond to commands registered in Editor code, while the Input System maps device inputs to player events. It would be great to eventually blur the lines between Editor and Player, essentially doing authoring at runtime, and we'd have to look at how to make those two systems work together.
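
    For reference, an Editor command registered with the Shortcut Manager looks roughly like this today (sketch only; the command id, default binding, and body are made up):

    Code (CSharp):
    using UnityEditor.ShortcutManagement;
    using UnityEngine;

    // Sketch: a rebindable Editor command. Users can remap it in the Shortcuts window.
    // The id "MyTool/Frame Snap Point" and its default binding are hypothetical.
    static class MyToolShortcuts
    {
        [Shortcut("MyTool/Frame Snap Point", KeyCode.V, ShortcutModifiers.Shift)]
        static void FrameSnapPoint()
        {
            Debug.Log("Editor command invoked by the Shortcut Manager");
        }
    }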

    About Workspaces: we're still actively working on this, doing user research, UX explorations, and prototyping. This is a rather deep workflow change for the Editor, so we're being very cautious and working hard to make sure we come up with the right solution.
     
  10. benoitd_unity

    benoitd_unity

    Unity Technologies

    Joined:
    Jan 2, 2018
    Posts:
    328
    This is a temporary situation caused by the nature of UIElements. The current Hierarchy and Inspector panels are designed to work with SerializedObjects, which is not what UIElements uses. That's why we had to rebuild everything for the UI Builder.

    We do have a plan to modify the Inspector so we can use it in other scenarios, and we'll then convert the Builder to support it.
     
  11. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    I'm curious as to which part! -- I don't mind elaborating on my potential use-cases.


    To be fair, Workspaces are a much-needed workflow change. The only limitation I'd be careful of is making some things only accessible from a particular Workspace. They should be completely modular, but easily referenced.

    For example, when making Snapcam, I had a huge issue with window referencing at first, since I wanted to make a multi-window tool that relied on other windows (e.g. the main "Snap" Navigator window was the save/load/miscellaneous-function hub, while all the other "Navigator" windows tended to revolve around actually utilizing the "Snaps" generated and supplied by the hub EditorWindow, for the purpose of "snapping" to a point anywhere in the world/scene/level instantly, to help with navigating large worlds for level-design purposes). Relying on the EditorWindow names was difficult and spotty at best, because the "main" window had a lot of data associated with it (since it made sense to design it this way for easier programming). This, however, proved to be a very difficult decision, and I regret making it, because Unity's UI/EditorWindow/serialization/etc. all fought me on it every step of the way. Additionally, one of my "windows" was not even an EditorWindow at all! -- It was actually an abstract place to store invisible data and camera controls to allow me to manually shift the SceneView camera around (i.e. I had reprogrammed the Editor's SceneView camera controller so I could customize it, since it sucked so badly!).

    This unique setup is one I'm still struggling with today. I want to add node editors and zooming to my windows, as well as invisible data and constantly updating data that relies on SceneView gizmos and user-defined shortcuts, to work with my various "Navigators" (and trying to do it all with IMGUI is driving me nuts). I have a background image on one of my EditorWindows that must zoom in/out, but I have controls that should stay at a single zoom. This is the node-graph functionality I was referring to: the graph has nodes that aren't actually nodes, plus a special toolbar at the top, a background that sometimes should zoom, and an overlay of controls/tools that shouldn't zoom when the user rolls the mouse wheel over particular places in the window.

    We also need more gizmos that work with curves in the SceneView. I can imagine a tool where I drag/draw out a curve and it generates a wall facade with tiled windows, etc. -- basically something I'd normally use Houdini for, but would prefer to do natively in Unity with something like DOTS.

    This is getting pretty long, but I think you get the idea of just how much I would like to do with the UI, but I am so totally and thoroughly limited by the design and API as it currently stands.

    If you need any input, please let me know.

    I am a one man tool-design army.
     
  12. uDamian

    uDamian

    Unity Technologies

    Joined:
    Dec 11, 2017
    Posts:
    1,203
    Just wanted to note that, before you try to move all the work you've done from IMGUI to UIElements, you can start using UIElements for some of these hard problems within IMGUI. Zoom is the lowest-hanging fruit; you basically get it for free. Just put what you want to transform inside an IMGUIContainer and put everything else on top. Then all you need to do is change the transform of the IMGUIContainer.
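
    In code, the whole trick is roughly this (untested sketch; the window name and the IMGUI drawing code are placeholders):

    Code (CSharp):
    using UnityEditor;
    using UnityEngine;
    using UnityEngine.UIElements;

    // Untested sketch of the IMGUIContainer zoom trick. "ZoomedGraphWindow"
    // and the drawing code inside the container are placeholders.
    public class ZoomedGraphWindow : EditorWindow
    {
        float zoom = 1f;

        [MenuItem("Window/Zoomed Graph")]
        static void Open() => GetWindow<ZoomedGraphWindow>();

        void OnEnable()
        {
            // Existing IMGUI code goes inside the container and gets transformed for free.
            var zoomed = new IMGUIContainer(() => GUILayout.Label("Node graph here"));
            rootVisualElement.Add(zoomed);

            // Elements added afterwards draw on top and keep a fixed size (toolbars, overlays).
            rootVisualElement.Add(new Label("Fixed overlay"));

            // On mouse wheel, scale only the IMGUIContainer.
            rootVisualElement.RegisterCallback<WheelEvent>(evt =>
            {
                zoom = Mathf.Clamp(zoom - evt.delta.y * 0.05f, 0.25f, 4f);
                zoomed.transform.scale = new Vector3(zoom, zoom, 1f);
            });
        }
    }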

    As for the rest of your use cases, you are definitely pushing Unity's customization support to its limits! As we work on Workspaces, it would be use cases like yours that we'd want to consider. The UI Builder was not much different in concept, but we chose to go with a monolithic window instead of trying to make multi-window work. We understand the difficulties involved, though.

    One thing I was curious to understand better from your example is the idea that windows "should be completely modular, but easily referenced":
    This happens to be one of the primary requirements for the UI Builder to go multi-window, and it's exactly what made multi-window hard for us. Most of the panes in the Builder don't make sense on their own, especially since the selection context is not Objects, but rather UI elements.
     
  13. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    I try. I try. :)

    This is actually true about my intent with Unity as a whole; I just don't think @willgoldstone or @Joachim_Ante (or any of the other Unity powers-that-be!) have been able to fully grasp my true ambitions to push the Unity Editor as far as it can ever possibly go in this regard. And we won't even mention my ambitions with the user-experience... Mwahahahaaaaa!!




    To clarify a bit -- I was mainly referring to the data itself (not the windows). For example, the multi-window approach you're speaking of for the UI Builder is one I faced head-on when trying to design Snapcam NAVIGATION STUDIO. It deals with a lot of non-visible data that I would like to be able to access from other Workspaces (these are more like "sub" workspaces I call "Navigators", which are designed to help you navigate your game and its data). For the most part, the "cam" part of Snapcam relies on hidden data. I originally made this hidden data EditorWindow-specific and/or stored it in a struct (plus some files) for easy/global access, so that I could easily access any of it on a per-scene basis, but I later found this to be limiting, since I had planned to use NAVIGATION STUDIO on an open-world game design. Sadly, a full-on redesign was required (since the original Snapcam was mostly a prototype), but neither the UI nor the Unity Editor itself was up to the task of an open world or in-depth UI and UX cases like mine, so I'm still in a holding pattern, waiting for the Unity Editor (as well as the overall design of Unity) to catch up to me.

    I talked to @willgoldstone about some of these issues sporadically, scattered bit-by-bit across various forum (and blog and YouTube) posts, but I never got into the guts of everything I'm trying to accomplish with the Editor.



    Some further information about Snapcam:

    In order to simply persist my custom UI texture data (generated via a scene-camera screenshot into a render texture) between scene loads, I had to write my own multi-threaded texture file reader, then build this data into my custom file exporter/importer, and then make that work with all the other types of data already built into Unity, since I still needed Vector3 support (and who knows what else). Needless to say, even the basic prototype of my original Snapcam asset's UI data was a hassle, thanks to Unity not having any of these utilities (or API) at the ready to handle cases like this. And that was just for the pretty thumbnails.
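
    For the curious, the unthreaded core of that thumbnail path boiled down to something like this (from memory, so treat it as a sketch; the class name and path handling are made up):

    Code (CSharp):
    using System.IO;
    using UnityEngine;

    // Sketch (from memory) of the single-threaded core: dump a scene-camera
    // RenderTexture to a PNG so the thumbnail survives scene loads.
    static class ThumbnailPersistence
    {
        public static void Save(RenderTexture rt, string path)
        {
            var previous = RenderTexture.active;
            RenderTexture.active = rt;

            // ReadPixels copies from the currently active RenderTexture.
            var tex = new Texture2D(rt.width, rt.height, TextureFormat.RGBA32, false);
            tex.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
            tex.Apply();

            RenderTexture.active = previous;
            File.WriteAllBytes(path, tex.EncodeToPNG());
            Object.DestroyImmediate(tex);
        }
    }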

    When I was ready to get into the shortcut functionality to handle the SceneView camera controller, I was ready to tear my hair out. I didn't realize just how separate and hacked-together even official Unity tool and editor shortcuts/events were (take the Terrain Editor tools, or even the official default SceneView camera controller in the Editor itself!). Sure, these are "functional" in their (very specific) use cases -- but, with the current IMGUI, the more general-use a case is, the more likely that case requires special knowledge of how to manipulate the API in ways that are (still) not widely known (or even available!) to regular users to this day. This is sometimes because the methods in question are, for example, still locked behind the "internal" keyword, and as such cannot be overridden or changed or even used. A case like my View Navigator, for example, requires access to all the fancy tricks in the SceneView camera controller's methods/classes -- classes whose behavior I (whether by chance or design) cannot override or adjust in any way.

    On top of this, instant-action key shortcuts with standard letter keys (like WASD) are not possible via vanilla "userland" Unity API. They rely on the repeat rate specified by the OS, which kills any chance of making my own realtime tools -- a camera controller, modeling tools, workspaces -- controlled by standard letter shortcut keys (WASD or RGB or whatever). This kind of thing seriously needs to be considered. With the advent of more VR-centric creative tools, Unity will need this kind of realtime key event handling going forward anyway.
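
    The workaround I landed on for held keys, minus all the camera math, was roughly this (untested sketch):

    Code (CSharp):
    using UnityEditor;
    using UnityEngine;

    // Untested sketch: track KeyDown/KeyUp myself so a held key drives continuous
    // motion via EditorApplication.update instead of the OS key-repeat rate.
    [InitializeOnLoad]
    static class HeldKeyTracker
    {
        static bool wHeld;

        static HeldKeyTracker()
        {
            SceneView.duringSceneGui += OnSceneGUI;
            EditorApplication.update += OnUpdate;
        }

        static void OnSceneGUI(SceneView view)
        {
            var e = Event.current;
            if (e.keyCode != KeyCode.W)
                return;
            if (e.type == EventType.KeyDown) wHeld = true;
            else if (e.type == EventType.KeyUp) wHeld = false;
        }

        static void OnUpdate()
        {
            if (wHeld)
            {
                // Move the custom camera a little every editor tick (omitted here).
            }
        }
    }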

    Beyond this, I had a pop-up window (the Grid Navigator) that I also wanted to be dockable (if the user preferred to dock it). This would allow me to make a less compact version of the Snap Navigator and migrate everything from the main Snap Navigator window to this new Grid Navigator window (since the two sort of did the same thing -- but in a very different way as far as UX is concerned), but it was not to be. I found out that this "semi-dockable" window caused a messy error in Unity. It was eventually considered "by design" that the pop-up could be opened/docked, but that it was not possible to check whether it was already open/focused or docked (so I could close it). As such, it wouldn't be fixed so I could complete my tool's design. Even though I had already planned this stuff, I have had users explicitly ask me for all the functionality I've mentioned above, independent of my original designs (please see my thread if you don't believe me).

    Beyond even this, I am planning to make Snapcam / NAVIGATION STUDIO support open-world workflows. However, don't even get me started on this one. No Unity tool yet supports open worlds or 64-bit / floating-origin situations. Because of this, I have a feeling Unity hasn't figured out its plans for handling open-world workflows in the editor yet. I have a unique game I'm working on that requires a technically sound open world present in the Unity Editor itself, which might be right up the alley of whoever is working on that. Again -- just doing my part to push the engine in ways it has never thought to go. (*erhm* Who said I wanted to BREAK Unity?? *cough* Nobody in their right mind would ever want to do that, now would they? :)

    Like the UI Builder, I need my data in a central location to be flexible (i.e. supporting all kinds of data, like animation curves, textures, and models, not just ints and floats), quickly and easily modified (in memory, not on disk), and simple to persist (i.e. quickly write those custom textures/models to a [compressed] file glob that can be easily read back).

    All in all, my toolset has been a nightmare to create thanks to the Editor's current and past states. But I'm not bitter about it -- just as long as something is going to be done about these issues.

    For example:

    To handle the standard letter keys, I seriously suggest a shortcut "keymap" at the per-workspace level. That is, terrain tools can have their own workspace (like Blender has for modeling, UVs, or texture painting). This means that, for whatever purpose you use Unity, you have a particular workspace for that general functionality -- modeling tools have their own space (i.e. Archimatix or ProBuilder/Polybrush modeling), scene/gameobject manipulation has its own space (i.e. a SceneObject workspace), and anyone designing tools for these workspaces targets them (i.e. NAVIGATION STUDIO's "View Navigator" would work in the workspace labeled "SceneObject", and all of its shortcuts would be used there).

    If Snapcam registers shortcuts for functionality that already exists, the user is able to quickly exchange/toggle Snapcam's shortcuts to override the standard SceneObject workspace's conflicting shortcuts, and then toggle back to the standard ones (perhaps by activating a View Navigator tool via a temporary shortcut -- i.e. tap the V key and hold it for less than 2 seconds to switch to the View Navigator tool; otherwise, use the default vertex-snapping feature in the standard object-placement mode). Shortcuts should be as flexible as in-game controls, with proper options for repetition, holding, releasing, tapping, combinations, sequences, timing, etc. -- on any device -- to allow flexible event execution! Right now this is very difficult, and very narrow-minded when it comes to any future VR and AR tool possibilities! -- Trust me, I've got some in mind!
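
    For what it's worth, the toggle half of this can already be approximated by swapping Shortcut Manager profiles (untested sketch; the profile name is made up):

    Code (CSharp):
    using System.Linq;
    using UnityEditor;
    using UnityEditor.ShortcutManagement;

    // Untested sketch: approximate a per-workspace keymap by switching shortcut
    // profiles. "SceneObject" is a hypothetical profile name.
    static class WorkspaceKeymaps
    {
        [MenuItem("Tools/Keymaps/SceneObject Workspace")]
        static void ActivateSceneObject() => Activate("SceneObject");

        static void Activate(string profileId)
        {
            var manager = ShortcutManager.instance;
            if (!manager.GetAvailableProfileIds().Contains(profileId))
                manager.CreateProfile(profileId);
            manager.activeProfileId = profileId;
        }
    }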

    Just my two-cents.

    Let me know if you have any questions or if you or anyone wants to talk to me about this stuff.
    I have a feeling I can help in ways you guys never thought possible. :)


    And thanks for that! -- I had no idea! Very cool to know! :)