IShortcutToolContext -- Can I use it, pretty please??

Discussion in 'UI Toolkit' started by awesomedata, Jul 28, 2019.

  1. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    As explained, I really want to use IShortcutToolContext in my shortcuts, for continuous, interactive scene control via my own (user-reassignable) shortcuts.

    This interface is used in the flythrough mode in the scene camera controller. I want to use it in other tools.


    In 2019.1, it's currently not possible to add any truly interactive scene controls (using the new ShortcutManagement system) without this -- all because this interface was deemed "internal" only for some reason.

    There are SO MANY great tools that can be made by users like me with access to this interface (such as completely custom scene camera controllers!) -- And it would bring money to Unity on the Asset Store too, so it's win-win.
    So why am I not allowed to implement my own camera orbit behavior -- or even my own gradual color tweak tool for modifying a gameobject's color while holding a key for R, G, or B?
    These are simple tool ideas, but much better stuff could be made if people had access to this when designing their games!!

    Lots of cool tools will never be possible with the new shortcut management system policing our access to things like this. So please open this context up (and add more!) so you can democratize this type of stuff too, @Unity!
     
    Ziflin likes this.
  2. leo-carneiro

    leo-carneiro

    Unity Technologies

    Joined:
    Sep 1, 2011
    Posts:
    49
    Hi,

    Glad to hear that you are looking into using the new Shortcut Manager APIs.

    We were expecting that with the APIs we have exposed so far, it would indeed be possible to build interactive scene controls by using the SceneView as the ShortcutContext.
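
    For reference, a minimal sketch of what a SceneView-scoped shortcut looks like with the public API (the identifier, key, and modifier below are just placeholders):

    Code (CSharp):
    using UnityEditor;
    using UnityEditor.ShortcutManagement;
    using UnityEngine;

    static class SceneViewShortcutExample
    {
        // Only fires while a Scene View has focus, because the context type is SceneView.
        [Shortcut("MyTool/Log Camera", typeof(SceneView), KeyCode.L, ShortcutModifiers.Shift)]
        static void LogCamera(ShortcutArguments args)
        {
            // args.context holds the SceneView instance the shortcut fired in.
            var view = args.context as SceneView;
            Debug.Log($"Scene camera pivot: {view.pivot}");
        }
    }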

    Can you expand a bit more on what exactly you are trying to achieve that you are not able to do with the current public API?

    Cheers!
     
    Lars-Steenhoff likes this.
  3. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    I, for example, have an asset that lets me drive the scene camera however I wish in older Unity versions. I had to do some hacks to get this working, but it worked pretty decently. I had my own orbit and zoom that worked way better than Unity's default scene camera controller. However, moving the scene camera in real time using the new shortcut management system in 2019.x is impossible without jumping through hoops to define a proper tool, when all I want is for a key to respond and keep executing a script until I tell it to stop by letting go of the key.

    In tool examples like the real-time RGB shifting, I don't necessarily want this to be tied to an editor window -- and the same goes for a scene camera. For example, maybe I want to hold Control and switch to an entirely different method of navigating my scene -- i.e. I might only want my orthographic scene camera to shift by exactly one screen at a time, with a slight delay, as long as I am holding the Control key and holding my arrow keys.

    That same example could apply to moving the selected gameobject along a grid, with a scene raycast to determine the height of the next grid cell before shifting it, with this method being repeated as long as an arrow key is being pressed in the editor. This is nothing but a quality of life improvement tool for navigating a scene for a particular game style, but it can make life amazing for the person using it for a grid-based game.

    By no means is this limited to scene camera/gameobject tools though. The ability to play a physics simulation while holding a key and stopping at just the right moment upon releasing it could be another good use of realtime editor shortcuts. Doing this with a button or a dedicated tool simply wouldn't allow the sort of fine control one might need for lots of iterations through groups of gameobjects when decorating a scene with lots of physically placed objects via simulations. There are just some occasions in the editor when you simply need a constant keyboard press to execute a static method until the editor realizes you've stopped pressing it.

    Thus far, I've only mentioned the scene view, but there are also times when I want to essentially program the equivalent of a macro. For example, I might want to press and hold a single key to temporarily display an editor window (i.e. the Game View) and then hide it when I release the key. I actually do this with Snapcam's windows, which proved problematic when trying to use a pop-up window that should have also been dockable (the Grid Navigator) -- Unity broke when I tried to do that particular thing. The problem was, some customers of mine liked it as a quick popup, but others liked it docked, and some wanted to be able to do both with the same physical window.

    I had to do some fancy reflection magic just to hide and show my suite of windows via shortcuts that could be accessed globally in previous versions of Unity, so I was excited to see a proper shortcut system in the works. However, I can't be tied down to the scene view as my context, because all of my tools -- and their editor windows -- were designed to get the heck out of your way when you no longer needed them. That means an invisible editor window that handles logic like shortcuts. This design is impossible using the scene view context alone, since some of the tools I've planned have nothing to do with the scene view and act as their own sort of management systems that work across multiple window contexts. What if the user closes the scene view window? Ultimately, it's too hacky to do things as I did pre-2019. I currently can't build these according to my design without the system becoming unwieldy again.

    IShortcutToolContext seems like it would be a great fit for game-specific quality of life tools as much as it would be a great fit for tools that modify the editor UI itself, such as me wanting to override the sleek new editor camera controller to make a version that orbits and moves exactly how _I_ want it to for editing _my_ specific game.

    To put such essential UX stuff out of the hands of advanced users capable of understanding and controlling it should be a crime in my personal opinion. Unity has been too unwieldy for much too long for this to still be okay with you guys. :(

    Please let us use, or at least override, stuff like the above-mentioned user-experience elements (such as this interface or the scene camera controller). This would be so much appreciated by so many people.
     
  4. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    So how might one use the SceneView as the ShortcutContext? -- As far as I'm aware, there's no way to use it as the context.


    Also, when I use something like:

    Code (CSharp):
    [Shortcut("MoveMode", null, KeyCode.A)]
    static void MoveView()
    {
        // etc.
    }
    This executes only at the repeat rate of the keyboard.

    This is a problem. It doesn't repeat instantly (like using WASD in Flymode while holding RMB), and I need that kind of fluidity in my shortcut. In addition to this, overriding mouse button behaviors in the scene context would be incredibly useful when you want to perform a different function than what the SceneView's SimpleCameraController enables. "KeyCode.Mouse1" and the other mouse keycodes don't seem to work (and throw errors) in 2019.2 when using a null context. This might have something to do with the null context itself, though -- either way, being able to use mouse shortcuts in the scene view would be nice.


    Much of this I want to use to develop innovative tools and interfaces for UX purposes, but I can't do this without _at least_ this level of control.

    Being able to modify the current SceneView camera controller by overriding parts of it would be a first step (maybe make it a package??), but it wouldn't be general enough for some editor tools I want to make...
     
  5. teck_unity

    teck_unity

    Unity Technologies

    Joined:
    Jan 2, 2019
    Posts:
    4
    Code (CSharp):
    [ClutchShortcut("MyShortcut", typeof(SceneView), KeyCode.C)]
    static void MyShortcutMethod(ShortcutArguments args)
    {
        // Check args.stage here:
        // ShortcutStage.Begin, ShortcutStage.End
        // https://docs.unity3d.com/ScriptReference/ShortcutManagement.ShortcutStage.html
    }
    The ClutchShortcut attribute allows you to get your keydown and keyup.
    (https://docs.unity3d.com/ScriptReference/ShortcutManagement.ClutchShortcutAttribute.html)
     
  6. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    Thanks -- I've read the docs.

    What I _really_ want to know is how to overcome the keyboard repeat rate of something like KeyCode.C or KeyCode.A so I can use those shortcuts repeatedly in the SceneView context.

    How would I accomplish this?


    Also, I admit it never occurred to me to use typeof(SceneView) for the context (I tried using "SceneView" directly -- maybe this should be mentioned in the docs?), so thanks for that at least.

    Sadly, it doesn't solve my problem.
     
  7. teck_unity

    teck_unity

    Unity Technologies

    Joined:
    Jan 2, 2019
    Posts:
    4
    Apologies, just saw this. At least for your use case, you'd simply need to set a boolean (true on down, false on up), and do the actual handling in the OnToolGUI method.

    In one of my own tools, I'm doing something similar to:
    Code (CSharp):
    [ClutchShortcut("PivotTool/Adjust Pivot", typeof(SceneView), KeyCode.A)]
    static void AdjustPivot(ShortcutArguments args)
    {
        // 'instance' is a private static reference to the current tool,
        // which I assign in OnActiveToolChange.
        if (!EditorTools.IsActiveTool(instance))
        {
            return;
        }
        instance.adjustPivot = args.stage == ShortcutStage.Begin;
    }

    public override void OnToolGUI(EditorWindow window)
    {
        if (adjustPivot)
        {
            // do adjust pivot stuff
        }
        else
        {
            // do other stuff
        }
    }
     
  8. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    Thanks for getting back to me -- I know you are busy.


    Err... um.... Thanks, but this was exactly what I'd hoped I'd not have to do...

    There are times when the ClutchShortcut doesn't "clutch" the key release. This happens especially when focus swaps to another area or interface (e.g. when opening/using interdependent windows/tools, as Navigation Studio does). The bool then misses getting set back to false when the context changes to some other window, which prevents further (proper) use of the tool.

    Why not just offer an attribute or something that would let the shortcut work this way _without_ jumping through hoops like these? -- This method seems really clunky and error-prone, and is probably hard to understand for most people anyway...
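
    (One possible mitigation is a safety net that clears the flag whenever focus leaves the Scene View. A sketch only -- MyTool.adjustPivot is an illustrative stand-in for whatever flag your clutch handler actually sets:)

    Code (CSharp):
    using UnityEditor;

    // Hypothetical tool flag; in practice this is whatever your clutch handler sets.
    static class MyTool
    {
        public static bool adjustPivot;
    }

    [InitializeOnLoad]
    static class ClutchResetWatcher
    {
        static ClutchResetWatcher()
        {
            EditorApplication.update += ResetIfFocusLost;
        }

        static void ResetIfFocusLost()
        {
            // If a missed ShortcutStage.End left the flag stuck on while the
            // Scene View isn't focused, clear it so the tool doesn't get stuck.
            if (MyTool.adjustPivot && !(EditorWindow.focusedWindow is SceneView))
                MyTool.adjustPivot = false;
        }
    }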


    I know I'm not the only one who needs to make my own interactive toolsets...
     
    Rowlan and MostHated like this.
  9. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    Any info in response to the issue with swapping focus and/or windows losing the logic of the Clutch stuff mentioned in my previous post?
    I foresee having to manually create a "reset keydown" function for any key I want to have such "repeating" behavior...

    This could get especially troublesome with art/design tools...
     
    MostHated likes this.
  10. Ziflin

    Ziflin

    Joined:
    Mar 12, 2013
    Posts:
    132
    I just ran into this annoyance as well for our custom scene view fly mode. [ClutchShortcut] does not allow binding to the Shift keys, so it appears we'll have to register 4 additional shortcuts just to have a "Fast Speed" version of our camera movement.

    CameraFlyModeContext.cs uses a [ReserveModifiers(ShortcutModifiers.Shift)] attribute that isn't exposed (though neither is IShortcutToolContext) to avoid having to do this and register 4 extra shortcuts. It's not exactly clear how that is applied to increase movement speed, however.

    Is there some better way to handle this, or can we get something exposed that allows us to write custom scene view camera controls as easily as it apparently is already done internally?

    Originally we were using a duringSceneGui event to handle the KeyDown/Up events, but as was mentioned, it is easy to not receive KeyUp events. One case is if the key is released after hitting a debug breakpoint. (This appears to be solved for ClutchShortcut, however?)

    Actually, I just tested this, and it does not work properly. Using two [ClutchShortcut] attributes -- one for, say, W and another for Shift+W -- does not work as desired when pressing the Shift key *after* holding down W to move faster.

    This makes [ClutchShortcut] useless for us, and we will have to go back to hijacking the KeyDown/Up events in the duringSceneGui callback, which has the issues mentioned above and can't be rebound using the Shortcut Manager.

    If the [ClutchShortcut] attribute could take a "reserveModifiers" parameter, and ShortcutArguments gained a 'modifiers' field, then the callback could check which reserved modifiers were down and handle everything with a single [ClutchShortcut] -- that would be great. It would also expose only the single main key as the one being bound in the Shortcut Manager, and it could prevent the user from binding to a key press that includes any of the reserved modifiers.
     
    awesomedata likes this.
  11. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    @teck_unity and @leo-carneiro

    Also @willgoldstone -- Do you have any idea about this?

    As you can see from the posts above, this is not sufficient for a full-scale SceneCameraController toolset -- especially if one wants to leverage the Shortcut Manager for these kinds of tools (and override the SceneView camera controller).
    What would be necessary to redo this system? I need control over the keyboard repeat rate in other situations as well (for standard keys like 'A'), especially once the new Overlay system arrives and I have different workspaces and toolboxes with different toolsets, each using identical shortcut keys depending on the workspace and toolbox/toolbar context -- not to mention the tools themselves.

    Clearly I am not the only one developing such tools.


    Please respond. -- It has been over a year since these posts were made, and this workflow is still nightmarish.
     
  12. Ziflin

    Ziflin

    Joined:
    Mar 12, 2013
    Posts:
    132
    We are also working on a custom terrain/floor editor which has multiple modes, and each mode may re-use (several of) the same hotkeys in different ways. I'm unclear on how the new Shortcut Manager could provide a different context for each of these modes while still accepting key presses from inside the Scene View.
     
    awesomedata likes this.
  13. Ruchir

    Ruchir

    Joined:
    May 26, 2015
    Posts:
    934
    Any workaround for this till now?
     
  14. noobler

    noobler

    Joined:
    Oct 22, 2014
    Posts:
    11
    > that should accept key-presses from inside the Scene View?

    I ended up creating my own shortcut system for a SceneView-based level editor I'm working on. It simply attaches event handlers to the root of the scene view, checks key presses against my own custom shortcut definitions, and stops the ones I'm interested in from propagating down. E.g. my tool is 2D and uses layers, so pressing '2' toggles layer 2 instead of switching out of 2D mode, and it does so without remapping my users' Unity shortcuts.

    > causes the bool to miss getting set back to false, when the context changes to some other window, which thus prevents further (proper) use of the tool.

    You could easily attach the same event handlers to every EditorWindow in the current layout and it should handle cases switching between them.

    I'm not saying this is elegant or advocating for the approach, and tbh I feel a little gross having done this rather than using the built-in API, but it's simple and it works.
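
    A minimal sketch of the idea (names and the '2' handling are illustrative; whether a given built-in shortcut can actually be intercepted this way depends on where Unity handles it):

    Code (CSharp):
    using System.Collections.Generic;
    using UnityEditor;
    using UnityEngine;
    using UnityEngine.UIElements;

    [InitializeOnLoad]
    static class SceneViewKeyHook
    {
        static readonly HashSet<SceneView> hooked = new HashSet<SceneView>();

        static SceneViewKeyHook()
        {
            // Hook each Scene View's root element once, as it becomes active.
            SceneView.duringSceneGui += sv =>
            {
                if (hooked.Add(sv))
                    sv.rootVisualElement.RegisterCallback<KeyDownEvent>(OnKeyDown, TrickleDown.TrickleDown);
            };
        }

        static void OnKeyDown(KeyDownEvent evt)
        {
            if (evt.keyCode == KeyCode.Alpha2)
            {
                Debug.Log("Toggle layer 2 here");
                // Swallow the key so it doesn't reach the default '2' handling.
                evt.StopPropagation();
            }
        }
    }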
     
    Ruchir likes this.
  15. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    I was just curious if anyone else has managed to achieve a custom SceneView camera fly mode and would be willing to share their reflection-magic code for it?

    I've been waiting for quite some time, but apparently Unity won't let us actually have native access to this part of the application.
     
  16. Ziflin

    Ziflin

    Joined:
    Mar 12, 2013
    Posts:
    132
    I can try to write up something for the one I made next week. Basically we're working on a iso/top-down game and want to use the W/A/S/D keys to always pan around the scene similar to what would occur in-game but with a bit more freedom for editing purposes.

    There are two issues that require Unity to resolve:
    * Because of this (thread) issue and things I've mentioned above, there is no way to expose custom shortcut keys that need to use Modifier keys.
    * Despite my best efforts, the scene view refuses to remain v-synced all the time, and something related to that is causing small stutters in motion occasionally. I have a hotkey to 'reset' the camera angle/distance, and that also uses reflection to reset the vsync state. Note that the "Game window / Aspect / VSync (Game View Only)" setting does control this for the scene view until something breaks with it and it has to be reset with the hotkey. When it's properly vsynced, the motion is extremely smooth.

    It's still quite frustrating that Unity can't get these issues resolved as our custom camera control has been extremely useful. I do not understand why they insist on implementing theirs with internal attributes and non-exposed functionality.

    In a similar vein, please expose the internal methods used to raycast into the scene view scene against *visible* geometry.
     
    awesomedata and Ruchir like this.
  17. Ruchir

    Ruchir

    Joined:
    May 26, 2015
    Posts:
    934
    Yes please :(
     
    awesomedata likes this.
  18. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    Yep. This is the problem I've had forever with Snapcam. I struggled with this constantly thinking it was something to do with the timing of grabbing my input or the internal order of events. Believe it or not, prior to 2018.1 or 2 (don't remember which, and only for a few versions), the motion was vsynced properly at some point, and movement was buttery smooth. This is the only reason I still have this code in Snapcam -- I realized it wasn't a _me_ issue. :/

    I wanted custom Shortcuts for Snapcam to do this, until I realized this simply wasn't possible in Unity.


    ---

    @neil_devine -- Would you mind looking into this, or passing it on to the proper teams? (If you're around?)

    @teck_unity -- Perhaps you could pass this use-case around too?

    Many people (including me) would like to be able to modify the standard SceneCameraController and write our own versions (possibly for different modes).

    Unrelated, but... This too. D:
     
  19. neil_devine

    neil_devine

    Unity Technologies

    Joined:
    Apr 8, 2018
    Posts:
    42
    @kaarrrllll could shed some light on some of this.

    Potential workaround
    * Edit-> Preferences->General->Interaction Mode: try monitor refresh rate

    Are you reflecting your way to EnableVSync?
     
  20. Ziflin

    Ziflin

    Joined:
    Mar 12, 2013
    Posts:
    132
    @neil_devine Mine's been set like that (using Monitor Refresh Rate) for a while now, but it does nothing to fix the issue. It works for a short while, and then it starts stuttering again until I use reflection to reset the vsync.

    I'm using the following:

    Code (CSharp):
    // Requires: using System.Reflection; using UnityEditor;

    // Delegate matching the internal 'void EnableVSync(bool value)' method.
    delegate void EnableVSyncDelegate(bool value);

    static void EnableVSync(SceneView sceneView)
    {
        // Get the internal 'm_Parent' field (the view hosting the SceneView).
        var bindFlags = BindingFlags.NonPublic | BindingFlags.Instance;
        var fieldInfo = sceneView.GetType().GetField("m_Parent", bindFlags);
        var parent = fieldInfo.GetValue(sceneView);

        // Bind the internal EnableVSync(bool) method to the delegate.
        var methodInfo = parent.GetType().GetMethod("EnableVSync", bindFlags);
        var enableVSync = (EnableVSyncDelegate)methodInfo.CreateDelegate(
            typeof(EnableVSyncDelegate), parent);

        // Call the EnableVSync() method.
        enableVSync?.Invoke(true);
    }
     
  21. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    Thanks for taking a moment here to give us anything at all -- oftentimes threads like this go unanswered.

    In my case, I was mainly curious about the ability to write my own SceneCameraController as well as other things that require key combinations or other repeated keyboard events -- especially when dealing with Shortcut conflicts like those with the SceneCameraController (when writing a custom one). I want this mostly in the context of scene-based tooling -- but also to be used with tools that can be activated with global keyboard and mouse-like commands in the editor too (e.g. stylus or VR controllers that take continuous input/positioning for realtime updates). My main interest, however, is with conflicting shortcuts across multiple tools / toolboxes that serve very different functions (but often need to be used at the same time -- such as custom camera control alongside controlling the SceneView camera controller).

    Is something like this looking to be part of the new Workspace system?

    In other words, could I make a Workspace that allows me to access my Terrain tooling and Scene Navigation tooling (two separate tool systems) simultaneously in the same Toolbox/Workspace -- even if they have conflicting shortcuts? For example, I may want some global tools available to me from other Workspaces, or even from a proper "Group" of these Workspaces: everything in a "Worldbuilding" Workspace group would have a "G" key to activate the "Grid" option, but would also enable Snapcam's ViewNavigator as a better/easier way to fly through the Scene than the standard SceneViewCameraController. That way I might have a 2D Worldbuilding and a 3D Worldbuilding flythrough mode, each with its own kinds of grids, in the same (or very similar) Workspaces -- sort of like how Edit mode is available inside the UV Editing mode of Blender -- though I'd prefer it to be a bit more modularized, such as the Tools/Toolbar > Toolbox > Workspaces > Workspace "Groups" concepts I mentioned in the past.

    Ideally, global shortcuts in certain Workspaces (and across particular EditorWindows, as long as they are open) would let me open other EditorWindows (like Snapcam does with the "S" or "G" keys), or even activate tools in a particular Toolbox + Toolbar combo with a single global keypress -- such as "G" for grid in a 2D Terrain tooling Workspace.

    This would also support "G" bringing up my Grid Navigator window while my Snap Navigator window is focused, or while I am holding Shift, if the "G" key conflicts with another tool (such as the 2D Terrain grid also using "G"). In other words, I could define a failsafe shortcut for "G" in the Shortcut Manager (Shift+G) for cases where a global single-key shortcut in the current Workspace (or group of Workspaces) conflicts with another tool/function in another Toolbar in the same Workspace or Workspace Group -- since they'd both exist in the same Toolbox/Overlay system and Workspaces and would want the same global shortcuts in some cases, once you start throwing different toolboxes back and forth across all sorts of Workspaces.

    For example, I can see Snapcam's ViewNavigator (a custom SceneViewCameraController) being useful in Terrain modeling, painting, placing decorations -- or even modeling with ProBuilder -- but all in different Workspaces and shortcut contexts. To make a few Toolbars, some EditorWindows, and a SceneViewCameraController work together, I'd need a pretty flexible Workspace concept for my tooling, since it is multipurpose and works across many different workflows (and many different kinds of Workspaces). An FBX Exporter, for example, doesn't need its own Workspace either; it could be included (as a convenience) in all relevant Workspace environments that might need it, with its own "E" key shortcut there to export the current Scene as an FBX. Snapcam is similar in terms of its Shortcut needs, as it could be used across many different sorts of workspaces.

    Is there anything in the pipeline to handle such scenarios?
    -- I'm not sure who to ask about this, but I'm definitely up for helping out with any testing or design concepts a system like this would need.
    My tool sits on the Asset Store pending an update, because I've been waiting for quite some time for a stable / flexible tooling ecosystem to support my global tooling needs and need greater flexibility in the Unity Editor in terms of how it handles Shortcut Management for situations like these. :(
     
  22. neil_devine

    neil_devine

    Unity Technologies

    Joined:
    Apr 8, 2018
    Posts:
    42
    @Ziflin I am sorry, but that was the only potential workaround I am aware of. I'll take it up with the team.
    @awesomedata At this point I can only say we are actively working towards what you describe, but cannot promise when that lands, just that parts will gradually come online. Your name is on our list of people to contact for feedback.
     
    awesomedata likes this.
  23. kaarrrllll

    kaarrrllll

    Unity Technologies

    Joined:
    Aug 24, 2017
    Posts:
    552
    I don't have answers for everything here, but I'll at least try to shed some light.

    > Because of this (thread) issue and things I've mentioned above, there is no way to expose custom shortcut keys that need to use Modifier keys.

    Yes, it's true that there are some limitations that are not easy to work around. In the case of writing a custom view tool, it's even worse because the camera navigation code is not implemented as an EditorTool. That is, we couldn't just allow you to override `Tool.View` like we do for Move, Rotate, Scale via EditorToolContext, because the navigation is part of the Scene View itself.

    > Despite my best efforts, the scene view refuses to remain V-Synced all the time and something related to that is causing small stutters in motion occasionally.

    Not sure why this would be the case. I don't see anything obvious in the SceneView code that is touching v-sync.

    > It's still quite frustrating that Unity can't get these issues resolved as our custom camera control has been extremely useful. I do not understand why they insist on implementing theirs with internal attributes and non-exposed functionality.

    I would agree. In this case it's not really that anyone is actively opposed to exposing this functionality, but rather it just wasn't a high enough priority to spend the time to refactor and expose the APIs necessary. As Neil already mentioned, people are working towards this now, although time-frame to delivery isn't determined.

    > In a similar vein, please expose the internal methods used to raycast into the scene view scene against *visible* geometry.

    Everything we use internally is exposed in the HandleUtility class.

    https://docs.unity3d.com/ScriptReference/HandleUtility.FindNearestVertex.html

    https://docs.unity3d.com/ScriptReference/HandleUtility.PlaceObject.html

    https://docs.unity3d.com/ScriptReference/HandleUtility.PickGameObject.html

    Unfortunately we don't have a general purpose "pick position and normal from any visible geometry". The closest would be a combination of `PickGameObject` and `PlaceObject`. The former picks anything visible; the latter only intersects physics-enabled objects.
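
    For illustration, a small sketch combining the two calls (meant to be called from an OnToolGUI / duringSceneGui handler, where Event.current is valid; names are illustrative):

    Code (CSharp):
    using UnityEditor;
    using UnityEngine;

    static class ScenePickSketch
    {
        // Call from a scene GUI handler (OnToolGUI / duringSceneGui).
        public static void PickUnderMouse()
        {
            Vector2 guiPos = Event.current.mousePosition;

            // Anything visible under the cursor (no colliders required).
            GameObject picked = HandleUtility.PickGameObject(guiPos, false);

            // Position + normal, but only against physics-enabled geometry.
            if (HandleUtility.PlaceObject(guiPos, out Vector3 position, out Vector3 normal))
                Debug.Log($"Picked {picked?.name} at {position}, normal {normal}");
        }
    }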
     
    awesomedata likes this.
  24. Ziflin

    Ziflin

    Joined:
    Mar 12, 2013
    Posts:
    132
    @kaarrrllll & @neil_devine - Thanks for the replies. I worked on game engines for roughly 15 years, and I know it's a pain having to stop and answer forum questions :).

    It's good to know there's some work being put into fixing some of the scene view limitations. Ideally, everything that is done in it would eventually be possible to do as plugins by users, because the built-in camera movement definitely does not work well in our case. The [ClutchShortcut] almost worked, but without modifier keys working, we still have to hardcode our input for that and for scene-view tool-specific input, which obviously causes issues with users' custom shortcuts.

    We also have several "modes" that our tool can be in, and it's possible that the same shortcut keys are re-used. So ideally that is something that would be possible, i.e. "global" tool shortcuts in addition to "mode"-specific shortcuts (painting terrain blend weights vs. heights are different modes, for example) -- basically a shortcut context hierarchy.

    I brought this up during the discussion on the new shortcut manager, but if it's possible to do now with "contexts", I don't see how. The docs are still sorely lacking on this, 2 years later.


    >> Not sure why this would be the case. I don't see anything obvious in the SceneView code that is touching v-sync.

    It seems to occur often when switching from "Maximized" scene view (Ctrl+Space) and then back to un-maximized. It stutters until I call EnableVSync() again.


    >> Everything we use internally is exposed in the HandleUtility class.

    Currently we are using HandleUtility.IntersectRayMesh() via reflection because it is unfortunately marked "internal".


    >> Unfortunately we don't have a general purpose "pick position and normal from any visible geometry". The closest would be a combination of `PickGameObject` and `PlaceGameObject`. The former picks anything visible, the latter only intersects physics enabled objects.

    Exposing HandleUtility.IntersectRayMesh() would allow us to write something similar to SceneViewMotion.RaycastWorld() (which is what middle-mouse click uses I believe) without having to use reflection. Here's a separate thread regarding that.
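
    For anyone following along, a hedged sketch of the reflection wrapper (it assumes the internal signature is 'bool IntersectRayMesh(Ray, Mesh, Matrix4x4, out RaycastHit)' -- verify against your Unity version):

    Code (CSharp):
    using System.Reflection;
    using UnityEditor;
    using UnityEngine;

    static class IntersectRayMeshWrapper
    {
        static readonly MethodInfo method = typeof(HandleUtility).GetMethod(
            "IntersectRayMesh", BindingFlags.Static | BindingFlags.NonPublic);

        public static bool IntersectRayMesh(Ray ray, Mesh mesh, Matrix4x4 matrix, out RaycastHit hit)
        {
            // Boxed argument array so we can read back the 'out RaycastHit'.
            object[] args = { ray, mesh, matrix, null };
            bool result = (bool)method.Invoke(null, args);
            hit = (RaycastHit)args[3];
            return result;
        }
    }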

    Thanks for the help!
     
    awesomedata likes this.
  25. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    Sadly, we totally need something like this for level design tooling almost constantly. Even external tools that have built-in level-editing functionality that is hooked into Unity (like Houdini, for one) could easily benefit from this functionality. To have to create gameobjects/collision-geo to get this kind of info is pretty frustrating for scene tooling. :(
    Unreal is getting all the love from Houdini because of interface and UX issues in Unity for tooling just like the ones I've mentioned above -- yet Houdini is critical for enterprise-level world creation, mostly because of its ability to handle scene-based tooling inside Unity with such flexibility. :(

    @neil_devine:

    This was why I mentioned a Toolboxes concept -- Many different "modes" could be handled in the Tool itself, reusing those same shortcuts as a function of the more global "Workspace" concept that checks what "Toolboxes" it has available to it. This would then feed back into whether there are any conflicts in the "Tools" contained within. At this point, alternative fallback shortcuts could be specified (and saved) in these Workspaces, and end up being the norm should one run into Tools (and "Modes") that might be conflicting.

    I do think "Modes" should be a part of each "Tool" however, at least as far as Shortcuts go. This should be independent of functionality. These "Modes" should also have the ability to have "conditions" to change between them -- i.e. Workspaces with grids should let the "G" key turn the grid on/off always, rather than opening a GridNavigator window (using my Snapcam example) -- therefore these Toolboxes would have a "G" key shortcut be set as a "Priority" shortcut -- and set all others with lesser priority to "Shift+G" (unless specified by the other Tools, Modes, or Toolboxes themselves). The "G" would just be considered "default" by the "3D World Editor" or "2D Tileset Editor" Workspaces when "G" was set as a shortcut. Unless a particular override for "G" was specified in my tool (which has "G" as a global "Grid Navigator" shortcut that simply opens the Grid Navigator window in my tool's case), "G" would suddenly become "Shift+G" in the Workspace, mostly because the 2D Tileset and/or 3D World Navigation Workspaces set the "Shift+G" as the shortcut override for any conflicting tools. Each overridden (conflicting) shortcut could simply be manually be changed by the user in the Shortcut Manager for these Workspaces (+Toolboxes +Toolbars/Tools/Modes), and depending on the priority of the Shortcuts, they can be changed per-Workspace in a global manager for Shortcuts (not unlike our current Shortcut Manager -- just one for these Workspaces / Workspace Groups, and their children Toolboxes, Toolbars/Tools/Modes respectively.

    Yes. I've thought this all out, haha. :)

    Anyway -- I just wanted to say thank you (and everyone else involved) for your tireless efforts on this front. I know it isn't even _close_ to easy.. I'm hoping what I wrote above (in both posts here) makes a bit more sense than what I've shared in the past.

    Again. Thank you. :)
     
  26. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    How are things going on this front these days? -- I've received a few questions about this, and from what I've seen, some previously inaccessible API is now somewhat accessible.

    Is there any progress on being able to just press, say, a keyboard key and run a function independent of whatever else is open/selected (window-wise?) in order to create a better workflow for more "Workspace-like" sets of windows, tools, and gadgets?

    This is why we NEED some "Workspace" concept to handle general global (i.e. one-key) shortcuts for popping up and closing / toggling certain features / windows / toolbars while within a particular Workspace, each with its own shortcuts.


    Is this concept coming at some point in the next couple of years, @neil_devine?
    Just curious what API access we now have to work with in 2022 and 2023 in terms of the Editor UI / UX and the API associated with it.
     
  27. TomasKucinskas

    TomasKucinskas

    Unity Technologies

    Joined:
    Dec 20, 2017
    Posts:
    60
    Hello, awesomedata! I’ve tried replicating your problems by implementing custom scene view navigation and I think I understand your challenge here.

    In the editor, the shift-key handling is not a clean implementation either. The ReserveModifiers attribute we see on CameraFlyModeContext only keeps the shift key from ending the movement shortcut clutches; the processing of the camera speed itself is done in the update code (it checks for the hardcoded shift key and increases the speed if it is pressed).

    This thread does raise a point that we should consider exposing more of the ShortcutManager API. For cases like yours, I think ReserveModifiers would be enough to at least clone the scene view's functionality. I'll look into it.
     
    awesomedata likes this.
  28. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    Thank you for looking into this.

    One of the biggest problems I have with the ShortcutManager API (in general) is the inability to define our own workspaces and set these as the active workspaces (with their own one-key shortcuts, etc.) -- Something like this would be an IMMENSE timesaver in level design (and other) workflows.

    Please, please, please look into this too. As far as I understood it, a "Workspaces" concept was coming. However, if it doesn't land before Graph Tools Foundation becomes solidified under the hood, I doubt it will ever be implemented, as I am pretty sure the shortcut interface would then be tied closely to the new form of GTF in some way. I don't think that's a great idea from a custom tool developer's standpoint: flexibility with tool shortcuts should allow both a general Workspace context and a specific Workspace context, with the general workspace able to interact with other tools, such as an animation or scripting editor, or even the scene camera shortcuts, if desired.

    While this somewhat defeats the purpose of siloed, modular functionality, in UX there are occasions where tools simply need to interact and overlap -- but this is the _exception_, not the rule. Therefore a siloed Workspace should be the default, rather than mixing everything in with the global editor shortcuts (though in some cases, such as mine, it would be better to just let us override that functionality entirely -- including the shortcuts!).


    Something to note:

    My overall goal is to have a multi-window suite of tools that are able to inform and play off of one another, whether or not the scene view is active. But the windows act mostly as temporary property panels (to tweak and modify something, then quickly close them with a toggle key), whereas most interactivity happens in the scene without said windows open or visible. Toolbars are handy, but they take up valuable screen real-estate, and property panels would be easier to toggle with a shortcut key or a button (say, in a VR editor environment -- which Unity was experimenting with at one time). This UI / UX workflow will come in handy in the future with Unity, as mobile applications are going to become more and more VR-friendly with the advent of the metaverse.

    Unity has a chance to step ahead in this area and address these technical problems before they are ever a problem, and then be ready for developers in their editor when this change to application development UI / UX workflows happens.

    Just something to think about...
     
    Ruchir likes this.
  29. TomasKucinskas

    TomasKucinskas

    Unity Technologies

    Joined:
    Dec 20, 2017
    Posts:
    60
    If you want to have shortcuts in one window change based upon the state of another window, you could look into shortcut tags. The idea behind them is very similar to what you describe as a workspace. For example, you can define a bunch of global shortcuts tagged with a "MyTool" tag, and they will override any global shortcut while that tag is registered (registering and unregistering is done by your code). You can do the same with shortcuts that have a window context, thus making your window have different shortcuts based on which tags are active.

    That API is mainly intended for enums, but you can also use it with strings -- though I'd advise sticking with enums so as not to end up with an unmanageable number of tags.

    You can review a code sample of it here:
    https://docs.unity3d.com/2022.2/Doc...utManagement.ShortcutManager.RegisterTag.html
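
    A rough sketch of the registration side only (it assumes RegisterTag/UnregisterTag are static on ShortcutManager, as the linked page suggests; see that sample for how individual shortcuts reference the tag -- the enum and method names here are illustrative):

    Code (CSharp):
    using UnityEditor.ShortcutManagement;

    // Illustrative tag enum; one value per tool "workspace" state.
    enum MyToolTag
    {
        GridNavigatorOpen
    }

    static class MyToolTagSwitcher
    {
        // Call these when your tool/window becomes active or inactive, so the
        // tagged shortcuts only take effect while the tag is registered.
        public static void EnterGridMode() => ShortcutManager.RegisterTag(MyToolTag.GridNavigatorOpen);
        public static void ExitGridMode() => ShortcutManager.UnregisterTag(MyToolTag.GridNavigatorOpen);
    }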
     
  30. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    This is an interesting workaround, but what about realtime (i.e. hardware) keys and when two tools/addons/plugins use the same shortcut registration without knowing about one another? -- I can imagine this potentially being a problem with enums.

    That said -- I appreciate the information. Honestly, this is the first time I've ever heard of such a feature.

    I genuinely wonder why this stuff is so obscure to users -- it seems like Editor Tooling would be a huge priority for Unity to expose new users to. Almost any game (or movie, or project) needs specialized tooling if it is even remotely creative. And with the popularity of the VR/XR space only increasing, tooling in that space is almost mandatory at this point. The way I see it, if we can't get the 2D tooling workflows solved, 3D interactive workspaces are going to be exponentially more difficult. And there are some things about the UX of certain custom tools that a panel with some properties just won't solve.

    A good example is how I implemented my Grid Navigator in Navigation Studio versus how I implemented the Snap Navigator. The latter is more of a panel with properties. While its slider will indeed "snap" the camera to the indicated points, and it is quite handy/useful compared with a menu popup or a list of text, the Grid Navigator is by far the better implementation of the same functionality. While it too uses a "panel", that panel is meant to pop up temporarily and disappear once a user clicks the thumbnail indicating the part of their scene they wish to go to. The panel is then magically out of the way, and the user commences with their _actual_ work using other tools. All Snapcam / Navigation Studio (in this context) exists for is the next time we need to work on an area of our world. It should be a simple key on the keyboard to pop this up, then a click to go there. Done.

    One of my initial design ideas was to pop-out the Snap Navigator window when you press the "S" key and hide it when you press it again -- even if it was previously docked. I got all sorts of crazy errors doing this.


    In short, when my customers tried this -- even though I hadn't intended them to use it this way (since I too got an error) -- I got a bug report and had to backtrack to understand what they were doing. It turns out one of my customers was trying to do exactly the thing I had tried to enable but couldn't. So it wasn't my bug -- it was a bug in Unity's feature set and its lack of support for unexpected (but sometimes necessary) usage and flexibility with windows and pop-ups.

    If I tried to "register" and "unregister" shortcuts with a setup like this, I imagine it would be nightmarish to implement and keep track of in general -- especially on a more complex tool than Snapcam. Not to mention how NOT obvious these workflows are to begin with. Shortcuts are basic functionality in almost any other program -- so keeping these obscured behind weirdly-hidden workflows just seems counterintuitive to me, and hurts Unity's reputation -- badly.

    --

    In short, for my little tool, I could probably manage a workspace similar to what you described. But I would seriously suggest opening this up to more users by making it a clearer feature, with lots more examples of use-cases. That is, unless there's something better in the pipeline?


    Also --

    I'm still not sure I understand what a shortcut "clutch" is, despite being very familiar with the terminology (at least API-wise). The usage of something like ReserveModifiers, as you described it for more realtime tools, doesn't seem like it would allow me to hold, say, "G" and roll my mousewheel to "grow" or "shrink" an object in the scene, for example. In some kinds of games, this would be invaluable for fast level design and iteration.

    I'm not sure I've seen anything like that ReserveModifiers attribute documented for Unity. Google turns up zilch. Nothing. Nada. I think, at some point, I remember seeing this code in Unity's source -- but most people don't go digging in Unity's source code like I do. And even when I do, I can't say I always understand everything there without hours and hours of active debugging -- usually only to find out that what I need is locked away behind some "internal" keyword. Overriding the SceneView camera was just one of those instances where I thought it would be a no-brainer to allow users to modify the SceneView cam. Not everybody needs the same kind of camera control in the SceneView. A 2D Metroidvania, for example, would be a lot better served with a camera that snaps to the various "room" sizes and uses the arrow keys (and perhaps a simple modifier) to tab between these equally-sized "rooms" in any direction. A right-click and a drag would be even better, imo -- but that would not be possible without setting the camera to orthographic and having full control over its default controls.

    This is just camera work, but the shortcuts go along with it. Unity seems to try to silo everything, but UX, by default, is NOT a siloed operation. It has to be handled holistically -- shortcuts included. But even if the feature set is there, in SOME form, it is useless if nobody but the Unity developers themselves knows about it. And often, it takes threads like this one to even get a little insight into these systems and intended workflows. I've had three people message me in the last month because they want to know more about these systems, and I only know about them because I needed them and dug into Unity's source. Even after all of that, I still don't quite understand what a shortcut "clutch" means to the Unity engineers -- and this inhibits my ability to understand Unity's source code without a LOT more time sunk into the shortcut system. This is a system that should be intuitive to understand and accessible to anyone, tool engineer or simple game developer, without the need to dig so deep. Most people just wouldn't do it. :(


    Sorry if it sounds like I'm complaining or if you already are aware of these issues, but people don't really have a voice at Unity these days, and it seems like some of these issues should be glaringly obvious -- yet they are never dealt with -- leading to A LOT of frustration amongst users such as myself.



    TL;DR:

    Just some feedback, whether you want to read it or can do anything about it or not.
    At the very least -- thank you for your help in clarifying some things. I do appreciate it.
     
  31. TomasKucinskas

    TomasKucinskas

    Unity Technologies

    Joined:
    Dec 20, 2017
    Posts:
    60
    You're right. Tag overuse by various tools can lead to shortcut clashes, and the user would have to rebind shortcuts to something other than what the asset developer had originally intended.

    As far as shortcut types go, normal shortcuts trigger only once, on key or mouse button down, while clutch shortcuts trigger twice (on key/button down and on key/button up). You can also have different behavior for each invocation by giving the shortcut method a ShortcutArguments parameter. For example, in the scene view we use clutches to start and end drag actions.

    https://docs.unity3d.com/ScriptReference/ShortcutManagement.ShortcutArguments.html
    https://docs.unity3d.com/ScriptReference/ShortcutManagement.ClutchShortcutAttribute.html

    Since there are no code samples in API docs, take a look at this code:
    Code (CSharp):
    static bool isDragging;

    [ClutchShortcut("My/Drag", KeyCode.Keypad0)]
    static void MyDrag(ShortcutArguments args)
    {
        switch (args.stage)
        {
            case ShortcutStage.Begin:
                Debug.Log("Start Dragging");
                isDragging = true;
                break;
            case ShortcutStage.End:
                Debug.Log("End Dragging");
                isDragging = false;
                break;
        }
    }
    The actual drag behaviour would have to be handled in some periodically running method though.
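
    To make that concrete, here is a self-contained sketch that wires the flag into SceneView.duringSceneGui as the "periodically running method" (the class and shortcut names are illustrative):

    Code (CSharp):
    using UnityEditor;
    using UnityEditor.ShortcutManagement;
    using UnityEngine;

    [InitializeOnLoad]
    static class MyDragDriver
    {
        static bool isDragging; // set by the clutch below, read every scene GUI pass

        static MyDragDriver()
        {
            SceneView.duringSceneGui += OnSceneGUI;
        }

        [ClutchShortcut("My/Drag", KeyCode.Keypad0)]
        static void MyDrag(ShortcutArguments args)
        {
            isDragging = args.stage == ShortcutStage.Begin;
        }

        static void OnSceneGUI(SceneView sceneView)
        {
            if (!isDragging)
                return;

            // Per-frame drag work goes here. Keep repainting so this runs
            // continuously while the key is held, not just on input events.
            sceneView.Repaint();
        }
    }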
     
    awesomedata likes this.
  32. TomasKucinskas

    TomasKucinskas

    Unity Technologies

    Joined:
    Dec 20, 2017
    Posts:
    60
    @awesomedata Hi again! I just wanted to notify you that we've exposed the ReserveModifiers attribute. You will be able to use it next to the Shortcut and ClutchShortcut attributes on methods that are intended to be shortcuts. Alongside that, we will have a documentation page describing a simple scenario of using the attribute for custom scene movement with a boost ability. All of this should become available with 2023.1. Hope you'll like it.
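
    Roughly, that scenario should look something like the sketch below (hedged: the exact namespace and attribute signature are whatever ships in 2023.1, and the shortcut ID, key, and speeds here are made up):

    Code (CSharp):
    using UnityEditor;
    using UnityEditor.ShortcutManagement;
    using UnityEngine;

    [InitializeOnLoad]
    static class BoostedForward
    {
        static bool moving;

        static BoostedForward()
        {
            SceneView.duringSceneGui += Step;
        }

        // Reserving Shift means pressing/releasing Shift won't end the clutch,
        // so the update code can treat it as a "boost" modifier instead.
        [ReserveModifiers(ShortcutModifiers.Shift)]
        [ClutchShortcut("MyCamera/Forward", typeof(SceneView), KeyCode.W)]
        static void Forward(ShortcutArguments args)
        {
            moving = args.stage == ShortcutStage.Begin;
        }

        static void Step(SceneView view)
        {
            if (!moving)
                return;

            float speed = Event.current.shift ? 0.5f : 0.1f; // boosted vs. normal
            view.pivot += view.rotation * Vector3.forward * speed;
            view.Repaint();
        }
    }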
     
    awesomedata likes this.
  33. awesomedata

    awesomedata

    Joined:
    Oct 8, 2014
    Posts:
    1,419
    Thanks for letting me know -- and for helping to do something about some of these glaring issues.

    I don't want to see Unity fail, but A LOT of people I know are moving to UE5 right now.
    And it is quite disconcerting to be the only one of my friends sticking with Unity... :(

    Unity's UI -- particularly where tools are concerned -- was the reason I chose the engine in the first place. But then I discovered nearly _everything_ else about it required me to open up a text editor. I came from Game Maker way back in the day, so I started Unity as an artist/animator -- and then I was forced to become a coder (just as I was in Game Maker when I wanted to go 3D). But one of the biggest promises Unity made to me was that the UI and UX was ultimately better, and that the performance and flexibility of the engine, especially when it comes to custom tools and workflows, was worth the effort and patience waiting for Unity to improve.

    But the decision-makers at Unity, especially where these things are concerned, haven't kept that promise to me (or to other users), and that is why we are leaving in droves as of late. Lots of long-time users such as myself are fed up, and I'm sure the Unity devs such as yourself are probably a bit fed up with everyone's complaining. But all of our "complaining" is coming from a deeper issue -- one with how users are seen and dealt with -- and that is out of your control for the most part. So please don't take this personally. I'm not sure what's going on at Unity these days, but thank you for being a breath of fresh air, my friend. It's sad to say, but right now, UE5 is looking very tempting to artists and other creatives. Much about it "just works" -- and if I had a dollar for every time I have heard that exact phrasing from a fellow developer, I'd probably own Unity right now.
     
    Last edited: Oct 25, 2022
    TomasKucinskas likes this.