@neil_devine -- I've got a couple of questions about window context. I know this seems unrelated to overlays at first, but I've noticed that when moving the SceneView camera in Flythrough mode, I cannot speed up or slow down the movement with the mouse wheel while the cursor is outside the Scene view (e.g. hovering over the Inspector). I'm sure this was considered "not a bug" but a "feature" in the past -- but that kind of thinking has to change if Unity is going to be successful. It is particularly painful when rotating the camera (while moving with WASD, for example) and wanting to slow it down at the same time. The camera itself cannot be overridden, remade, or repurposed (for 2D-like behavior, for example) in editor tools because it is a multi-context utility: Flythrough mode has special input mechanisms that aren't consistent with the rest of Unity's controls or workspaces, so it acts more like a "Tool" than a UI feature. I'll call this kind of multi-context utility an "MCU" from here on, as these are very prevalent (and unorganized) in Unity, and as it stands there are no standards for handling them -- much less the terminology to even talk about them. This leads me to my questions:

1. How are you planning to handle multi-context utilities (such as shortcuts) for Overlays in the future?
2. Are shortcuts (such as the mouse wheel in the situation mentioned above) going to keep their current global context, or are they going to become window-based?
3. If the answer to 2 is window-based, can there _please_ be an option to tie them to a Toolbox, Toolbar, or Workspace context instead?

(I ask 3 because executing more "global" events -- such as when making artist tools -- gives me so much trouble whenever I need to handle keyboard/mouse input that should only fire while a specific tool or mode is active, especially when that tool or mode is NOT localized to any particular UI element or window.)
This is also a problem with more global "events" in Editor scripting. I've spent probably three years trying to find documentation on how to build a more global "event" structure in Unity (without using UnityEvents) for editor code that is not tied to an EditorWindow, CustomEditor, or really any local UI "structure" whatsoever. I finally figured out how to do this just today, in fact, without any reflection or special tools. This kind of thing is fundamental to more "freeform" artist tooling that needs to be untethered from the UI, including the window context.

To further my point above: using Snapcam as an example of a tool with a mostly "context-less" approach, I have stumbled into issue after issue with referencing between systems that don't use the Inspector (or ScriptableObjects) to store their data and operations. I imagine this is an issue for _many_ different kinds of tools that don't work directly in the Scene view, and it creates heavy problems for keyboard shortcuts and other "shortcut-driven" workflows that swap context at will -- including interdependent tools and windows. The SnapNavigator and the GridNavigator, for instance, required an architectural change under the hood simply because the window's data didn't exist while the window was closed, as the EditorWindow container was being stored away for reuse later.

As a further example of the "window" situation getting in the way of tool development: the GridNavigator window has dual functionality -- it can act as a temporary popup, and it can also be docked, so the user can have a "grid" view of his "snaps" rather than the standard "slider" approach in the SnapNavigator window. Both windows can be hidden at the press of the "S" and "G" keys, respectively, but at one point, hiding a docked GridNavigator window threw an error for one of my users (and that bug in Unity was never resolved).
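The post doesn't show the actual solution, so for readers hitting the same wall, here is one common pattern for window-independent editor events -- a plain C# static event raised from an `[InitializeOnLoad]` class. This is a hedged sketch under my own assumptions (the class and event names are hypothetical, not the author's code), but all the Unity APIs used (`InitializeOnLoad`, `SceneView.duringSceneGui`) are real:

```csharp
using System;
using UnityEditor;
using UnityEngine;

// Hypothetical example: a global, window-independent editor event bus.
// Any editor code can subscribe without owning an EditorWindow,
// a CustomEditor, or any other UI "structure".
[InitializeOnLoad]
public static class GlobalEditorEvents
{
    // Plain C# event -- no UnityEvents, no reflection.
    public static event Action<Event> KeyDown;

    static GlobalEditorEvents()
    {
        // duringSceneGui fires for every Scene view GUI pass,
        // regardless of which window (if any) the subscriber belongs to.
        SceneView.duringSceneGui += OnSceneGUI;
    }

    static void OnSceneGUI(SceneView view)
    {
        Event e = Event.current;
        if (e != null && e.type == EventType.KeyDown)
            KeyDown?.Invoke(e);
    }
}

// Subscriber example: reacts to the "G" key anywhere in the Scene view,
// even while the window that would normally own this shortcut is closed.
[InitializeOnLoad]
public static class GridToggleListener
{
    static GridToggleListener()
    {
        GlobalEditorEvents.KeyDown += e =>
        {
            if (e.keyCode == KeyCode.G)
                Debug.Log("Toggle grid view");
        };
    }
}
```

Because both classes are static and constructed on editor load, the subscription survives windows being opened, closed, or docked -- which is exactly the failure mode described above with the SnapNavigator/GridNavigator data.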
All in all, my current problems are "solved" at the moment, but I strongly urge you to consider this feedback in how you engineer things. I am one of those users who push your technology to its limits and do crazy, unexpected things with your UI. The solutions I've proposed previously -- Workspace, Toolbox, and Tool contexts rather than _just_ the SceneView and EditorWindow/CustomEditor contexts -- should be seriously considered. With them, truly multi-context toolsets (like Snapcam) would become more manageable; more importantly, better (easier to use and more flexible) artist, environment, gameplay, bookkeeping, and utility tools could be created more easily, and these very badly needed kinds of tools would become more prevalent than yet another broken (too complex to maintain, eventually deprecated) Editor Tool or super-tiny custom editor for one specific use case. With a more "MCU"-style approach to standardizing tools and workspaces -- alongside the Workspace/Toolbox/Toolbar contexts, which keep shortcuts and gizmos from firing _everywhere_ at any point in time while remaining extremely flexible to user customization and varied usage scenarios -- tools would last much longer (better for developers, the Asset Store, and users alike) and would be less broken after updating Unity too (which is just better for everyone). Hopefully this feedback was easy enough to understand -- and productive as well.