We are excited to announce that as of today, EditorXR 0.4.12-preview is available through the Unity Package Manager! Simply open the Package Manager UI, enable preview packages, click the “+” button, choose “add package from git URL…”, and type `com.unity.editorxr` into the text field. For Unity 2019.1 and 2019.2, you will need to edit the `Packages/manifest.json` file manually and add `"com.unity.editorxr": "0.4.12-preview"`.

Not much may appear to change on the surface compared to 0.3, but under the hood we have many notable changes that should make your lives easier. Most importantly, EditorXR now uses the XR Tools Module Loader, which was introduced along with Unity MARS. Using the Module Loader helps us further decouple systems within EditorXR, and makes it easier to integrate EditorXR with MARS and other future packages that adopt the Module Loader and Functionality Injection architecture. The APIs for developing Tools and Workspaces are largely untouched, but we have replaced or removed a good amount of "plumbing" code that is now handled by the Module Loader package.

We have also removed the third-party Nition UnityOctree library and replaced it with the new Spatial Hash Module. It serves the same purpose, but with a more flexible API that, among other things, supports operations on lists of objects. This is a much more efficient way to handle our use case of adding, and potentially updating, every object in a scene.

This version still relies on the old Input Prototype, which is embedded in the EditorXR package and still depends on the legacy input system. As always, it will prompt you to update your input bindings on first import. We are currently in the process of updating EditorXR to use the new Input System, and expect to remove this code in a future release. Along with the input refactor, we will be splitting EditorXR into a set of packages which we call the Runtime Authoring Framework.
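For reference, the manual edit for Unity 2019.1 and 2019.2 described above looks like this (your project's `Packages/manifest.json` will already contain other dependencies alongside the new entry):

```json
{
  "dependencies": {
    "com.unity.editorxr": "0.4.12-preview"
  }
}
```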
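To give a feel for the Functionality Injection pattern that the Module Loader enables, here is a minimal sketch in Python. Every name here (`IProvidesHaptics`, `DebugHaptics`, `SelectionTool`, `inject`) is a hypothetical illustration, not the real API; the actual system is written in C# and is considerably more involved. The idea is that a consumer declares the functionality it needs, and an injector wires in whichever provider is registered, so whole systems like haptics can be swapped without touching the consumer.

```python
class IProvidesHaptics:
    """Hypothetical functionality interface: triggering haptic pulses."""
    def pulse(self, strength):
        raise NotImplementedError

class DebugHaptics(IProvidesHaptics):
    """One swappable provider: records pulses instead of buzzing hardware."""
    def __init__(self):
        self.pulses = []

    def pulse(self, strength):
        self.pulses.append(strength)

class SelectionTool:
    """A consumer declares the functionality it needs via an annotation."""
    haptics: IProvidesHaptics

    def on_select(self):
        # The tool only knows the interface, never the concrete provider.
        self.haptics.pulse(0.5)

def inject(consumer, providers):
    """Wire each annotated dependency to the first matching provider."""
    for name, interface in type(consumer).__annotations__.items():
        for provider in providers:
            if isinstance(provider, interface):
                setattr(consumer, name, provider)
                break

# Usage: register a provider, inject, and use the tool.
haptics = DebugHaptics()
tool = SelectionTool()
inject(tool, [haptics])
tool.on_select()  # routed through the injected provider
```

Replacing `DebugHaptics` with a hardware-backed provider would change the tool's behavior without modifying `SelectionTool` at all, which is the decoupling the Module Loader is after.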
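As an illustration of why list-based operations matter for a spatial hash, here is a toy Python version. The class and method names (`SpatialHash`, `add_objects`, `query`) are hypothetical and do not reflect the actual Spatial Hash Module API; the point is that a single batch call over every object in the scene can cheaply skip objects whose grid cells have not changed.

```python
from collections import defaultdict

class SpatialHash:
    """Toy spatial hash: buckets positions into coarse grid cells."""

    def __init__(self, cell_size=1.0):
        self.cell_size = cell_size
        self.cells = defaultdict(set)  # cell coordinate -> object ids
        self.known = {}                # object id -> last known cell

    def _cell(self, position):
        # Quantize a world position to integer grid coordinates.
        return tuple(int(c // self.cell_size) for c in position)

    def add_objects(self, objects):
        """Batch add/update a list of (id, position) pairs in one call."""
        for obj_id, position in objects:
            cell = self._cell(position)
            old = self.known.get(obj_id)
            if old == cell:
                continue  # cell unchanged: no rehash needed
            if old is not None:
                self.cells[old].discard(obj_id)
            self.cells[cell].add(obj_id)
            self.known[obj_id] = cell

    def query(self, position):
        """Return the ids of objects sharing the cell containing position."""
        return set(self.cells[self._cell(position)])

# Usage: hash a few objects, then look up neighbors by position.
grid = SpatialHash(cell_size=2.0)
grid.add_objects([
    ("a", (0.5, 0.5, 0.5)),
    ("b", (1.0, 1.5, 0.0)),
    ("c", (5.0, 0.0, 0.0)),
])
```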
We have received feedback, both from our community of users and from internal teams, that while EditorXR is a great way to get to good, fast, adding support for new input devices or build targets can be a challenge, and adding or replacing whole systems like scene picking or haptics should be simpler. Our goal is to provide a modular and extensible system, and the changes we are making will help accomplish this. The solution we have chosen is to create or integrate individual packages that solve these problems in isolation (like scene picking or tool management), replacing EditorXR code with them as they come online.

The XR Interaction Toolkit, released last year, handles ray-based interaction for both uGUI canvases and interactable scene objects. We are already in the process of replacing EditorXR's MultipleRayInputModule with the equivalent module from the XR Interaction Toolkit, and updating our `BaseHandle` to inherit from `BaseInteractable`. This lets us delete some code in EditorXR, and takes a step toward unifying editing and authoring systems with gameplay systems. For example, if your app has controller models and a ray pointer, we should be able to use those to drive EditorXR interactions, rather than having EditorXR always create its own during setup.

This is our first update in a while, which is a little out of the ordinary. We have always developed EditorXR in the open on our public GitHub repository, in the spirit of being open-source and experimental. We briefly switched to a private repository as we prepared to make EditorXR available through the package manager. Going forward, we will continue to do PRs and push changes to our public GitHub repository.

That's all, folks. Enjoy!