Make tool based on EditorVR to use in Play mode?

Discussion in 'EditorXR' started by lauraaa, Jan 4, 2017.

  1. lauraaa


    Joined:
    Dec 30, 2016
    Posts:
    16
    Hi Unity Labs, thank you for making EditorVR and opening up its API for people who are interested in making tools!

    I have a quick question about making tools based on EditorVR: can it also run in Play Mode, or only through Window / EditorVR (editing mode)? For example, if I want to create manipulation tools (locomotion, scaling, changing exposed attributes, etc.) for runtime, like the tools used in Tilt Brush, is it possible to build one based on EditorVR? Would you recommend that?

    Thanks again for the release, and looking forward to hearing your thoughts!
     
    aaaavvvv likes this.
  2. amirebrahimi_unity


    Joined:
    Aug 12, 2015
    Posts:
    400
    Thanks for the feedback! We love that you would like to use EditorVR in play mode because it validates what we were thinking when we started designing the system.

    Since the question has been asked a few times, it's now added to the FAQ, but I'll copy what I posted here as well for convenience.

    The other side of your question, which I think you may be leaning towards, is using EditorVR tools while you are running your experience in the Editor and being able to switch back and forth between them. There's a lot more work involved there, as we'd have to swap over to an EditorVR context when you've paused your simulation and switch rendering, too.

    Does EditorVR work in Play Mode?

    No, not currently, although it has been designed to be able to do that. We're hoping to provide a run-time stub so that your tools can carry over to Play Mode, too.

    Can I use my EditorVR tools in my app/game/experience?
    Yes, however, for now you'll need to wire up your own services for the interfaces that are used in your Tool. Take a look at ConnectInterfaces for reference.
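    As a rough illustration of what "wiring up your own services" could mean in practice, here is a minimal sketch. All names below (IUsesRaycast, MyTool, RuntimeConnectInterfaces) are hypothetical placeholders, not actual EditorVR API; the real ConnectInterfaces code is the reference to follow.

    ```csharp
    // Hypothetical sketch of a runtime stand-in for ConnectInterfaces.
    // Type names are illustrative, not actual EditorVR API.
    using System;
    using UnityEngine;

    // A tool declares what it needs by implementing an interface...
    public interface IUsesRaycast
    {
        Func<Ray, RaycastHit?> DoRaycast { get; set; }
    }

    public class MyTool : MonoBehaviour, IUsesRaycast
    {
        public Func<Ray, RaycastHit?> DoRaycast { get; set; }
    }

    // ...and a runtime connector supplies concrete implementations,
    // analogous to what ConnectInterfaces does inside the editor.
    public static class RuntimeConnectInterfaces
    {
        public static void Connect(object target)
        {
            if (target is IUsesRaycast raycastUser)
            {
                raycastUser.DoRaycast = ray =>
                {
                    RaycastHit hit;
                    return Physics.Raycast(ray, out hit) ? hit : (RaycastHit?)null;
                };
            }
        }
    }
    ```

    The idea is that each interface a tool implements becomes a seam where you plug in your own runtime service instead of the editor-provided one.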
     
  3. lauraaa


    Joined:
    Dec 30, 2016
    Posts:
    16
    Ahh I see! Thank you so much for the explanations! Looking forward to the development!
     
  4. grimunk


    Joined:
    Oct 3, 2014
    Posts:
    272
    We have a feature in Scene Fusion that allows edits to continue in play mode. It's a bit different from what you are asking for, in that a second machine/person has to make the edits on their instance of Unity, and you'll be able to see them while in play mode.
     
  5. codesrc


    Joined:
    Jul 24, 2017
    Posts:
    6
    I am working on stripping out the menus and workspaces, along with the selection lists and other UI, to run on their own. Problems I have currently:

    1) This is a broken link: "Take a look at ConnectInterfaces for reference."
    2) I'm trying to figure out where to cut the interfaces away from the Unity Editor, so I can use the EditorVR classes as an application framework. I haven't figured that out yet.
    3) I'm trying to figure out what code and Unity objects need to be linked together to make the raycast hit the button over your controller to pop up the MainMenu and the RadialMenu. I'm currently unsure what that object is called. I found the graphic for it, but that's about all so far.
    4) If I can get as far as opening up a workspace like the Hierarchy view and populating it with test data, I'll be in a pretty good spot.

    I've reverse engineered all the scripts in Enterprise Architect, but that doesn't give me a very good picture of how the scripts are tied to elements in Unity itself. It strikes me that the dev team that created this code could put together a short walkthrough explaining the basics of which bits are connected to what, with a lot less effort than it's going to take me to reverse engineer it... I'm coming at this problem with a couple of decades in software development, but not a lot of time in Unity. Can you give me any hints or pointers about how I might want to progress here?
     
    rgjones likes this.
  6. rgjones


    Joined:
    Jan 23, 2017
    Posts:
    19
  7. amirebrahimi_unity


    Joined:
    Aug 12, 2015
    Posts:
    400
    Is all of this, so that you can use EditorVR instances at run-time (i.e. in play mode)?

    1) This is where it is now: https://github.com/Unity-Technologies/EditorVR/blob/development/Scripts/Core/EditorVR.Interfaces.cs
    but it has been converted from a single function into distributed interface connections via IInterfaceConnector (see AttachInterfaceConnectors)
    2) Well, you'd only have to hook up the interfaces to your own implementations for the interfaces that you make use of in your tools.
    3) Take a look at https://github.com/Unity-Technologi...tipleRayInputModule/MultipleRayInputModule.cs

    Regarding your last comment: you're looking under the hood, and that is why the source is there, so that you can do with it as you wish. However, it is also marked internal and not part of the public API, so it's subject to change and not something that we are promoting and educating about currently.

    If you share more about what you are looking to do, then I or members of the EditorVR team can try and point you in the right direction.
     
    aaaavvvv likes this.
  8. amirebrahimi_unity


    Joined:
    Aug 12, 2015
    Posts:
    400
    Not currently being worked on, but is still something that I'd personally like to see.
     
  9. codesrc


    Joined:
    Jul 24, 2017
    Posts:
    6
    What I'm trying to do is clip myself an application framework out of EditorVR and use it as the basis for my own applications, like windows for AV/AR.

    So, the first thing I've done is remove all the compiler pragmas that require UNITY_EDITORVR or UNITY_EDITOR. My code is still operational in the Unity Editor after that. In one branch, I've been trying to get it started from a few GameObjects at the root; the way I've gone about that is to replace the InitializeOnLoad calls with constructor methods I can call from Awake(). In a second branch, I've been trying to wrap my head around building workspaces, tearing the default workspaces apart from the Unity Editor, and getting access to the other classes like the lists and text boxes.
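    A minimal sketch of that bootstrap approach, assuming hypothetical names (SystemBootstrap, RuntimeEntryPoint are illustrative, not EditorVR code): the body of an editor-only [InitializeOnLoad] static constructor moves into an explicit method that a runtime MonoBehaviour can call.

    ```csharp
    // Hypothetical sketch: replacing [InitializeOnLoad] with an explicit
    // bootstrap callable from Awake(). Names are illustrative only.
    using UnityEngine;

    public static class SystemBootstrap
    {
        public static bool Initialized { get; private set; }

        // Body of the former static constructor moves here so it can be
        // invoked on demand instead of at editor load time.
        public static void Initialize()
        {
            if (Initialized)
                return;
            Initialized = true;
            // ... create modules, connect interfaces, etc.
        }
    }

    public class RuntimeEntryPoint : MonoBehaviour
    {
        void Awake()
        {
            SystemBootstrap.Initialize();
        }
    }
    ```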

    One spot that I'm having difficulty getting my head around is camera management, which seems to be going on in VRView. I still haven't figured out how to get camera initialization to work properly for standalone mode.

    I'm currently enrolled in Upload's VR Master's course in L.A. Next week I get to spend the whole week on independent study, so I'll basically be after this problem full time, and if I make some decent progress I might be able to take it further after that.

    With regards to the three points above:
    1) What is the best branch to work with? Siggraph looks like it might be ahead of the curve? Do any of these work with 2017? Is development the one to use? My changes so far have all been pretty throw away and I can re-implement from scratch to get a clean approach. But which branch should I go after? Once I'm done it may not merge back in so easily...

    2) Yes and no. I'm trying to get all the bits: the raycast stuff, the radial menu, main menu, all the workspaces, the teleportation system, the lot, because it really does look like a fantastic code base, well designed, delicate and powerful. I've got an application that needs one, and based on its CPU demands it shouldn't run at the same time as Unity, nor should it require its users to have Unity installed. So the surgery I need to do might be a bit deeper...

    3) Why should I be looking at that? What features of that need closer attention?

    I hope this message gives you a bit more of an idea about what I'm doing. I'm very excited to hear back from you guys about this, especially because it's a challenge working through this independently.

    -Mark
     
  10. De-Panther


    Joined:
    Dec 27, 2009
    Posts:
    587
    If it's the "bits" that you want, I would suggest trying to use them separately from EditorVR, instead of trying to port EditorVR itself to work in play mode.
    Take each system separately (Workspaces, Radial Menu, Main Menu...) and get each one of them to work in play mode.
     
  11. codesrc


    Joined:
    Jul 24, 2017
    Posts:
    6
    They don't separate out easily; i.e., there aren't just prefabs you can extract or copy. Everything is dynamically instantiated in code with a dependency injection system. The goal is to pull the framework itself out and get all the toys at once. It will be necessary to decouple some of the parts or replace them, but they're not likely to come out one at a time, as far as I can tell.
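    To illustrate the pattern being described (systems created in code and wired by scanning the interfaces each instance implements, rather than living as standalone prefabs), here is a stripped-down sketch. IProvidesLogging, WorkspaceModule and MiniInjector are made-up names for illustration, not EditorVR types.

    ```csharp
    // Hypothetical illustration of interface-based dependency injection.
    // Names are illustrative, not from the EditorVR codebase.
    using System;
    using System.Collections.Generic;

    public interface IProvidesLogging { Action<string> Log { get; set; } }

    public class WorkspaceModule : IProvidesLogging
    {
        public Action<string> Log { get; set; }
        public void Open() => Log?.Invoke("workspace opened");
    }

    public static class MiniInjector
    {
        // Instantiate each system type, then connect services based on
        // the interfaces each instance implements.
        public static List<object> CreateAndConnect(IEnumerable<Type> systemTypes)
        {
            var systems = new List<object>();
            foreach (var type in systemTypes)
            {
                var instance = Activator.CreateInstance(type);
                if (instance is IProvidesLogging logger)
                    logger.Log = Console.WriteLine;
                systems.Add(instance);
            }
            return systems;
        }
    }
    ```

    With everything flowing through a connector like this, the systems only come apart cleanly if you also bring (or replace) the injection layer, which matches what you're seeing.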
     
  12. amirebrahimi_unity


    Joined:
    Aug 12, 2015
    Posts:
    400
    1. Yes, development would make the most sense. 2017 is on hold until a bug fix lands in the latest beta.
    3. This was in response to your question about having the raycast hit the button you were hovering over. The whole module handles raycasting for the system and event callbacks for the purposes of UI.
     
  13. codesrc


    Joined:
    Jul 24, 2017
    Posts:
    6
    Cool. I'm getting pretty close in the release code. When I've got that handled I'll do a merge with the dev branch.
     
  14. wirelessdreamer


    Joined:
    Apr 13, 2016
    Posts:
    134
    I'm very interested in this as well, and am interested in testing and providing feedback once it is at that stage.
     
  15. aaaavvvv


    Joined:
    Jan 3, 2017
    Posts:
    7
    I also am about to dig into this. @amirebrahimi_unity, has any of this been integrated into the current dev branch?