Question MARS and XR Interaction Toolkit

Discussion in 'Unity MARS' started by scrant, Jul 29, 2020.

  1. scrant (Joined: Jun 1, 2017 | Posts: 73)
    Hi, how do we configure MARS to work with the XR Interaction Toolkit (XRIT), and vice versa? XRIT lays down a bunch of AR Session and Session Origin nodes and components, but doesn't MARS handle a lot of that? Which components do we need, and which are redundant, to get the two working together? How do we configure this? When I press Play, I'm not getting any input in my scenes in the Game view: the UI responds, but none of my placed objects, which worked previously, receive interactions in the editor.
     
  2. mtschoen (Unity Technologies | Joined: Aug 16, 2016 | Posts: 194)
    Hi there. I responded via DM, but let's keep this exchange public, so I'll repeat what I said there. Sorry I didn't notice this post earlier.

    There isn't really anything in MARS itself that responds to touch/input. Some of our examples use mouse/touch input, but it depends on exactly which one we're talking about. MARS and XRI don't interact, and shouldn't have any effect on each other.
     
  3. CiaranWills (Unity Technologies | Joined: Apr 24, 2020 | Posts: 199)
    Do you have an ARSession in your scene (with an ARInputManager component)? If there isn't one, MARS will create its own (with an input manager), which might be the source of the problem.
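
    A minimal sketch of that check, assuming AR Foundation's ARSession and ARInputManager components (the SessionCheck class name is illustrative):

    ```csharp
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Warns at startup if the scene lacks the ARSession/ARInputManager pair,
    // in which case MARS would create its own at runtime.
    public class SessionCheck : MonoBehaviour
    {
        void Awake()
        {
            var session = FindObjectOfType<ARSession>();
            if (session == null)
                Debug.LogWarning("No ARSession in scene; MARS will create its own.");
            else if (session.GetComponent<ARInputManager>() == null)
                Debug.LogWarning("ARSession has no ARInputManager component.");
        }
    }
    ```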
     
  4. scrant (Joined: Jun 1, 2017 | Posts: 73)
    Hi @CiaranWills , there was some cross-posting on another, private thread with @mtschoen where I explained myself more fully, but basically:

    "I am referring to clicking in the Game view using MARS. The UI responds and will drop down objects, as it does in my build, but those AR objects that have XR Interaction Toolkit nodes on them to allow for interaction do not respond at all to mouse clicks or manipulations. How are you supposed to interact with AR objects while prototyping with MARS, then? I assume these were built to work with each other? This scene builds to the device and works just fine, so I'm trying to get it up and running in MARS to prototype rapidly, which is what the majority of users want. Am I missing something?

    Further, AR Foundation (on which my scene was previously based) throws down AR Session and Session Origin nodes, which I assume we DO NOT want in a MARS scene, since MARS will be managing that, correct? I have removed those in the new MARS scene."

    I do have an ARSession and AR Input Manager in the scene. Should I get rid of both? Will XR Interaction Toolkit work with Game view then?

    Many thanks...
     
    Last edited: Jul 30, 2020
  5. kyle_v (Unity Technologies | Joined: May 24, 2016 | Posts: 21)
    Hello. I looked into this and have a few more details. The XR Interaction Toolkit touch gesture interactables don't work with a mouse in the Game view yet; currently only UI interaction will work. XR Interaction is still in preview, though, and this is one of the missing pieces being worked on.

    However, on device you can use the AR Scale, Rotate, and Selection interactables on children of a proxy. Add the AR Gesture Interactor to the Main Camera (still a child of the MARS Session), then add the interactables to the children of a MARS Proxy or anything else.
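
    The setup above can be sketched in code, assuming XRI's AR gesture classes (the GestureSetup helper and its names are illustrative):

    ```csharp
    using UnityEngine;
    using UnityEngine.XR.Interaction.Toolkit.AR;

    public static class GestureSetup
    {
        public static void Configure(Camera mainCamera, GameObject interactableObject)
        {
            // The interactor lives on the Main Camera (a child of the MARS Session).
            mainCamera.gameObject.AddComponent<ARGestureInteractor>();

            // The interactables go on children of a MARS Proxy, or anywhere else.
            interactableObject.AddComponent<ARSelectionInteractable>();
            interactableObject.AddComponent<ARScaleInteractable>();
            interactableObject.AddComponent<ARRotationInteractable>();
        }
    }
    ```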

    The AR Translation Interactable is a bit trickier. The translation uses the AR Raycast Manager from AR Foundation. As Ciaran said, MARS will look for an AR Session and Plane Manager or create its own, so you should add them all to one GameObject (see the inspector in the screenshot below) so that MARS finds the session and plane manager, and XRI finds the Raycast Manager and the same plane manager. Again, Game view input won't work for this component either, but it does work on device.
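
    As a sketch, that "one GameObject" arrangement could be created like this (assuming AR Foundation's manager components; the helper name is illustrative):

    ```csharp
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    public static class SessionSetup
    {
        public static GameObject CreateSharedSessionObject()
        {
            var go = new GameObject("AR Session");
            go.AddComponent<ARSession>();        // found and reused by MARS
            go.AddComponent<ARInputManager>();   // feeds device input to the session
            go.AddComponent<ARPlaneManager>();   // shared by MARS and XRI
            go.AddComponent<ARRaycastManager>(); // used by the AR Translation Interactable
            return go;
        }
    }
    ```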

    Lastly, the Translation Interactable also needs an extra empty transform between the MARS proxy and the interactable (see the "Starting Anchor" in the hierarchy in the screenshot below), because it is designed to delete its starting parent, and that would cause problems if it deleted the proxy or other interactables.
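
    Based on that description, the hierarchy would look roughly like this (names are illustrative):

    ```
    MARS Session
    ├── Main Camera (AR Gesture Interactor)
    └── Proxy ("table")
        └── Starting Anchor              <- extra empty transform, safe to delete
            └── Placed Object (AR Translation Interactable, ...)
    ```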

    [Screenshot: upload_2020-7-29_18-23-19.png]
     
  6. scrant (Joined: Jun 1, 2017 | Posts: 73)
    Thanks @kyle_v, I'm still confused. You're saying the AR Scale, Rotate, and Selection interactables don't use the AR Raycast Manager? And do they have to go under proxies, or can they be anywhere in the scene hierarchy? And does MARS create its own Session Origin as well? And do I need to make sure an AR Input Manager is in the scene, or will MARS create that too? Or is it even needed?

    The Translatable has always been a thorn in my side with that extra node it needs to delete, and looking at the code, I have no idea why that is necessary. Very frustrating, but that is another story.

    However, I find it incredible that MARS would be released and yet not be compatible with the XR Interaction Toolkit, which preceded it, even if in beta. Obviously people would want to design with it, moving code from both AR Foundation and XRIT. The whole point for most users is that MARS should be a good rapid prototyping solution, and yet we can't use it that way with tools that came prior. I'm really starting to rethink the whole thing.

    Am I missing something?
     
  7. kyle_v (Unity Technologies | Joined: May 24, 2016 | Posts: 21)
    > You’re saying the AR Scale, Rotate, and Selection don’t use AR Raycast Manager?

    I don't think they use the raycast manager from what I can see.

    > And do they have to go under proxies or can they be anywhere in the scene hierarchy?

    They can go anywhere in the scene hierarchy; this was just an example of how to use a proxy to instantiate an interactable into the scene at every "table".

    > And does MARS create it’s own Session Origin as well? And AR Input Manager I need to make sure is in the scene or will it create it too? Or is it even needed?

    Not sure about this, I added them in to be sure.

    > The Translatable has always been a thorn in my side with that extra node it needs to delete and I have no idea why that is necessary looking at the code. Very frustrating but that is another story.

    Yeah agreed, again XR Interaction is in preview. I'll suggest some changes for this.

    > However, I find it incredible that MARS would be released and yet not be compatible with the XR Interaction Toolkit, which preceded it, even if in beta.

    The interaction in the XRI package has mainly been for VR-style controller interaction, which we had previously tested with MARS in a Magic Leap app. The AR gestures are newer and still in progress; I agree these need to work together more seamlessly.

    > Obviously people would want to design with it, moving code from both AR Foundation and XRIT. The whole point for most users is that MARS should be a good rapid prototyping solution, and yet we can't use it that way with tools that came prior. I'm really starting to rethink the whole thing.
    > Am I missing something?

    You are not missing anything; that is indeed the goal. And we do appreciate you testing out these early preview features and giving us feedback.
    Unfortunately, there are a couple of different pieces here being worked on simultaneously. We decided to release MARS because it can still be used if you are programming your own interactions/gameplay. I hope this helps you decide whether to keep using MARS and/or XR Interaction.
     
  8. scrant (Joined: Jun 1, 2017 | Posts: 73)
    OK, thanks @kyle_v, I will play around with this some more and look forward to the XR Interaction Toolkit being more fully integrated. It seems to me the AR developer community is largely developing for platforms that are shipping, which really means cell phones at this point, so hopefully this integration proceeds quickly. Thanks again.