
Unity MARS Companion app - Open Beta announcement

Discussion in 'Unity MARS' started by bree-uh, Jan 12, 2021.

  1. bree-uh

    bree-uh

    Unity Technologies

    Joined:
    Jun 18, 2020
    Posts:
    4

    Unity MARS Companion app for mobile is now available in beta







    The Unity MARS Companion app is available in beta for iOS and Android devices, and we’re looking for beta testers. Be one of the first to try out the new mobile app before it’s released.​



    What is the Unity MARS Companion app?

    The Unity MARS Companion app is the newest component of Unity MARS and is now available on iOS and Android devices.


    With the app, you can capture real-world data directly on your device and bring it into the Unity Editor, so you can quickly create and iterate on your AR experience. This significantly decreases iteration time and helps you deliver an AR experience that runs accurately in its target environment.


    The Unity MARS Companion gives users the power to perform two primary tasks: data capture and in-situ authoring.


    • Data capture: Using the companion app, you can capture room scans, take pictures, and record video with AR data. Once saved to the cloud, this data will sync directly to the Unity Editor, where you can open it using the Unity MARS authoring environment to create a simulated environment that mirrors where your AR experience is intended to run. Now you'll have a development environment that more closely matches reality to author your AR content against.

    • In-situ authoring: The companion app also has authoring functionality that lets you create content and lay out assets directly on your device. For example, if a bug occurs because of lighting conditions or a particular room setup, it's useful to have the actual device data to reproduce and fix the issue.

    Both of these tasks involve sending and receiving data to and from the Editor, which is done via cloud storage. This means that users all over the world can work on the same project together, and that the data persists between sessions and is shared between users.


    The Unity MARS Companion app is available to Unity MARS users at no additional cost to their existing subscription and includes 10GB of cloud storage per seat. Non-Unity MARS users can also take advantage of the app with limited functionality (see below).


    Please note that the app is currently in beta: content captured or edited with the beta may not be compatible across beta versions or with the final release. We're very interested in hearing your feedback so we can refine and improve the product to fit your needs.


    What can I do with the Unity MARS Companion app?


    Create and Layout Objects in a Scene




    Scan surfaces and author or preview proxies.


    Create Environment



    Map out an environment by placing corners.


    Record Data



    Record videos, surface data, and camera paths.


    Create Markers



    Capture a marker and add hotspots.


    Unity MARS user vs. Non-Unity MARS user

    Based on your subscription type, you might have different levels of functionality within the app. In the chart below, we explain the differences in these functionalities.


    [Image: feature comparison chart for Unity MARS users vs. non-Unity MARS users]
    *All changes that have not synced to the cloud will be stored locally and can be re-synced once the user authenticates with a Unity MARS entitlement. If the user logs in with a different username/Unity ID, they will be able to make local changes, but will not be able to sync those changes with projects they do not have access to.

    **Unity MARS projects can be synced from the cloud by entering Project keys or scanning a QR code.



    Join the beta today and share your experience with us.


    To try out the beta, open this link for iOS or Android* on your mobile device. The steps outlined in our documentation will help familiarize you with the workflows.
    *By opting to download Android or iOS versions of the Unity MARS Companion app, you agree to our Privacy Statement.


    To use the Unity MARS Companion app alongside the Unity MARS authoring environment, we recommend you use Unity MARS 1.2, as well as Unity version 2019.3.0f6 or newer.


    As you try out the Unity MARS Companion app, we’d really like to hear about your experience. For product support, troubleshooting problems, sharing projects and feedback, and general discussion about the app, comment in this forum thread or share your feedback in this survey.
     


    Last edited: Jan 14, 2021
  2. Blinxel_AR

    Blinxel_AR

    Joined:
    Jan 26, 2015
    Posts:
    39
    Installed it. Used it. Love it.
    Needs a lot of work.
    Right now the planes are waaaaaay too messy to be useful in anything but the most empty environments. But I can see that this tool will become an indispensable part of our tool kit for our MARS-based work.
     
  3. mtschoen

    mtschoen

    Unity Technologies

    Joined:
    Aug 16, 2016
    Posts:
    86
    > Installed it. Used it. Love it.

    Thanks!

    > the planes are waaaaaay too messy to be useful

    Is this on iOS or Android? We are (for now) just exposing all of the planes the platform is providing, but I agree that it can be difficult to pick between planes that are overlapping. We will be working on future improvements to filter out overlapping planes and/or help you pick between them when they overlap. If you have any ideas or suggestions, please let us know!
     
    bree-uh likes this.
  4. Blinxel_AR

    Blinxel_AR

    Joined:
    Jan 26, 2015
    Posts:
    39
    Sorry, should have said. This was on an Android, Galaxy Note 10+ with the depth camera.
    I'm sure you've seen the 8-dimensional physics model that is created when you scan an area. Even a very plain one. I scanned an empty apartment room (5x5m, 3m ceiling) and even though it was an empty box (no furniture) the planes it detected were Lovecraftian in their non-Euclidean overlappyness.
    I was musing on this at the time, and thought that what we had here was a really interesting use case for machine learning... to ID walls, corners, ceilings and so on. But I assume that's out of scope.

    Probably a more helpful suggestion might be to allow the user to mark corners and then base / refine the detected planes from those?

    I tried the 2nd function of defining the floorplan. It took a few tries to nurse a result without it freaking out and forgetting the orientation of the entire plan.
    One method that worked well was to step close to any corners (convex or concave) and mark the corner from about a meter away. Then to step right back maybe 3 meters away to drag the blue pole along the wall, letting the camera see as much of the wall, floor, ceiling and any other features as possible. Swooping in and then out and back in again seemed to stop the app from losing tracking halfway through. Felt like dancing too. So that's a bonus.

    I've not tried to import the result into Unity yet, but I will. I'm assuming I can place proxies in certain places in Editor, and then if I built a scene from this data it would place the content on the proxies in the right places... at least I hope that's the goal.

    Bottom line, this app feels like it's a beta for an app my company will NEED to use daily in our work, and I really hope it evolves. Please keep up the good work.
     
  5. mtschoen

    mtschoen

    Unity Technologies

    Joined:
    Aug 16, 2016
    Posts:
    86
    > Probably a more helpful suggestion might be to allow the user to mark corners and then base / refine the detected planes from those?

    Interesting suggestion. For the Environment and Data Recording flows, the intent is to let you capture exactly what the device is going to give you, so in a sense the "Lovecraftian overlappyness" is something that we actually want to capture in order to replicate in the Editor. That way, you can make sure your proxies and C# code can handle bad tracking gracefully. One way or another, if the platform is giving you bad results in the companion app, you can expect your users to encounter the same bad results "in the wild" when they run your app.

    To be honest, it sounds like your empty apartment is creating a difficult environment for ARCore to give you consistent camera tracking and high-quality surface extraction. Furniture and clutter actually help with camera tracking, because the SLAM algorithm needs distinctive visual features (recognizable details) to work properly. Do you get the same results in that space from a basic MARS or AR Foundation app? There could be some bug or performance issue in the MARS Companion app that we can fix to improve the results you are seeing, so it's good to know how other AR apps behave as a baseline. You may get better results on ARKit, if that's available to you. We also plan to improve the floor plan feature by letting you control where the base of the corner post sits on the screen. That should give you a bit more control over where it lands while also keeping more of the room in view.

    For the Proxy Scan Flow, we do actually want to sanitize things a bit more so that you can do your work, even if you're not getting ideal results from the platform. However, we don't want to lead you into a situation where you are authoring against data that won't exist later on when your users run the experience. If we let you tag a surface in the Proxy Scan Flow, but there's no way to do that in a build of your app, you'll end up creating experiences that only work in the companion app, which isn't what we want.

    With all of that said, you may be working on a location-based experience, or we may one day get richer semantic information from the platforms (this already exists to some extent today), which would let you make assumptions about the space beyond the raw data you get from plane scanning. We will be working on ways to leverage this type of information in future versions of this feature. Currently, you can edit the `Trait` property of a `SemanticTagCondition`, which may match up to a semantic tag for data provided by Synthetic Objects or some future data provider that supports semantic tags.
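    The semantic information mentioned above is already partially exposed through AR Foundation: on ARKit, each detected plane carries a classification (Wall, Floor, Ceiling, and so on). As a rough illustration, and assuming an AR Foundation 4.x scene with an `ARPlaneManager`, you could inspect it like this (a sketch, not MARS Companion code):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: log the semantic classification of every tracked plane.
// ARPlane.classification is populated on ARKit; on platforms that
// don't support classification it remains PlaneClassification.None.
public class PlaneClassificationLogger : MonoBehaviour
{
    [SerializeField]
    ARPlaneManager m_PlaneManager;

    public void LogClassifications()
    {
        foreach (var plane in m_PlaneManager.trackables)
            Debug.Log($"Plane {plane.trackableId}: {plane.classification}");
    }
}
```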

    A simple step toward what you are describing would be to just let you hide planes you don't want to interact with. But even that gets tricky, because we still need to allow your content to match to those planes, otherwise we're letting you "cheat" during authoring in a way that will still be an issue when users run your app. Then it might be confusing if you start editing the Plane Size condition on your proxy and it jumps over to a plane that doesn't exist.

    We're actively exploring these kinds of improvements, and the platforms are always improving their algorithms. Stay tuned, and thanks again for the feedback!
     
  6. Blinxel_AR

    Blinxel_AR

    Joined:
    Jan 26, 2015
    Posts:
    39
    > Do you get the same results in that space from a basic MARS or AR Foundation app?
    We've built a MARS app we're about ready to launch, and yeah... it's got planes all over the place. So the companion app is certainly reflecting the real experience of the customer.
    We have seen a lot of items in our app floating on planes that should just not exist. Can I assume that this is something undesirable that you folks are working to eliminate?
     
  7. mtschoen

    mtschoen

    Unity Technologies

    Joined:
    Aug 16, 2016
    Posts:
    86
    > Can I assume that this is something undesirable that you folks are working to eliminate?

    Yup! PlaneSizeCondition and ElevationCondition are a good start at coming up with "a good plane" but we want to go deeper. For the situation you're describing, we really need a "non-overlapping condition" but that's a trickier problem to solve efficiently on a mobile device. Putting content which you want to prevent from overlapping in a DistanceRelation (part of a Proxy Group) is a good way to prevent overlapping planes from turning into overlapping content. You might also see if Proxy Forces are a good solution to dealing with planes that don't behave as well as you'd hoped.
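    To make the difficulty concrete, here is a hypothetical sketch of the geometry a "non-overlapping condition" would involve: approximate each horizontal plane by an axis-aligned rectangle on the ground plane, then greedily keep only planes that don't overlap one already kept. These types are illustrative only and are not part of the Unity MARS API; a real implementation would also need to handle rotated and vertical planes and run incrementally on device.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical: a horizontal plane approximated as an axis-aligned
// rectangle (center + size) on the XZ ground plane.
public struct PlaneRect
{
    public float CenterX, CenterZ, SizeX, SizeZ;

    // Standard AABB overlap test: rectangles overlap when the
    // distance between centers is less than the sum of half-sizes
    // on both axes.
    public bool Overlaps(PlaneRect other)
    {
        return Math.Abs(CenterX - other.CenterX) * 2 < SizeX + other.SizeX
            && Math.Abs(CenterZ - other.CenterZ) * 2 < SizeZ + other.SizeZ;
    }
}

public static class NonOverlappingFilter
{
    // Greedily keep planes that don't overlap any already-kept plane.
    // O(n^2) in the worst case, which is why doing this efficiently
    // on a mobile device is the tricky part.
    public static List<PlaneRect> Filter(IEnumerable<PlaneRect> planes)
    {
        var kept = new List<PlaneRect>();
        foreach (var plane in planes)
        {
            bool overlaps = false;
            foreach (var k in kept)
            {
                if (plane.Overlaps(k)) { overlaps = true; break; }
            }
            if (!overlaps)
                kept.Add(plane);
        }
        return kept;
    }
}
```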

    As always, we want to prioritize problems that our users are encountering in the real world, so thanks for the feedback!
     
  8. GameDevSK

    GameDevSK

    Joined:
    Jan 3, 2020
    Posts:
    2
    Why is it not installing on Android?
    I've tried many times; it just shows a loading screen.
     
  9. mtschoen

    mtschoen

    Unity Technologies

    Joined:
    Aug 16, 2016
    Posts:
    86
    I'm sorry to hear that you're having trouble. What device and Android OS version are you using? Where do you see the loading screen? Is it in Google Play? Is it the "Made with Unity" splash screen?
     
  10. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    214
    I think more often the problem has to do with updating the planes. I have no idea if this is happening at the Unity or iOS level, but I've never seen a plane get deleted or updated in a subtractive way after it's been initially created. This is often an issue if it's detected wrong initially, or something in the scene has moved. Are Unity's planes getting subtracted or changed properly when they change on iOS? This is unrelated to this thread, I guess, but...
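    For what it's worth, AR Foundation does surface removals and merges through `ARPlaneManager.planesChanged`; whether the platform ever actually shrinks or deletes a plane once it's created is a separate question. A minimal sketch for observing those events, assuming AR Foundation 4.x:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: observe plane lifecycle events, including removals and
// planes that get merged ("subsumed") into another plane.
public class PlaneChangeLogger : MonoBehaviour
{
    [SerializeField]
    ARPlaneManager m_PlaneManager;

    void OnEnable() => m_PlaneManager.planesChanged += OnPlanesChanged;
    void OnDisable() => m_PlaneManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (var plane in args.removed)
            Debug.Log($"Plane removed: {plane.trackableId}");

        foreach (var plane in args.updated)
        {
            // A plane merged into a larger one reports the survivor
            // via subsumedBy.
            if (plane.subsumedBy != null)
                Debug.Log($"{plane.trackableId} merged into {plane.subsumedBy.trackableId}");
        }
    }
}
```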
     
  11. Blinxel_AR

    Blinxel_AR

    Joined:
    Jan 26, 2015
    Posts:
    39
    ooooh, really good point.
    Once a plane exists, it stays there, even when the app should have detected that there's nothing there after all.
     