
ARKit 3

Discussion in 'Handheld AR' started by ina, Jun 5, 2019.

  1. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    774
    Is ARKit 3 fully supported by ARFoundation?

    Or is the existing ARKit plugin getting updated soon?
     
  2. louisch

    louisch

    Joined:
    Apr 19, 2014
    Posts:
    3
ARKit 3 was announced yesterday and will ship with iOS 13, which hasn't been released yet.
     
    cabrera-juan likes this.
  3. MassiveTchnologies

    MassiveTchnologies

    Joined:
    Jul 5, 2016
    Posts:
    31
iOS 13 beta has already been released to developers, so the Unity team can already work on the integration. I wonder how long it will take to integrate features like people occlusion and motion capture (possibly with Mecanim support). An ETA from the team would be greatly appreciated.
     
  4. Staus

    Staus

    Joined:
    Jul 7, 2014
    Posts:
    8
I don't expect it to be a small task for the team to "just" implement these things, and the news is only two days old. However, it would be really nice to be kept informed about when/if and to what degree these tools can be expected. Many ongoing projects could benefit from them, and I would rather not optimize around issues that ARKit 3 will fix automatically very soon :)
     
    eco_bach likes this.
  5. Staus

    Staus

    Joined:
    Jul 7, 2014
    Posts:
    8
Oh! I just realised that the ARKit plugin has been deprecated and we should now use ARFoundation?? Isn't ARFoundation only catering to the features that the platforms have in common? Features like body tracking and people occlusion are currently unique to iOS. Does that mean that Unity will deprioritize those features until ARCore catches up?
     
  6. KLWN

    KLWN

    Joined:
    Apr 11, 2019
    Posts:
    24
I hope not; that would be a huge bummer for Unity devs around the world. My guess is they will implement ARKit 3's new features in order to remain the state-of-the-art dev platform Unity is known for.
     
  7. MassiveTchnologies

    MassiveTchnologies

    Joined:
    Jul 5, 2016
    Posts:
    31
We seriously hope that's not the case either. Maybe they could offer platform-specific features on top of the cross-platform functionality. Hoping the devs will clarify this matter.
     
  8. Jelmer123

    Jelmer123

    Joined:
    Feb 11, 2019
    Posts:
    24
    No, AR Foundation also features ARKit-only functions (for instance object tracking).
But what worries me is that the ARKit plugin has always been updated very quickly. In 2017, ARKit was available in Unity within days (I suppose they worked on it before the beta release). But with AR Foundation I've seen that things take much longer. Image tracking was only added recently, even though it has been around for ages in ARKit and ARCore.
    edit: Unity support for ARKit3 https://blogs.unity3d.com/2019/06/06/ar-foundation-support-for-arkit-3/
     
    Last edited: Jun 6, 2019
  9. MassiveTchnologies

    MassiveTchnologies

    Joined:
    Jul 5, 2016
    Posts:
    31
We were also wondering whether we could get our assets out of Unity into RealityKit, for example through USDZ. RealityKit has many cool features that Unity currently struggles with (using image effects like motion blur in AR without disrupting the tracking or severely impacting performance, accurate camera-based depth of field, raytraced soft shadows, and more).
     
    Last edited: Jun 6, 2019
    Saicopate and Jelmer123 like this.
  10. Jelmer123

    Jelmer123

    Joined:
    Feb 11, 2019
    Posts:
    24
There is a USDZ preview package for Unity that allows you to export assets to USDZ, I think. Check the Package Manager :)
And yes, I'm also going to look into RealityKit; it looks very promising. Motion tracking, depth-of-field blur, HDR environment maps, camera grain, good-looking realtime soft shadows on mobile, recording an AR session for in-editor testing, continuous mapping & relocalization across multiple devices, easy multiplayer.

Mostly also possible in Unity, but with RealityKit it seems to work by default.
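For anyone looking for that exporter: preview packages are pulled in through the project's Packages/manifest.json (or via the Package Manager window with preview packages shown). The package name below is Unity's USD package as far as I know; the version string is just an illustrative assumption, so check the Package Manager for the actual latest preview:

```json
{
  "dependencies": {
    "com.unity.formats.usd": "1.0.0-preview.1"
  }
}
```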
     
    Last edited: Jun 6, 2019
  11. Jelmer123

    Jelmer123

    Joined:
    Feb 11, 2019
    Posts:
    24
  12. MassiveTchnologies

    MassiveTchnologies

    Joined:
    Jul 5, 2016
    Posts:
    31
    That was fast! Exciting!
I can't see any mention of improvements to the post-processing workflow in AR to match RealityKit's level. I hope the devs address that soon.
     
  13. eco_bach

    eco_bach

    Joined:
    Jul 8, 2013
    Posts:
    1,275
    Wow. "people occlusion features are available only on iOS devices with the A12 Bionic chip and ANE." Anyone know which specific devices these are?

    Also, just to confirm

    https://github.com/Unity-Technologies/arfoundation-samples

Does this repo contain the latest ARKit 3 features? It's not perfectly clear from either the blog post or the manifest.json file.

I'm also wondering about LWRP + ARKit 3 support.
     
    Last edited: Jun 6, 2019
  14. KLWN

    KLWN

    Joined:
    Apr 11, 2019
    Posts:
    24
  15. tdmowrer

    tdmowrer

    Unity Technologies

    Joined:
    Apr 21, 2017
    Posts:
    480
    We published a version of ARFoundation today that supports the new features in ARKit 3. Please see our blog post for more details.
     
  16. tdmowrer

    tdmowrer

    Unity Technologies

    Joined:
    Apr 21, 2017
    Posts:
    480
    The samples repo has now been updated with new samples for ARKit 3 features.
     
  17. Jelmer123

    Jelmer123

    Joined:
    Feb 11, 2019
    Posts:
    24
    edit: whoops also all A12X devices of course:
    from https://en.wikipedia.org/wiki/Apple_A12 & https://en.wikipedia.org/wiki/Apple_A12X
     
    Last edited: Jun 6, 2019
  18. edee1337

    edee1337

    Joined:
    Apr 10, 2013
    Posts:
    9
    Thank you for the update!! Looks like I'm getting the following error when trying to deploy the ARFoundation 1.5 LWRP sample to iPhone X running iOS 12.1.2:
    dyld: Symbol not found: _OBJC_CLASS_$_ARMatteGenerator
     
  19. tdmowrer

    tdmowrer

    Unity Technologies

    Joined:
    Apr 21, 2017
    Posts:
    480
    The latest ARFoundation 1.5 (and also 2.2) currently requires Xcode 11 beta. That linker error is expected if using Xcode 10 (or less).
     
  20. newguy123

    newguy123

    Joined:
    Aug 22, 2018
    Posts:
    134
  21. Jelmer123

    Jelmer123

    Joined:
    Feb 11, 2019
    Posts:
    24
The device also needs to run iOS 13.
     
    edee1337 likes this.
  22. eco_bach

    eco_bach

    Joined:
    Jul 8, 2013
    Posts:
    1,275
Watching the ARKit 3 deep dive, I'm wondering how people occlusion is handled under the hood in ARFoundation:

- SceneKit + ARSCNView
or
- custom composition through ARMatteGenerator

     
    Last edited: Jun 7, 2019
  23. KLWN

    KLWN

    Joined:
    Apr 11, 2019
    Posts:
    24
Does anybody know what kind of impact "Human Segmentation Stencil Mode" and "Human Segmentation Depth Mode" have?
     
  24. edee1337

    edee1337

    Joined:
    Apr 10, 2013
    Posts:
    9
    Yep that was it, device needed to be on iOS 13. Thank you!
     
    Jelmer123 likes this.
  25. Jelmer123

    Jelmer123

    Joined:
    Feb 11, 2019
    Posts:
    24
Apple mentioned in their WWDC talk that they can now also detect white walls (featureless surfaces) using some AI. Is this also available in ARFoundation? I can't find any documentation on it.

    from 41:00: https://developer.apple.com/videos/play/wwdc2019/604/

    They call it ML-enhanced plane estimation
     
    konsnos likes this.
  26. Stage672

    Stage672

    Joined:
    Jul 27, 2016
    Posts:
    5
I am playing around with the simultaneous world and face tracking feature (the RearCameraWithFrontCameraFaceMesh sample). It is implemented as a toggle between the back- and front-facing camera feeds, while the tracking data from both managers remains accessible.

However, what exactly happens to the position and rotation of the face when using the front-facing mode?

The face position seems to be relative to the device's starting position (0,0,0), like all other trackables, but what about the face rotation? It seems to be broken: only certain face rotations appear to be tracked (e.g. only the y-axis, while the other two do not change). What is this rotation relative to?
     
    Last edited: Jun 14, 2019
  27. Fl0oW

    Fl0oW

    Joined:
    Apr 24, 2017
    Posts:
    17
    It's described in the Unity Blog post (https://blogs.unity3d.com/2019/06/06/ar-foundation-support-for-arkit-3/). Basically, stencil mode tells you if a pixel contains a human or not, depth mode also estimates the distance of that human pixel from the device. I guess stencil mode is much lighter in terms of performance impact, as it only requires one ML network to run, while depth estimation requires (at least) an additional one. So depending on your use-case, stencil mode might be useful to optimise performance if you only need segmentation, not distance-based occlusion.
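To make the difference concrete, here is a minimal sketch of reading both textures from an ARHumanBodyManager in the new ARFoundation previews. This is just how I understand the API from the 1.5/2.2 preview packages; the property names may still change, and the actual compositing shader is left as a comment:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: pulling the human segmentation textures each frame.
// Assumes an ARHumanBodyManager on the AR Session Origin with the
// stencil/depth modes enabled in the inspector.
public class HumanSegmentationExample : MonoBehaviour
{
    [SerializeField] ARHumanBodyManager humanBodyManager;

    void Update()
    {
        // Stencil mode: a mask marking which pixels contain a person.
        Texture2D stencil = humanBodyManager.humanStencilTexture;

        // Depth mode: estimated distance from the device for those pixels.
        Texture2D depth = humanBodyManager.humanDepthTexture;

        if (stencil != null && depth != null)
        {
            // Feed both into a material/shader that composites virtual
            // content behind people (people occlusion).
        }
    }
}
```

So if you only need to know where people are (e.g. for a screen-space effect), the stencil alone should do; distance-correct occlusion needs the depth texture as well.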