
Resolved ARKit support for iOS via Unity-ARKit-Plugin

Discussion in 'AR' started by jimmya, Jun 5, 2017.

Thread Status:
Not open for further replies.
  1. hanger102

    hanger102

    Joined:
    Jan 3, 2013
    Posts:
    13
    Does anyone else have the problem with the AR remote not using the front camera for the face tracking scenes? Any help would be appreciated.
     
  2. effectzero

    effectzero

    Joined:
    Dec 20, 2012
    Posts:
    5
    Hey guys, with an iPhone X and one of the facial tracking demo scenes as a starting point, I tied the blendshape data to a character model I'm working on for our VR game Bebylon. It's shockingly good, and I think it can get a lot better with some finesse. Here's a quick test.



    @hanger102 I'm pretty sure it requires the iPhone X's front-facing camera to work, as it's based on ARKit and the depth data provided by that phone. I'm not sure if that's what you're using?
     
    Gametyme, drhorner, m4d and 2 others like this.
  3. detolox

    detolox

    Joined:
    Aug 15, 2013
    Posts:
    19
    I did it, with debug mode and Development Build checked. Sorry, I tried it on an iPhone 6s and an iPhone 6s Plus. Both of them have an A9 processor, since the app works on both devices, but when I try to debug the code it doesn't work.
     
  4. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Alright, sounds like it might be a problem with your setup then - are you trying to use OpenGL? Check that Metal.framework is referenced in your Xcode project.
     
  5. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Fantastic - that's the kind of thing I like to see! Thanks for this, @effectzero
     
    effectzero likes this.
  6. HaimBendanan

    HaimBendanan

    Joined:
    May 10, 2016
    Posts:
    28
    It looks amazing!
     
  7. krisventure

    krisventure

    Joined:
    Mar 24, 2016
    Posts:
    118
  8. krisventure

    krisventure

    Joined:
    Mar 24, 2016
    Posts:
    118
    Why does the ARKit plugin import with the Project Settings minimum iOS version set to 6.0? I thought ARKit only runs on iOS 11? Shouldn't I change this to iOS 11 to avoid crashes on phones that don't support ARKit?
     
  9. suxinren

    suxinren

    Joined:
    Jan 13, 2017
    Posts:
    14
    I added a reference to Metal.framework, and there's only one error left:
    "Projects/AppARKit/MapFileParser.sh: Permission denied" - and I know how to solve that one. :)

    Jimmya, you're a great guy. Thank you very, very much.
     
  10. pixelworshipco

    pixelworshipco

    Joined:
    Sep 29, 2016
    Posts:
    6
    Are we able to use ARKitRemoteConnection with the front-facing camera on the iPhone X?
     
  11. hanger102

    hanger102

    Joined:
    Jan 3, 2013
    Posts:
    13
    Yes, I am using the iPhone X, but it will not use the front camera for the remote, only the rear for some reason. That makes it very difficult to debug when you have to build to the device every time.
     
  12. hanger102

    hanger102

    Joined:
    Jan 3, 2013
    Posts:
    13
    No, I still haven't figured it out.
     
  13. hanger102

    hanger102

    Joined:
    Jan 3, 2013
    Posts:
    13
    Last edited: Nov 7, 2017
  14. pixelworshipco

    pixelworshipco

    Joined:
    Sep 29, 2016
    Posts:
    6
  15. smallbit

    smallbit

    Joined:
    Oct 3, 2013
    Posts:
    60
    I will explain the problem more precisely:
    When running AR, I need to adjust the FOV of the Unity camera (the one that displays Unity objects). Moving the FOV slider on the Camera component doesn't affect the camera frustum. I believe this is because the projection matrix comes from m_session. So is there any way I could adjust the FOV of that? Thanks in advance.
     
  16. Mobgen-Lab

    Mobgen-Lab

    Joined:
    May 23, 2017
    Posts:
    15
    I am using the plane detection script, so when it detects a plane I am able to place my objects with the hit test script.

    The problem is that it takes too much time, sometimes even more than 20 seconds, to detect a plane. I have tried other ARKit apps and they detect a plane in 2 seconds. What am I doing wrong?
     
  17. Uzi187

    Uzi187

    Joined:
    Mar 21, 2014
    Posts:
    16
    Hey Jimmya,

    Been testing out ARKit stability and having a lot of drift issues with tall objects.

    Is this normal behavior or a bug? Sometimes the objects even fly off and disappear!

     
  18. piotrtekien

    piotrtekien

    Joined:
    Feb 16, 2015
    Posts:
    7
    @jimmya or @christophergoy

    We (4 game developer students) are trying to combine ARKit with OpenCV (marker-based recognition). We have threaded the main IEnumerator method and are trying to send the frame to OpenCV while ARKit is running with:

    Mat imgMat = new Mat(screenshotTexture.height, screenshotTexture.width, CvType.CV_8UC4);
    UtilsCV.texture2DToMat(screenshotTexture, imgMat);

    We can't seem to get the combined texture from the YUVMaterial within the UnityARKit scene and send it to OpenCV so that it can spawn an object for us within the ARKit environment. I also read somewhere in this thread that someone tried this and the object seemed to slowly drift away - some problem with positioning?

    We are using Unity 2017.2f with the latest ARKit and OpenCV versions and an iPhone SE, building on an iMac. Basically, we're just trying to get a texture from the camera to use in OpenCV's marker recognition. We'd love to hear a solution, because we've been cracking our brains on this for the past 2 weeks. Thanks in advance!
     
  19. krisventure

    krisventure

    Joined:
    Mar 24, 2016
    Posts:
    118
    I can't see any answer in the FAQ on why ARKit imports with the project settings set to an ARKit-incompatible minimum iOS version (6.0). Should we leave it at 6.0 for some reason, or should we change it to 11.0 in order to stop users from trying to install the app on incompatible devices?
     
  20. Uzi187

    Uzi187

    Joined:
    Mar 21, 2014
    Posts:
    16
    You can use 11.0, but that still won't stop users from installing on incompatible devices, because you can install iOS 11.0+ on devices that do not support ARKit (i.e. that don't have an A9 chip).

    Use the arkit key in Required device capabilities in Info.plist to achieve that.
     

    Attached Files:

    krisventure likes this.
  21. jason_sone

    jason_sone

    Joined:
    Jul 7, 2017
    Posts:
    8
    Probably a newbie question, but I'm stumped. I have a Canvas with some panels and buttons in the bottom 10% of my screen that serves as a menu for users to swap 3D models by tapping the buttons. When I run the app and tap my buttons, the objects change like I want them to, but the new object gets placed where I tapped on the menu. I just want my objects to swap in at the previous object's position without them appearing where I tap. I hope this makes sense. Thanks.
     
  22. effectzero

    effectzero

    Joined:
    Dec 20, 2012
    Posts:
    5
    Yeah, it's a serious drain on productivity. Maybe we can talk Jimmy @jimmya into updating the remote sooner rather than later?!
     
  23. krisventure

    krisventure

    Joined:
    Mar 24, 2016
    Posts:
    118
    Thanks, I've found it! Is it okay to just manually add a line and type in arkit, or is there some setting in Unity or Xcode that will add this item? And there's another list called 'Linked Frameworks and Libraries' which already contains ARKit with 'Required' next to it. Is that not enough?
     
  24. Numa

    Numa

    Joined:
    Oct 7, 2014
    Posts:
    100
    Adding it manually will work, but if I were you I would add it in code in a post-build script (this goes in an Editor folder):

    using System.IO;
    using System.Linq;
    using UnityEditor;
    using UnityEditor.Callbacks;
    using UnityEditor.iOS.Xcode;

    public static class ArkitPlistPostProcess
    {
        [PostProcessBuild(1)]
        public static void ChangeXcodePlist(BuildTarget buildTarget, string pathToBuiltProject)
        {
            if (buildTarget != BuildTarget.iOS)
                return;

            // Get plist file
            string plistPath = Path.Combine(pathToBuiltProject, "Info.plist");
            PlistDocument plist = new PlistDocument();
            plist.ReadFromString(File.ReadAllText(plistPath));

            // Add arkit device capability requirement (create the array if Unity hasn't)
            const string capsKey = "UIRequiredDeviceCapabilities";
            PlistElementArray requirements = plist.root.values.ContainsKey(capsKey)
                ? plist.root[capsKey].AsArray()
                : plist.root.CreateArray(capsKey);

            // Only append "arkit" if it isn't already listed
            if (!requirements.values.OfType<PlistElementString>().Any(s => s.value == "arkit"))
            {
                requirements.AddString("arkit");
            }

            // Write file
            File.WriteAllText(plistPath, plist.WriteToString());
        }
    }

    No, the "framework required/optional" option is a runtime check (can the app still run if ARKit is missing/unsupported or will it crash?), the plist requirement on the other hand is at the app store level, to prevent people from downloading your app. Without the plist modification people with unsupported devices will still be able to download your app, see a black screen and give you terrible reviews like this: Screen Shot 2017-11-08 at 2.31.11 pm.png
     
    Last edited: Nov 8, 2017
    Terry_Stark, Gametyme and jimmya like this.
  25. Numa

    Numa

    Joined:
    Oct 7, 2014
    Posts:
    100
    Ok, makes sense thanks.
     
  26. krisventure

    krisventure

    Joined:
    Mar 24, 2016
    Posts:
    118
    Thanks a lot for that nice piece of code! So if, in a later update, I want to include non-ARKit-compatible devices too, then I'd have to remove 'arkit' from the plist and also set the framework from required to optional in the framework list, right?
     
  27. Numa

    Numa

    Joined:
    Oct 7, 2014
    Posts:
    100
    If you're targeting iOS < 11 then yes, otherwise I don't think you need to (a non-ARKit compatible device can still run iOS 11, so the framework will be present). Have a look at this: https://stackoverflow.com/questions/33038137/xcode-and-optional-frameworks
     
  28. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    The reason you cannot adjust the FOV is that the Unity camera needs to keep the FOV of the device camera, otherwise your virtual objects will not look like they are in the right place compared to the camera feed. That's why we get the projection matrix from ARKit: to match up the Unity scene objects with how they would look from a camera with the properties of the camera on the device.
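
    Roughly, this is what the camera manager in the plugin does every frame (a simplified sketch - check UnityARCameraManager in your copy of the plugin for the exact code):

    using UnityEngine;
    using UnityEngine.XR.iOS;

    // Simplified sketch: the Unity camera is driven entirely by ARKit,
    // so the FOV slider on the Camera component is overwritten every frame.
    public class ARDrivenCamera : MonoBehaviour
    {
        private Camera m_camera;
        private UnityARSessionNativeInterface m_session;

        void Start()
        {
            m_camera = GetComponent<Camera>();
            m_session = UnityARSessionNativeInterface.GetARSessionNativeInterface();
        }

        void Update()
        {
            // Pose of the physical camera, as reported by ARKit
            Matrix4x4 pose = m_session.GetCameraPose();
            m_camera.transform.localPosition = UnityARMatrixOps.GetPosition(pose);
            m_camera.transform.localRotation = UnityARMatrixOps.GetRotation(pose);

            // Projection (and therefore FOV) of the physical camera -
            // this replaces whatever FOV you set in the inspector.
            m_camera.projectionMatrix = m_session.GetCameraProjection();
        }
    }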
     
    KwahuNashoba likes this.
  29. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    You can probably make do with just the Y texture for OpenCV, and you can get that by using the code in UnityARVideo:
    CreateExternalTexture(..), which creates a Texture2D wrapper around the nativeTexture pointer.
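
    Something along these lines (a simplified sketch based on UnityARVideo - the handle struct and field names may differ slightly depending on your plugin revision):

    using System;
    using UnityEngine;
    using UnityEngine.XR.iOS;

    // Sketch: wrap the native Y-plane texture from ARKit in a Texture2D
    // so it can be fed into OpenCV.
    public class YTextureGrabber : MonoBehaviour
    {
        private Texture2D _videoTextureY;

        void Update()
        {
            var session = UnityARSessionNativeInterface.GetARSessionNativeInterface();
            ARTextureHandles handles = session.GetARVideoTextureHandles();
            if (handles.textureY == IntPtr.Zero)
                return; // no camera frame available yet

            if (_videoTextureY == null)
            {
                Resolution res = Screen.currentResolution;
                _videoTextureY = Texture2D.CreateExternalTexture(res.width, res.height,
                    TextureFormat.R8, false, false, handles.textureY);
            }
            else
            {
                // Re-point the wrapper at the latest native texture
                _videoTextureY.UpdateExternalTexture(handles.textureY);
            }

            // _videoTextureY now references the camera's luminance plane;
            // note you will still need a blit/ReadPixels step to get the
            // pixels onto the CPU for OpenCV.
        }
    }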
     
  30. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    https://forum.unity.com/threads/mak...ie-trapping-touch-events.502054/#post-3279523
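
    One common way to handle it (a minimal sketch - see the linked thread for the full discussion): skip the ARKit hit test when the touch is over a UI element.

    using UnityEngine;
    using UnityEngine.EventSystems;

    // Sketch: ignore touches that land on the UI so that tapping a menu
    // button doesn't also reposition the placed object.
    public class PlacementInput : MonoBehaviour
    {
        void Update()
        {
            if (Input.touchCount == 0)
                return;

            Touch touch = Input.GetTouch(0);

            // True when this finger is currently over a Canvas element
            if (EventSystem.current != null &&
                EventSystem.current.IsPointerOverGameObject(touch.fingerId))
                return;

            if (touch.phase == TouchPhase.Began)
            {
                // ... run the usual ARKit hit test / object placement here ...
            }
        }
    }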
     
  31. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
    So, regarding ARKit face tracking for the iPhone X... I am surprised there is no face mesh rig that has all the blend shapes set up?
     
    John1515 likes this.
  32. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
    Is the default face mesh used for ARKit face tracking provided somewhere? (Yes, it is dynamically distorted, but is the neutral-expression face mesh available?)
     
  33. piotrtekien

    piotrtekien

    Joined:
    Feb 16, 2015
    Posts:
    7
    We tried and tested it on the iPhone, but it still doesn't seem to work. We might be doing something wrong; would you be so kind as to take a (quick) look?

    UnityARVideo Script
    https://codeshare.io/aljqyR

    Textur2DMarkerBasedARExample Script
    https://codeshare.io/arOxjA

    PS: We really appreciate the help you're giving us.
     
    Last edited: Nov 8, 2017
    Terry_Stark likes this.
  34. zakmackraken

    zakmackraken

    Joined:
    Jun 16, 2017
    Posts:
    9
    I'm having trouble with AR Remote. I'm not sure it's ever worked.

    I'm using the latest ARKit plugin for Unity from the Asset Store and trying to get AR Remote working with the Editor Test scene. The app runs fine and has a dark background, however it never progresses past "Waiting for editor connection", even though the iOS console says:

    PlayerConnection accepted from [127.0.0.1] handle:0x12


    connected

    UnityEngine.XR.iOS.ConnectToEditor:EditorConnected(Int32)
    UnityEngine.Events.UnityEvent`1:Invoke(T0)


    (Filename: /Users/builduser/buildslave/unity/build/artifacts/generated/common/runtime/DebugBindings.gen.cpp Line: 51)

    The Editor is also waiting for the editor connection, but again the console says it's connected:

    Autoconnected Player connected
    UnityEngine.XR.iOS.ConnectToEditor:EditorConnected(Int32)
    UnityEngine.Events.UnityEvent`1:Invoke(T0)

    (Filename: /Users/builduser/buildslave/unity/build/artifacts/generated/common/runtime/DebugBindings.gen.cpp Line: 51)


    What's going on? Note: I never get prompted to grant camera access in the iOS app. Build settings are set to development (and I've tried both release and debug).

    Thanks.
     
  35. John1515

    John1515

    Joined:
    Nov 29, 2012
    Posts:
    248
    Say I have modelled a 3D face. How do I attach blend shape points to my mesh? I'm only familiar with skinning in 3ds Max, not Unity.
     
  36. John1515

    John1515

    Joined:
    Nov 29, 2012
    Posts:
    248
    Hey, very cool. Could you kickstart us and provide a brief explanation of how to rig a 3D face mesh to blend shapes in Unity? We're having a hackathon here tomorrow; it would be great fun to make faces for the X :D
     
  37. planetmatrix

    planetmatrix

    Joined:
    Dec 8, 2015
    Posts:
    38
    Actually, it's a bug report:
    A Reflection Probe causes the ARKit camera to stutter a lot. On ARCore, everything is smooth.
    Has anybody come across this?
     
  38. Petr777

    Petr777

    Joined:
    Nov 8, 2017
    Posts:
    49
    Hello!

    We are observing jittering of objects in the current version of the plugin (revision 121).
    We add a simple 3D object to the scene, and it jitters.
    We found that there is no jittering if we build the same scene with the version of the plugin from revision 89 (Sep 16).
    We are not using plane detection. We use light estimation.

    Unity 2017.2.0f2, iPhone SE, iPad Pro, Xcode 9.1
     
  39. John1515

    John1515

    Joined:
    Nov 29, 2012
    Posts:
    248
    I fixed it by changing some FixedUpdate() calls to Update().
     
  40. effectzero

    effectzero

    Joined:
    Dec 20, 2012
    Posts:
    5
    Hey man, the key is you have to create the blend shapes (in something like ZBrush or Mudbox) and connect them to your base mesh in something like Maya or Max before you import it into Unity. On Apple's developer site they list and show pictures of each blend shape you need, so you could use that as a reference for sculpting each shape. Or you could get a model, probably on TurboSquid, that already has a bunch of blend shapes made. Or, if you have $200 and it doesn't break the hackathon rules, you can upload a single neutral head mesh to http://www.eiskoservices.com and they will send you back that mesh connected to over a hundred blend shapes that you can use in Unity.

    Once in Unity, it's pretty easy to take their ARKit demo scene for blend shapes, hook into the blend shape data they are printing to the screen, multiply it by 100, and pipe it into your mesh's blend shapes.
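
    The wiring for that last step can be as small as this (a rough sketch based on the plugin's face-tracking example - the event and field names may differ a bit in your plugin version, and it assumes your mesh's blend shape names match the ARKit coefficient names):

    using UnityEngine;
    using UnityEngine.XR.iOS;

    public class BlendShapeDriverSketch : MonoBehaviour
    {
        public SkinnedMeshRenderer faceRenderer;

        void Start()
        {
            UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += FaceUpdated;
        }

        void OnDestroy()
        {
            UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent -= FaceUpdated;
        }

        void FaceUpdated(ARFaceAnchor anchorData)
        {
            // ARKit reports each coefficient in 0..1; Unity blend shape weights are 0..100.
            foreach (var coeff in anchorData.blendShapes)
            {
                int index = faceRenderer.sharedMesh.GetBlendShapeIndex(coeff.Key);
                if (index >= 0)
                    faceRenderer.SetBlendShapeWeight(index, coeff.Value * 100f);
            }
        }
    }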

    Hope that helps, and have fun with the hackathon!
     
    Gametyme likes this.
  41. curiousbrandon

    curiousbrandon

    Joined:
    Sep 22, 2017
    Posts:
    6
    I have a question about the new ARInterface (this seems like the most relevant thread). When I incorporate this into my current project, the video does not render properly, even with the example scenes (see attached screenshot or this link). However if I build it directly using the example project (as its own project), the video shows up fine. Any idea what might be causing this and where I should look?

    FYI, my project had a previous version of the ARKit plugin, which I had overwritten with the files from the ARInterface example.
     

    Attached Files:

  42. domdev

    domdev

    Joined:
    Feb 2, 2015
    Posts:
    375
    Hi, will ARKit Remote work with an older iPhone, like the iPhone 4s?
     
  43. suxinren

    suxinren

    Joined:
    Jan 13, 2017
    Posts:
    14
    Hi jimmya, I ran the Unity ARKit examples (UnityAROcclusion, UnityARShadows) on an iPhone 7, but the background is always light green. Is this normal?
     
  44. krisventure

    krisventure

    Joined:
    Mar 24, 2016
    Posts:
    118
    Trying to upload my build to iTunes Connect but getting this error: "This bundle is invalid. The key UIRequiredDeviceCapabilities contains value 'arkit' which is incompatible with the MinimumOSVersion value of '8.0'."
    It sounds like the minimum iOS version doesn't sort itself out just because ARKit needs 11.0. Should I change it to 11 in Player Settings, or is there another way?

    The bigger problem is that once I set the minimum iOS version to 11.0, Xcode suddenly fails to archive because it says 'Invalid iOS deployment version '-miphoneos-version-min=11.0', iOS 10 is the maximum deployment target for 32-bit targets'. This would suggest my build is 32-bit, but it's not: in Unity I have Architecture set to Universal and the scripting backend set to IL2CPP, and that should be enough for a 64-bit build. And in Xcode it seems to say the architectures are armv7 and arm64 (see screenshot). One of the warnings says 'Check dependencies.
    warning: iOS 11.0.0 does not support armv7.' Any clue about this?
     

    Attached Files:

    Last edited: Nov 9, 2017
  45. krisventure

    krisventure

    Joined:
    Mar 24, 2016
    Posts:
    118
    I've just hit the same error message, and I do have Universal selected and the IL2CPP backend too. After I added arkit to my required device capabilities list and tried archiving the app, it returned the same error: "Invalid iOS deployment version '-miphoneos-version-min=11.0', iOS 10 is the maximum deployment target for 32-bit targets." I'd previously archived successfully before adding arkit to the plist. Any idea what's going on here? Maybe another Unity setting that makes the build somehow 32-bit despite the correct architecture setting?

    EDIT: Unity staff answered on another thread: "We are fixing this. If you are targeting iOS 11.0 and later, just pick "ARM64 only" as architecture."

    The workaround does indeed work until they fix it.
     
    Last edited: Nov 9, 2017
  46. lonelyaliennathan

    lonelyaliennathan

    Joined:
    Nov 8, 2017
    Posts:
    1
    I'm having some trouble with the ARKit Remote connection.

    I'm running Unity 2017.2.0f3 with ARKit Unity plugin version 1.0.11.

    I have built the UnityARKitRemote scene to my iPad Pro (latest model as of writing, on iOS 11.1) with Xcode 9.
    When I start this app, I get a black screen with the message "waiting for editor connection". I then open the default UnityARKitScene scene and add the remote connection prefab. I run the scene in the editor and connect my device. I start the remote connection session.

    In the editor I can see the camera feed with the point cloud data and the planes, but on my mobile device I still see the black screen with the same message. This seems to make the connection pointless, as I can't interact with the scene.

    Any idea how to fix this? (The same happens with my iPhone on iOS 11.1, by the way.)
     
  47. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    You seem to have a lot of problems with your project. Could you start from scratch in a new folder? It looks like you might have missing files or references. Green texture usually means you are not getting any texture updates to your UnityARVideo.
     
  48. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    @christophergoy says you cannot use Universal if you have set the minimum OS version to iOS 11, so you have to select ARM64.
     
  49. krisventure

    krisventure

    Joined:
    Mar 24, 2016
    Posts:
    118
  50. RyanYN

    RyanYN

    Joined:
    Aug 13, 2013
    Posts:
    20
    My iPad (5th gen) on iOS 11.1 can't do face tracking with ARKit - can only the iPhone X do that?
    In the Apple store comparison, every iPad's specs list face detection support, so I'm confused.
    https://www.apple.com/ipad/compare/
     