
ARKit support for iOS via Unity-ARKit-Plugin

Discussion in 'ARKit' started by jimmya, Jun 5, 2017.

  1. deraggi

    deraggi

    Joined:
    Apr 29, 2016
    Posts:
    85
    Hi,

    it took me forever to find this information, as I was struggling with ARKit using OpenGLES. I'd suggest adding this to the description page of the plugin.
     
  2. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    730
    Just change the tiling parameters on the material according to the scale of the plane.
     
    OLGV likes this.
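    To illustrate the tiling advice above, here is a minimal hedged sketch. It assumes you read a 2D extent off your plane anchor yourself; `UpdateTiling`, `planeMaterial` and `tilesPerMeter` are illustrative names, not part of the plugin.

```csharp
// Sketch: scale a material's tiling with the detected plane's size so a
// tileable texture (e.g. a grid) keeps a constant world-space density.
using UnityEngine;

public class PlaneTiling : MonoBehaviour
{
    public Material planeMaterial;   // material with a tileable texture
    public float tilesPerMeter = 2f; // desired texture density

    // Call this whenever the plane's extent changes
    // (e.g. from an anchor-updated event).
    public void UpdateTiling(Vector2 planeExtent)
    {
        planeMaterial.mainTextureScale = planeExtent * tilesPerMeter;
    }
}
```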
  3. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    730
  4. 3d_Artist1987

    3d_Artist1987

    Joined:
    Jul 9, 2012
    Posts:
    640
    Does ARFoundation support ARKit 2.0 (beta)?
     
  5. TijsVdV

    TijsVdV

    Joined:
    Mar 7, 2017
    Posts:
    11
    I am using ARFoundation and I want to start a new AR session. I start my scene without an ARSession object and instantiate one when I press a button. This works fine. But when I want to start a new session, I destroy my ARSession object and instantiate a new one, and that gives me the error: InvalidOperationException: Cannot start AR session because there is no session subsystem. What am I doing wrong? I do still keep the same AR Session Origin object.

    I just tested by putting everything AR-related in one separate scene. When I load that scene for the first time, all is fine. But when I then go back to my main scene and back to the AR scene, I get the same error.
     
    Last edited: Jun 22, 2018
  6. OLGV

    OLGV

    Joined:
    Oct 4, 2012
    Posts:
    25
    You mean ARFoundation instead of ARInterface?

    ARFoundation is a package exposing the features that ARKit and ARCore have in common (e.g. plane detection, light estimation, hit testing, AR scaling). The advantage is that what you implement with ARFoundation stays valid when you switch between the two platforms: you use one feature and it works on both iOS and Android once you build.

    If you want very new or specific features that one provider (Apple or Google) offers but the other doesn't, then you have to use the provider-specific package (ARKit or ARCore) for the device you build for. For example, AREnvironmentProbeAnchor (AR reflection probes) is supported by ARKit only at the moment, so in that case you have to use the ARKit package.
    You can also use them together: take the common functionality from ARFoundation (which you know will work on both platforms) and bring in ARKit or ARCore for the very specific features.
     
    rob_ice and JelmerV like this.
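    The cross-platform approach described above can be sketched roughly as follows. This assumes an early (1.x) ARFoundation API; the exact event and type names vary between package versions, so treat them as placeholders.

```csharp
// Hedged sketch: subscribe to plane detection via ARFoundation, so the same
// callback runs whether the underlying provider is ARKit or ARCore.
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class PlaneLogger : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager; // lives on the AR Session Origin

    void OnEnable()  { planeManager.planeAdded += OnPlaneAdded; }
    void OnDisable() { planeManager.planeAdded -= OnPlaneAdded; }

    void OnPlaneAdded(ARPlaneAddedEventArgs args)
    {
        Debug.Log("Plane detected at " + args.plane.transform.position);
    }
}
```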
  7. adrian-taylor09

    adrian-taylor09

    Joined:
    Dec 22, 2016
    Posts:
    3
    What is the accepted way to persist an ARKit session across scenes?
    In my app I have Scene A with no AR functionality and scene B that uses ARKit. I want to be able to switch from Scene B to Scene A and then back to Scene B again and keep the same tracking data....
     
  8. OLGV

    OLGV

    Joined:
    Oct 4, 2012
    Posts:
    25
    Could you start with scene B and then load scene A with
    LoadSceneMode.Additive
    , discarding it at a later stage?

    I'm not sure you can disable and re-enable AR tracking for now; as I understand it, AR tracking is requested by Unity from the OS and can only be requested when the app is starting - but I might be wrong on this, as things advance fast.
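    The additive-scene idea above can be sketched like this: the AR scene stays loaded (so the session keeps tracking) while a non-AR scene is layered on top and later removed. Scene names are placeholders.

```csharp
// Sketch: keep the AR scene (B) alive while showing a non-AR scene (A) on top.
using UnityEngine;
using UnityEngine.SceneManagement;

public class SceneSwitcher : MonoBehaviour
{
    // Call from Scene B (the AR scene) to show Scene A without unloading B.
    public void ShowSceneA()
    {
        SceneManager.LoadScene("SceneA", LoadSceneMode.Additive);
    }

    // Later, drop Scene A again; the AR session in Scene B was never interrupted.
    public void HideSceneA()
    {
        SceneManager.UnloadSceneAsync("SceneA");
    }
}
```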
     
  9. petey

    petey

    Joined:
    May 20, 2009
    Posts:
    1,204
    Hi there,
    Just jumped back onto this after a bit of a break and my project won't work, due to me using iVidCapPro (which doesn't run on Metal).
    I hadn't realised ReplayKit was supported by Unity. Just wondering: does that have a way of recording only a portion of the screen? I might be wrong, but I feel that is something a lot of the people using iVidCapPro are trying to achieve.

    Thanks,
    Pete
     
    Last edited: Jun 24, 2018
  10. novaVision

    novaVision

    Joined:
    Nov 9, 2014
    Posts:
    253
    While creating a fairly complex AR experience and trying to make it testable in the editor, I ran into three issues with ARKit:
    1. The ARPlaneAnchor identifier randomly becomes empty or takes some other value. The only point at which I can get its correct ID is the event where it is first detected. It's a known issue, mentioned here. Furthermore, the ARPlaneAnchor class instance is replaced - I try to cache it and compare later, but the cached instance != the compared instance. That is really annoying. Why does that happen, and when will it be fixed?

    2. In some cases I need to disable session tracking by setting plane detection to none. Using UnityARAnchorManager, I found that after restoring detection (horizontal/vertical/both), previously detected planes are ignored and new planes are detected in the same place. Is that expected behaviour or some kind of bug?

    3. In my app I switch the plane detection mode at runtime from horizontal to vertical and then to both, and I tried to set up remote testing for that. Initially, each call to
      Code (CSharp):
      UnityARSessionNativeInterface.RunWithConfigAndOptions > CreateRemoteWorldTrackingConnection
      creates a new ARKitRemoteConnection gameObject. That caused many troubles, so I tried to re-initialize the existing object by calling the
      Code (CSharp):
      ARKitRemoteConnection.SendInitToPlayer()
      method. That changed the detection options correctly, but on each new plane detection the ARAnchorAddedEvent is called more than once, depending on how many times I changed detection and re-initialised the remote connection component. Is there any solution or another approach to implement my idea?
     
  11. BrandonYJ

    BrandonYJ

    Joined:
    Jan 5, 2017
    Posts:
    3
    Summary: I downloaded the ARKit 2 object scanning sample from https://developer.apple.com/documentation/arkit/scanning_and_detecting_3d_objects . After I build it in Xcode to my iPad on iOS 12 beta, it keeps crashing.


    Steps to Reproduce: Download the sample code from https://developer.apple.com/documentation/arkit/scanning_and_detecting_3d_objects > build with Xcode 10.0 beta to an iPad on iOS 12 beta > open the app and it crashes.


    Expected Results: to be able to open the app and scan the object.


    Actual Results: can't even open the app, since it crashes every time it is opened.


    Version/Build: 12
     
  12. Steamc0re

    Steamc0re

    Joined:
    Nov 24, 2014
    Posts:
    106
    I am having an issue with ARKit Remote in 2017.4.5f1 and iOS 11.4, latest Xcode.

    It connects and sends video, but instead of sending data and updating the camera, planes, and points, the video has a one-second "ghost", as you can see in the image below. This after-image stays for about a second and then updates again. Has anyone else had this problem?

    upload_2018-6-26_11-32-21.png
     
  13. novaVision

    novaVision

    Joined:
    Nov 9, 2014
    Posts:
    253
    That is a known issue caused by too much data being sent to the editor. Check this solution; it will help remove the "ghost", but unfortunately it will still be very laggy.
     
    Steamc0re likes this.
  14. Pilltech101

    Pilltech101

    Joined:
    Jul 25, 2014
    Posts:
    11
    Would anyone have an idea of how to capture an image using ARKit? In the past I have used
    WebCamTextures to take pictures, but if my info is correct, WebCamTextures do not work well with an ARKit scene.
     
    Last edited: Jun 26, 2018
  15. dyuldashev

    dyuldashev

    Joined:
    Mar 3, 2016
    Posts:
    54
    I have a very simple question. I've built my whole project using ARInterface. The examples in the GitHub repo use GUI instead of UI. I changed everything to UI, but the ground plane continues to create objects when I press UI buttons/sliders. Does that mean the ARGameObject layer (layer #11) is being rendered before the UI layer (layer #5)? How can I render the UI layer above all other layers, so that when I press a UI button, what's behind the button isn't affected? Thanks.
     
  16. Griffo

    Griffo

    Joined:
    Jul 5, 2011
    Posts:
    652
    Can't you just use 'IsPointerOverUIObject'?

    Code (CSharp):
    private void Update ()
    {
        if (!IsPointerOverUIObject() && Input.GetMouseButtonDown(0) && !GlobalVarsScript.scenePlaced)
        {
            Debug.Log("A0 .. placed scene .. ");

            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            RaycastHit hit;

            if (Physics.Raycast(ray, out hit, maxRayDistance))
            {
                Debug.Log(" XX .. RaycastHit hit Layer is .. " + hit.transform.gameObject.layer + "    RaycastHit hit name is .. " + hit.transform.name);
            }
            // we'll try to hit one of the plane collider gameobjects that were generated by the plugin
            // effectively similar to calling HitTest with ARHitTestResultType.ARHitTestResultTypeExistingPlaneUsingExtent
            if (Physics.Raycast(ray, out hit, maxRayDistance, collisionLayer))
            {
                // we're going to get the position from the contact point
                m_HitTransform.position = hit.point;
                Debug.Log(string.Format("x:{0:0.######} y:{1:0.######} z:{2:0.######}", m_HitTransform.position.x, m_HitTransform.position.y, m_HitTransform.position.z));

                // and the rotation from the transform of the plane collider
                m_HitTransform.rotation = hit.transform.rotation;
                StartCoroutine(Wait());
            }
        }
    }

    // Taken from http://answers.unity3d.com/questions/1115464/ispointerovergameobject-not-working-with-touch-inp.html#answer-1115473
    private bool IsPointerOverUIObject ()
    {
        PointerEventData eventDataCurrentPosition = new PointerEventData(EventSystem.current);
        eventDataCurrentPosition.position = new Vector2(Input.mousePosition.x, Input.mousePosition.y);
        List<RaycastResult> results = new List<RaycastResult>();
        EventSystem.current.RaycastAll(eventDataCurrentPosition, results);
        return results.Count > 0;
    }
     
    Burglecut likes this.
  17. HisDarkHumour

    HisDarkHumour

    Joined:
    Aug 23, 2017
    Posts:
    2
    Hi, thanks for the help I received here; it got me past some early mind-boggling questions. I've released a game on iOS. I hope you guys will check it out, it's free! And I'm always interested in feedback for future versions.

    AppStore.com/MechagamiWorld

     
    castana1962 likes this.
  18. astorms

    astorms

    Joined:
    Jan 31, 2018
    Posts:
    13
    Did you ever figure this out? I have the same problem and want to use Unity coordinates to place the anchors.
     
  19. astorms

    astorms

    Joined:
    Jan 31, 2018
    Posts:
    13
    I'm trying to figure out how to use the UnityARUserAnchorComponent effectively. My understanding is that I should place one of these components on every virtual object that I want to be "attached" to the real world. In my app, I want to use one image target to generate several large virtual objects (approximately 5m x 5m) at fixed locations throughout a large area (100m x 50m).
    1. Would instantiating those objects (with a UnityARUserAnchorComponent attached) at the beginning of the session using Unity coordinates "plant" those objects in the real world realistically as I walk among them? Or are they only meant to be placed within a few feet of the ARKit camera?
    2. In the default UnityARUserAnchorComponent, the ARUserAnchorUpdatedEvent delegate does not have any code in it. Are we expected to update the gameObject's transform in that method to keep the gameObject placed most accurately in the real world? Or is that done automatically under the hood?
     
  20. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    730
    Please follow along with the documentation and examples given. This is the basic stuff that is provided with the examples.
     
  21. unity_tqlj7BfvM9vR_A

    unity_tqlj7BfvM9vR_A

    Joined:
    Mar 1, 2018
    Posts:
    4
    I am running into random crashes (roughly 1 in 100) when switching models within a Unity ARKit app. The app spawns a container with different models via the GenerateImageAnchor script and then switches child models with a small animation within that container (so the container/parent remains the same). See the crash log below.

    The error seems to have something to do with: Termination Reason: Namespace SIGNAL, Code 0xb, and it crashes in a thread named "Dispatch queue: com.Metal.CompletionQueueDispatch", which has to do with CoreFoundation and CoreVideo?

    Any ideas on how to fix these randomly occurring errors? It doesn't seem to trigger any null reference exceptions, so that cannot be the problem.

    Thanks in advance!

    Code (CSharp):
    {"app_name":"ardisplay","timestamp":"2018-07-01 12:34:26.37 +0200","app_version":"0.1","slice_uuid":"ef2c8a0a-fb23-3b56-9d46-11b679551981","adam_id":0,"build_version":"0","bundleID":"com.fx.ardisplay","share_with_app_devs":true,"is_first_party":false,"bug_type":"109","os_version":"iPhone OS 11.3 (15E5216a)","incident_id":"E9A9720C-A866-4DF9-8503-42EFF223B946","name":"ardisplay"
     
  22. JelmerV

    JelmerV

    Joined:
    Nov 29, 2012
    Posts:
    156
    Is any ARKit plugin compatible with the Lightweight render pipeline yet?
     
  23. JelmerV

    JelmerV

    Joined:
    Nov 29, 2012
    Posts:
    156
    I think you have to keep tracking alive for persistence across scenes (unless using ARKit 2 with ARWorldAnchor, of course).
     
  24. JelmerV

    JelmerV

    Joined:
    Nov 29, 2012
    Posts:
    156
    The easiest is taking a screenshot: https://docs.unity3d.com/ScriptReference/ScreenCapture.CaptureScreenshot.html
    But if you don't want the UI visible, I don't know. I tried to solve that too, but haven't succeeded yet.
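    One possible way around the "UI visible in the screenshot" problem mentioned above is to hide the Canvas for a single frame and read the back buffer manually. A hedged sketch, where `uiCanvas` is a placeholder for your own UI root:

```csharp
// Sketch: capture the screen without the UI by disabling the Canvas for one
// frame, then reading the rendered frame with Texture2D.ReadPixels.
using System.Collections;
using UnityEngine;

public class UIlessScreenshot : MonoBehaviour
{
    public Canvas uiCanvas;

    public void TakeScreenshot()
    {
        StartCoroutine(Capture());
    }

    IEnumerator Capture()
    {
        uiCanvas.enabled = false;             // hide UI
        yield return new WaitForEndOfFrame(); // let a UI-free frame render

        // Read the back buffer into a texture.
        var tex = new Texture2D(Screen.width, Screen.height, TextureFormat.RGB24, false);
        tex.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0);
        tex.Apply();

        uiCanvas.enabled = true;              // restore UI

        // tex now holds the AR view without UI; encode/save as needed.
        System.IO.File.WriteAllBytes(
            Application.persistentDataPath + "/shot.png", tex.EncodeToPNG());
        Destroy(tex);
    }
}
```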
     
  25. astorms

    astorms

    Joined:
    Jan 31, 2018
    Posts:
    13
    I have done so extensively, and I know how it "works" from the Unity standpoint, but here's what I want to understand: if I instantiate a world 50m x 50m with an object at the edge (thus 50m away), will ARKit keep that object in place in the real world based on THAT object's position (0, 50), or is it only truly tracking the initial object at (0, 0), and therefore will only keep the far object in place relative to the origin object?
     
  26. pretender

    pretender

    Joined:
    Mar 6, 2010
    Posts:
    810
    Hi guys!
    I was running the examples using the UnityARGeneratePlane script, and I was wondering if there is any possibility of getting a more detailed mesh or better precision? Is there any way to improve this?

    Thanks!
     
  27. Tuitive

    Tuitive

    Joined:
    Mar 25, 2013
    Posts:
    31
    How can I get the current AR session's config settings? I expected to be able to use
     UnityARSessionNativeInterface.GetARSessionNativeInterface().configuration
    or something like that.
     
  28. SE_BERLIN

    SE_BERLIN

    Joined:
    Nov 24, 2017
    Posts:
    6
    I just downloaded the latest version from Bitbucket, including the ARKit 2 samples.

    I created a simple UI: just one persistent button which loads the next scene from the build settings. Works fine, until I load any scene using face tracking. For example, the BlendShape sample scene loads and works, but trying to load any other scene after it (or after any other face tracking sample) results in a crash on the device.

    Xcode only logs this:
    libc++abi.dylib: terminating with uncaught exception of type Il2CppExceptionWrapper

    (lldb) Message from debugger: Terminated due to signal 6

    and points to main.mm line 33
    UIApplicationMain(argc, argv, nil, [NSString stringWithUTF8String: AppControllerClassName]);

    which is not helpful.

    I could not find any information on how to explicitly stop a session, only a note not to DontDestroyOnLoad any AR object, which doesn't seem to make much sense with dozens of scenes.

    Any tips on this?

    I'm only asking because it's helpful to toggle quickly through all the demo scenes on the device.
     
  29. Studio-Raef

    Studio-Raef

    Joined:
    Jul 1, 2013
    Posts:
    5
    I was wondering if you can use realtime Global Illumination with ARKit. I've got my scene set up to do that, but it seems like it's not using the skybox to illuminate my object.

    I can get it to work with baked GI, but realtime would make more sense here. The ambient AR light makes everything look a bit bland, and I was wondering if I could use an image to add that extra level of realism, but I can't seem to find any information from people who use it.
     
  30. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    730
    ARKit 2.0 beta has environment probe anchors - check them out to do more realistic lighting and reflectivity. See https://forum.unity.com/threads/arkit-2-0-beta-support.534639/ for more info. See https://twitter.com/jimmy_jam_jam/status/1017585612499533824 as well for another example
     
  31. Studio-Raef

    Studio-Raef

    Joined:
    Jul 1, 2013
    Posts:
    5
  32. MarsGus

    MarsGus

    Joined:
    Mar 27, 2016
    Posts:
    5
    I have some problems... I tried the tips from this thread, but they don't seem to work for me. Does anybody have an idea?
    I want to use the ARKit plugin I downloaded via the Asset Store.

    This is what I have done:
    • Using Unity version 2018.1.7f1
    • Downloaded the ARKit Unity plugin into my project
    • Installed Unity Remote 5 on an iPhone SE
    • In the Editor settings in Unity, selected my iPhone SE
    • Built UnityARKitRemote on Debug as a Development Build
      • built this in a test folder, not in my project folder
    • Loaded the EditorTestScene
    • Started Unity Remote on my iPhone
    • Then hit Play in the editor
    • The screen in the editor turns green, and also on my iPhone. I see the message "connect player" in the Console menu.
    • I also receive a lot of error logs about "Screen position out of view frustum". I do not know why.
    • When I select my iPhone in the Console menu, I receive the error "failed to connect to player".
    I think I am following the correct guide, but am I doing something wrong?
     
  33. MarsGus

    MarsGus

    Joined:
    Mar 27, 2016
    Posts:
    5
    Found the solution for this via https://bitbucket.org/Unity-Technol...mits/9ce304c8068fd70473007d576331d862a1ceb094
    But I am still not able to connect to my iPhone.
     
  34. Studio-Raef

    Studio-Raef

    Joined:
    Jul 1, 2013
    Posts:
    5
    I was wondering about RunWithConfigAndOptions().

    Is this the best way to change the detection mode (horizontal, vertical, ...)?

    In the editor it keeps creating new ARKitWorldTrackingRemoteConnection objects; does RunWithConfig create extra objects in builds as well?

    Do we have to delete these manually?
    I was also wondering about image recognition. Once I call RunWithConfig, I lose image detection.
    I am guessing this is because the detection images are set on the ARCameraManager.
     
  35. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    730
    RunWithConfig and RunWithConfigAndOptions are both expensive operations, and I would try to minimize the number of calls I make to them. It might be better to detect all the stuff you need instead of doing them piecemeal, unless you have distinct modes of the app which require widely varying configurations.

    When creating the config that you are going to use for either of the above, you just need to set the parameter for detectionImages on that config. ARCameraManager just happens to do that.
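    A hedged sketch of that advice, based on the Unity ARKit plugin API as of mid-2018. The field names (planeDetection, arResourceGroupName) and enum values may differ between plugin versions, and `SwitchDetection` / `detectionImagesGroupName` are illustrative names, so check them against your copy of the plugin.

```csharp
// Sketch: re-run the session with a config that keeps image detection alive,
// by setting the reference image group on the config itself rather than
// relying on ARCameraManager to do it.
using UnityEngine.XR.iOS;

public static class SessionReconfig
{
    public static void SwitchDetection(UnityARPlaneDetection planeDetection,
                                       string detectionImagesGroupName)
    {
        var config = new ARKitWorldTrackingSessionConfiguration();
        config.planeDetection = planeDetection; // e.g. Horizontal, Vertical
        config.alignment = UnityARAlignment.UnityARAlignmentGravity;
        config.enableLightEstimation = true;

        // Keep image detection across the reconfigure (assumed field name).
        config.arResourceGroupName = detectionImagesGroupName;

        UnityARSessionNativeInterface.GetARSessionNativeInterface()
            .RunWithConfig(config);
    }
}
```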
     
  36. davidvjc

    davidvjc

    Joined:
    Mar 12, 2018
    Posts:
    1
    Hey guys - my hit tests seem to go through my UI. So whenever I press a UI button it registers as a hit. I've tried the standard "EventSystem.current.IsPointerOverGameObject" but it seems to have no effect. Any help would be appreciated! ;)
    Thanks.
     
  37. Griffo

    Griffo

    Joined:
    Jul 5, 2011
    Posts:
    652
    Look at my post further up this page ..
     
  38. ThoHer

    ThoHer

    Joined:
    Feb 27, 2018
    Posts:
    10
    Hey there!

    I have an important question about - you'll guess - ARKit.

    I'm working with Unity3D, using C# and the ARKit plugin (2.0 from GitHub).

    In my current application I'm using ARKit for measuring distances. The tool I'm creating needs ARKit only for this reason, so I was wondering how I could enable ARKit when the user needs the ruler and disable it when they don't.

    I want to avoid losing performance while the user is using a non-ARKit tool. Am I right in saying that ARKit keeps running in the background once it has been initialized? I'm new to ARKit, so I don't have a perfect overview of how to handle it.

    Dropping some code lines makes no sense here; it's basically the plugin imported into the project, plus my own script which depends on some of its functions - I didn't change anything in the source code of the plugin. The measuring tool itself works pretty well, but I could not determine how to activate and deactivate ARKit itself.

    Can someone help me out with this? Just disabling the GameObjects seems a "dirty" way of avoiding that functionality, and I have to make it clean (for example, the video feed in the background also needs to be disabled - and I guess the ARKit functions will not be paused or disabled just because some scripts are disabled; it seems the API still runs in the background, because it lags when I do so).

    If you need more information, please let me know. Any help or suggestion would be very welcome.

    Thanks a lot!
     
    Last edited: Jul 24, 2018
  39. jcarpay

    jcarpay

    Joined:
    Aug 15, 2008
    Posts:
    502
    I'd like to know this as well. How can I prevent the editor from creating an ARKitWorldTrackingRemoteConnection?
     
  40. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    730
    It only creates that GameObject in the Editor, where it is used to help you easily connect to ARKit Remote. The GameObject will not be created in your final build.
     
    Burglecut and rob_ice like this.
  41. selinkampa

    selinkampa

    Joined:
    May 2, 2018
    Posts:
    2
    Hi all,
    I have a problem with the image anchor in ARKit. It does not really anchor the models at their origin, and the models move even though the image does not move. It seems like the model is anchored between the camera and the image. The rotation of the model also changes when you move the camera away from the anchor image; in effect, the model lies along the line between the camera and the anchor image.
    Does anyone have a solution for this problem, or has had a similar one? I already tried different Unity versions, but that did not solve it. I use an iPad Pro with iOS 11.4 and a MacBook Pro 15.
    Thanks in advance for the help.
     
  42. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    154
    I'm struggling with this too. How do I stop generating planes? And how do I clear all the old planes and start detecting new ones?
     
  43. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    730
    The config and RunWithConfigAndOptions are your friends - look at how they work and you should be able to do both.
     
    Burglecut likes this.
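    The config/RunWithConfigAndOptions approach suggested above might look roughly like this. Enum and field names follow the Unity ARKit plugin as of mid-2018 and may differ slightly in your version; `StopPlaneDetection` / `RestartPlaneDetection` are illustrative names.

```csharp
// Sketch: stop detecting new planes, or clear existing planes and start fresh,
// by re-running the session with a new config and run options.
using UnityEngine.XR.iOS;

public static class PlaneDetectionControl
{
    // Stop detecting new planes: re-run the session with plane detection off.
    public static void StopPlaneDetection()
    {
        var config = new ARKitWorldTrackingSessionConfiguration();
        config.planeDetection = UnityARPlaneDetection.None;
        UnityARSessionNativeInterface.GetARSessionNativeInterface()
            .RunWithConfig(config);
    }

    // Clear old planes and detect new ones: pass run options that remove
    // existing anchors (and optionally reset tracking).
    public static void RestartPlaneDetection()
    {
        var config = new ARKitWorldTrackingSessionConfiguration();
        config.planeDetection = UnityARPlaneDetection.Horizontal;
        UnityARSessionNativeInterface.GetARSessionNativeInterface()
            .RunWithConfigAndOptions(config,
                UnityARSessionRunOption.ARSessionRunOptionRemoveExistingAnchors |
                UnityARSessionRunOption.ARSessionRunOptionResetTracking);
    }
}
```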
  44. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    730
    You might be seeing a couple of problems. First of all, you should probably try ARKit 2.0 and the latest iOS 12 beta to get better image recognition and tracking. Secondly, your image definition should include a "physical size" so that ARKit can accurately figure out where your image is.
     
  45. selinkampa

    selinkampa

    Joined:
    May 2, 2018
    Posts:
    2
    Using the Unity plugin for ARKit 2.0, I got these errors from Xcode when trying to build. I am using Xcode 9.4.1.
    Screen Shot 2018-07-26 at 08.58.25.png
    I assumed that the unit of the physical size is meters, so I set it to the length of the side of my printed picture, but that did not change the result either.
     
    Last edited: Jul 26, 2018
  46. ThoHer

    ThoHer

    Joined:
    Feb 27, 2018
    Posts:
    10
    Any ideas about my question above (#2738)?

    Would be awesome.
     
  47. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    730
    Use Xcode 10 beta.
     
    rob_ice likes this.
  48. JelmerV

    JelmerV

    Joined:
    Nov 29, 2012
    Posts:
    156
  49. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    730
    I don't think we have an existing thing for that. You could probably use a spotlight with a cookie texture to do something similar (you might need a special shader that shows the cookie on transparent meshes, similar to the shadow shader).
     
  50. Lelon

    Lelon

    Joined:
    May 24, 2015
    Posts:
    61
    Hey guys,
    I got the eye and tongue tracking demo working on my iPhone X (iOS 12), Unity 2018 and Xcode 10.0.
    However, when using a remote connection, I see the video feed but no eye tracking, only tongue. Does anyone know what the issue is?