
ARKit support for iOS via Unity-ARKit-Plugin

Discussion in 'ARKit' started by jimmya, Jun 5, 2017.

  1. inihility

    inihility

    Joined:
    Oct 17, 2014
    Posts:
    7
  2. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    Any idea on the specs needed to run the remote? My MacBook Pro from late 2013 apparently can't handle it; it has a GT 750M with 2GB, 16GB of RAM and a 2.3GHz i7. Perhaps together as a community we can figure out what specs are needed! :)
     
  3. MattMurphy

    MattMurphy

    Joined:
    Dec 24, 2013
    Posts:
    109
    For me, remote performance depends more on the version of the plug-in being used, but it can still be temperamental. At first it was pretty poor, then it improved quite a bit, and since the spring update it seems fairly stable on a 2011 MBP (i5 2.4GHz, 8GB RAM, Intel graphics with 500MB).
     
  4. Mr_Saboteur

    Mr_Saboteur

    Joined:
    Feb 2, 2016
    Posts:
    6
    I'm using ARAnchors; I try to activate/deactivate content depending on which image ARKit recognizes. When the user first scans an image it triggers AddImageAnchor, and when the same image is recognized again, the event called is UpdateImageAnchor, which makes sense. However, I realized that if I move around in world space, UpdateImageAnchor is also triggered as ARKit gets more information about the space. I'm trying to figure out how to know whether it updated the anchor because of new world information or because it recognized the image again. Can anyone point me in the right direction?
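    Something like this is what I had in mind - an untested sketch where the 5cm jump threshold is an arbitrary guess, and I'm keying on the anchor's reference image name:

    ```csharp
    // Untested sketch: treat an UpdateImageAnchor call as a "re-recognition" only
    // when the anchor pose has actually jumped; tiny corrections are more likely
    // world-map refinement. The 0.05f threshold is an arbitrary assumption - tune it.
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.iOS;

    public class ImageAnchorFilter : MonoBehaviour
    {
        readonly Dictionary<string, Vector3> lastPositions = new Dictionary<string, Vector3>();
        const float jumpThreshold = 0.05f; // metres (assumption)

        void OnEnable()
        {
            UnityARSessionNativeInterface.ARImageAnchorUpdatedEvent += OnImageAnchorUpdated;
        }

        void OnDisable()
        {
            UnityARSessionNativeInterface.ARImageAnchorUpdatedEvent -= OnImageAnchorUpdated;
        }

        void OnImageAnchorUpdated(ARImageAnchor anchor)
        {
            Vector3 pos = UnityARMatrixOps.GetPosition(anchor.transform);
            Vector3 previous;
            if (lastPositions.TryGetValue(anchor.referenceImageName, out previous) &&
                Vector3.Distance(previous, pos) > jumpThreshold)
            {
                Debug.Log("Anchor moved significantly - probably a fresh recognition");
            }
            lastPositions[anchor.referenceImageName] = pos;
        }
    }
    ```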
     
  5. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    677
    @jimmya haven't heard from you for over a week - are you ok
     
  6. inihility

    inihility

    Joined:
    Oct 17, 2014
    Posts:
    7
    I was able to run it on an older 4K iMac (performance was not great) but my current machine is a 2017 iMac with the following specs:
    • 2.3GHz dual-core 7th-generation Intel Core i5 processor
    • Turbo Boost up to 3.6GHz
    • 16GB 2133MHz memory
    • Intel Iris Plus Graphics 640
    The above machine runs it at an acceptable FPS, but I still get a delay in updating the objects in the scene (I'm assuming it's because the data is being fed from the phone). To be fair, though, I've also had ARKit Remote just stop working for completely different reasons, and it took a combination of reopening Unity, restarting the phone, etc. to get it functional again.

    Hoping for more updates on remote, and to see it perform better and more stably.
     
  7. Go_Sato

    Go_Sato

    Joined:
    May 21, 2017
    Posts:
    4
    Hi, I'm having trouble with Unity ARKit Remote.
    I am trying to use the remote function. The connection succeeds, but the result rendered on the PC is incorrect. After the camera image appears on screen for only one frame, images with incorrect color space are displayed as overlays. The feature points follow the movement of the camera. Any advice would be appreciated.
     

    Attached Files:

    rainbladestudios likes this.
  8. Go_Sato

    Go_Sato

    Joined:
    May 21, 2017
    Posts:
    4
    I found the root cause of this problem and fixed it!
    https://zeroichi.hatenablog.jp/entry/2018/04/04/155420
     
    Griffo and aidis_unity08 like this.
  9. scarffy

    scarffy

    Joined:
    Jan 15, 2013
    Posts:
    17
    Hi,

    Is there any way for me to eliminate or minimize the drifting when using image recognition?
    It happens when I scan the marker and then move the camera away; the model seems to follow the camera a little bit.
    I'm trying to map a power box model onto a real-scale power box.

    Thanks in advance

    Edit: tl;dr: how do I properly anchor the model so it doesn't appear to move with the camera?
     
    Last edited: Apr 4, 2018
  10. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    686
    The main thing with image recognition is to provide the correct physical size to ARKit - the more accurately the size matches the actual size of the marker, the more accurate the tracking position will be.
     
  11. scarffy

    scarffy

    Joined:
    Jan 15, 2013
    Posts:
    17
    This is the video of what I'm talking about. I'll try to play around with physical size. Thanks @jimmya
     
  12. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    677
    up
     
  13. scarffy

    scarffy

    Joined:
    Jan 15, 2013
    Posts:
    17
    Does this also work if the model is bigger than the image marker itself?
     
  14. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    The size of the model shouldn't have any impact on the tracking, since the image is just a place in the real world to provide a virtual anchor. That anchor is placed at the root of your object (position 0,0,0). So whether your model is big or not, it should still place the anchor at the root of your object, and the tracking calculations only consider the anchors, not your actual model.
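    For reference, a minimal sketch of that setup, modelled on the plugin's GenerateImageAnchor example (event and helper names as in the Unity-ARKit-Plugin; treat it as a starting point, not a tested implementation):

    ```csharp
    // Sketch: instantiate a prefab at the image anchor's pose and keep it
    // pinned to the anchor on updates. Prefab size is irrelevant to tracking.
    using UnityEngine;
    using UnityEngine.XR.iOS;

    public class PlaceAtImageAnchor : MonoBehaviour
    {
        public GameObject prefab;   // your (possibly large) model
        GameObject instance;

        void OnEnable()
        {
            UnityARSessionNativeInterface.ARImageAnchorAddedEvent += OnAdded;
            UnityARSessionNativeInterface.ARImageAnchorUpdatedEvent += OnUpdated;
        }

        void OnDisable()
        {
            UnityARSessionNativeInterface.ARImageAnchorAddedEvent -= OnAdded;
            UnityARSessionNativeInterface.ARImageAnchorUpdatedEvent -= OnUpdated;
        }

        void OnAdded(ARImageAnchor anchor)
        {
            instance = Instantiate(prefab,
                UnityARMatrixOps.GetPosition(anchor.transform),
                UnityARMatrixOps.GetRotation(anchor.transform));
        }

        void OnUpdated(ARImageAnchor anchor)
        {
            if (instance == null) return;
            instance.transform.position = UnityARMatrixOps.GetPosition(anchor.transform);
            instance.transform.rotation = UnityARMatrixOps.GetRotation(anchor.transform);
        }
    }
    ```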
     
  15. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    686
    I debugged this with 11.3 and it appears ARKit does not trigger the didRemoveAnchors delegate in this case. If I recall correctly, this was working differently before, wasn't it?
     
  16. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    677
    This hasn't been working since January at least (I am still on the old ARKit, since there are some issues reported for iPhone X in the 11.3 version) - so I'm wondering if it's just a matter of forgetting to connect an extern, or whether face removal is actually vestigial on Apple's part?
     
  17. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    686
    Yes, I'm trying to find out the answer to this - stay tuned.
     
    ina likes this.
  18. heavy_thebrit

    heavy_thebrit

    Joined:
    Mar 2, 2017
    Posts:
    8
    How are people scaling content using ARKit 1.5? I am using the image tracking system, and we have to define the image dimensions as accurately as possible. I can't find any scaled-content examples in the latest version of the ARKit plugin anymore, and the previous branches are from back in 2017.
     
  19. rocket5tim

    rocket5tim

    Joined:
    May 19, 2009
    Posts:
    226
    I'm using the scaled content example from the Experimental repo that was part of the Unite Austin talk, but it hasn't been updated in months, so I suspect it's dead. I've been hoping for some updates on content scaling but am still waiting...
     
  20. heavy_thebrit

    heavy_thebrit

    Joined:
    Mar 2, 2017
    Posts:
    8
    After watching the 2018.1 talk, I think their focus is on getting the systems ready for that release and integrated into Unity. The AR utilities are not available to the public in the current beta, which is a shame, but it will be good to start testing those out soon.

    Right now I think I am going to look into using ARInterface and updating ARKit to 1.5, then try the scaled approach from the ARInterface examples and see if I can hook into the image recognition as well (should be trivial; it just creates anchors).
     
    rocket5tim likes this.
  21. Mr_Saboteur

    Mr_Saboteur

    Joined:
    Feb 2, 2016
    Posts:
    6
    This is great. I tried to use RemoveUserAnchor, but the object still stays in the scene and I can't rescan the image afterward. Could you give me some insight into how you got it to work?
     
  22. Octopal

    Octopal

    Joined:
    Nov 27, 2016
    Posts:
    3
    Hi.

    I am having some issues with ARKit Remote.

    I have followed the setup instructions and have checked to see if anyone else here is having the same problem (it doesn't look like it). Basically, ARKit Remote works fine for the first few seconds, but then it crashes every single time, usually after some of the first particles are displayed. The app just closes and a crash log is generated.

    I am building the app on my MacBook Pro running High Sierra v10.13.3.
    I am using Unity version 2017.1.3.1f.
    I have tested the ARKit Remote app with both my iPhone 6s and a 5th-gen iPad - it crashes on both.

    Here is the Xcode crash log.
     

    Attached Files:

    Last edited: Apr 10, 2018
  23. MSFX

    MSFX

    Joined:
    Sep 3, 2009
    Posts:
    105
    I've found recently that I keep getting this MarshalDirectionalLightEstimate error... :( It's mostly when a build starts up and my phone is lying flat on my desk, so obviously very little or no light reaches the camera...

    Is anyone else getting it? This is just in the UnityARKitScene, and even if I take the light estimation script off the Directional Light it still breaks... :(
     

    Attached Files:

  24. tomasnihlen

    tomasnihlen

    Joined:
    Nov 29, 2017
    Posts:
    2
    Hi!
    I can import the ARKit plugin scene, but I can't choose to download the parts for ARKit Remote, and some other things as well. I have the latest Unity and Xcode 9.3.

    Could you please help me find what is wrong?
     
  25. Hazneliel

    Hazneliel

    Joined:
    Nov 14, 2013
    Posts:
    116
    I'm running Xcode 9.3, Unity 2017.3.1 and the Asset Store ARKit plugin.
    It won't compile; it's throwing these errors:
    "_OBJC_CLASS_$_ARWorldTrackingConfiguration", referenced from:
    objc-class-ref in ARSessionNative.o
    "_OBJC_CLASS_$_AROrientationTrackingConfiguration", referenced from:
    objc-class-ref in ARSessionNative.o
    "_OBJC_CLASS_$_ARAnchor", referenced from:
    objc-class-ref in ARSessionNative.o
    "_OBJC_CLASS_$_ARPlaneAnchor", referenced from:
    objc-class-ref in ARSessionNative.o
    "_OBJC_CLASS_$_ARSession", referenced from:
    objc-class-ref in ARSessionNative.o
     
    Last edited: Apr 11, 2018
  26. Hazneliel

    Hazneliel

    Joined:
    Nov 14, 2013
    Posts:
    116
    I also tried using the master from Bitbucket, with Xcode 9.3 and Unity 2017.3.1,
    setting the build target to iOS 10 because 11 was complaining about not supporting 32-bit targets.

    Now I get a bunch of these errors:
    Use of undeclared identifier 'ARImageAnchor'
    Use of undeclared identifier 'ARPlaneDetectionVertical'; did you mean 'UnityARPlaneDetectionVertical'?
    Use of undeclared identifier 'ARTrackingStateReasonRelocalizing'
    Property 'autoFocusEnabled' not found on object of type 'ARWorldTrackingConfiguration *'
    Property 'videoFormat' not found on object of type 'ARWorldTrackingConfiguration *'
    Unknown type name 'ARVideoFormat'
    Unknown type name 'ARPlaneGeometry'

    and goes on...

    Any idea why this is not working? I'm not adding anything extra; I just added the ARKit plugin and built to Xcode.
    Thanks for any help
     
  27. abstractronchris

    abstractronchris

    Joined:
    Nov 1, 2016
    Posts:
    3
    It seems like iOS 11 doesn't support 32-bit targets. Check the valid architectures in your Xcode build settings; it was getting upset when I had armv7 in there. Try having just arm64.

    Edit: no idea about 32-bit targets in general, can't speak to that, but the above has worked for me.
     
  28. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    677
    let me know!
     
  29. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    No need to remove any valid architectures; just set "Build Active Architecture Only" to "YES". For the missing references, check /Libraries/UnityARKitPlugin/Plugins/iOS/UnityARKit/NativeInterface/ARSessionNative.mm and see whether it says #import <ARKit/ARKit.h> at the top. If it does, check your Frameworks to see if the ARKit framework is in there. (Also keep in mind that ARKit is only supported on iOS 11 and up, so changing the deployment target to iOS 10 will not work.)
     
    Burglecut likes this.
  30. tomasnihlen

    tomasnihlen

    Joined:
    Nov 29, 2017
    Posts:
    2
    Hi!

    I have done all the steps but in Unity the Console keeps printing line like this:
    Screen position out of view frustum (screen pos 628.301147, 344.676208) (Camera rect 0 0 1834 1998)
    UnityEngine.SendMouseEvents:DoSendMouseEvents(Int32)

    And I see the "Start Remote ARKit Session" box but nothing happens when I click it.
     
    conradHink likes this.
  31. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    That error is because you clicked on something outside your camera's field of view. It doesn't really matter; you can just ignore it.
    Did you build the ARKit Remote as a development build to your device? Are you running it? Did you connect your editor to it? Did you add the ARKit Remote prefab to your scene? Check all these things, figure out how to do them if you haven't already, and then come back if you still have any errors.
     
  32. mlnczk

    mlnczk

    Joined:
    May 12, 2017
    Posts:
    10
    Hi everybody !
    I need some help since im sitting with this problem for quite some time maybe you have any idea. Im at this point working on Project to dynamically paint walls and I need some help detecting size of blue hitbox that is generated on vertical planes we detect. Do you have any clue? I got to the object in hierarchy called GeneratedPlanes there is DebugPlane object attached to it. Im trying to get his LocalScale in Update in one of my classes that tries to generate wall of the size of the hitbox since we can stretch him with our camera. But whatsoever local scale in Debug is all the time 1,1,1 and I can't really get to it. Do you have any clue how to get the size of that blue hitbox?
     
  33. Hazneliel

    Hazneliel

    Joined:
    Nov 14, 2013
    Posts:
    116
    It does have #import <ARKit/ARKit.h> at the top, and ARKit.framework is there. My target is 11.2.

    Still getting these errors:
    Use of undeclared identifier 'ARImageAnchor'
    Use of undeclared identifier 'ARPlaneDetectionVertical'; did you mean 'UnityARPlaneDetectionVertical'?
    Use of undeclared identifier 'ARTrackingStateReasonRelocalizing'
    Property 'autoFocusEnabled' not found on object of type 'ARWorldTrackingConfiguration *'
    Property 'videoFormat' not found on object of type 'ARWorldTrackingConfiguration *'
    Unknown type name 'ARVideoFormat'
    Unknown type name 'ARPlaneGeometry'


    Any idea?
     
  34. HotSkippy

    HotSkippy

    Joined:
    Apr 10, 2018
    Posts:
    1
    Hi.

    I am having some issues with ARKit Remote.

    I have followed the setup instructions and have checked to see if anyone else here is having the same problem (it doesn't look like it). Basically, ARKit Remote works fine for the first few seconds, but then it crashes every single time, usually after some of the first particles are displayed. The app just closes and a crash log is generated.

    I am building the app on my MacBook Pro running High Sierra v10.13.3.
    I am using Unity version 2017.1.3.1f.
    I have tested the ARKit Remote app with both my iPhone 6s and a 5th-gen iPad - it crashes on both.

    Any help would be greatly appreciated. The educational start up that I work for has really hit a wall because of this.
     
  35. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    Well, there you go: your target should be 11.3. Image anchors and vertical plane detection are new in ARKit 1.5, which requires iOS 11.3.
     
  36. FrankZ0113

    FrankZ0113

    Joined:
    Nov 14, 2017
    Posts:
    2
    Hi there, I've run into some trouble with the new ARKit 1.5. I designed and wrote an AR painting app that generates particles as the camera moves. Everything was fine before updating my iPhone to iOS 11.3. Now when I try to Build & Run my app, an (lldb) break is caught and the app cannot detect and activate the main camera; it only shows the background color I set up on my MainCamera game object.
    Here is the bug report from the Xcode output panel.


    ARpaint was compiled with optimization - stepping may behave oddly; variables may not be available.
    * thread #1, queue = 'com.apple.main-thread', stop reason = signal SIGABRT
    frame #0: 0x00000001818192ec libsystem_kernel.dylib`__pthread_kill + 8
    frame #1: 0x00000001819ba288 libsystem_pthread.dylib`pthread_kill$VARIANT$mp + 376
    frame #2: 0x0000000181787d0c libsystem_c.dylib`abort + 140
    frame #3: 0x0000000180f232c8 libc++abi.dylib`abort_message + 132
    frame #4: 0x0000000180f23470 libc++abi.dylib`default_terminate_handler() + 304
    frame #5: 0x0000000180f4c8e4 libobjc.A.dylib`_objc_terminate() + 140
    frame #6: 0x0000000180f3d37c libc++abi.dylib`std::__terminate(void (*)()) + 16
    frame #7: 0x0000000180f3cccc libc++abi.dylib`__cxa_throw + 132
    * frame #8: 0x0000000101905d78 ARpaint`il2cpp::vm::Exception::Raise(ex=<unavailable>) at Exception.cpp:41 [opt]
    frame #9: 0x0000000101905e18 ARpaint`il2cpp::vm::Exception::RaiseNullReferenceException(msg=<unavailable>) at Exception.cpp:61 [opt]
    frame #10: 0x0000000101905e08 ARpaint`il2cpp::vm::Exception::RaiseNullReferenceException() at Exception.cpp:56 [opt]
    frame #11: 0x0000000100b00318 ARpaint`::paintManager_ARFrameUpdated_m57971470(paintManager_t429985316 *, UnityARCamera_t2069150450, const RuntimeMethod *) [inlined] NullCheck(this_ptr=<unavailable>) at il2cpp-codegen-il2cpp.h:289 [opt]
    frame #12: 0x0000000100b00310 ARpaint`::paintManager_ARFrameUpdated_m57971470(__this=0x00000001106bcf80, ___arCamera0=<unavailable>, method=<unavailable>) at Bulk_Assembly-CSharp_0.cpp:27785 [opt]
    frame #13: 0x0000000100b11b14 ARpaint`::ARFrameUpdate_Invoke_m2222676468(__this=<unavailable>, ___camera0=UnityARCamera_t2069150450 @ 0x000000016f325ca0, method=<unavailable>) at Bulk_Assembly-CSharp_0.cpp:0 [opt]
    frame #14: 0x0000000100b11ab0 ARpaint`::ARFrameUpdate_Invoke_m2222676468(__this=0x00000001106d6d20, ___camera0=UnityARCamera_t2069150450 @ 0x000000016f3260e0, method=0x0000000000000000) at Bulk_Assembly-CSharp_0.cpp:40910 [opt]
    frame #15: 0x0000000100b11ab0 ARpaint`::ARFrameUpdate_Invoke_m2222676468(__this=0x00000001106d6d90, ___camera0=UnityARCamera_t2069150450 @ 0x000000016f326220, method=0x0000000000000000) at Bulk_Assembly-CSharp_0.cpp:40910 [opt]
    frame #16: 0x0000000100b0de70 ARpaint`::UnityARSessionNativeInterface__frame_update_m1185891212(__this=<unavailable>, ___camera0=<unavailable>, method=<unavailable>) at Bulk_Assembly-CSharp_0.cpp:39375 [opt]
    frame #17: 0x0000000100b0dbec ARpaint`::ReversePInvokeWrapper_UnityARSessionNativeInterface__frame_update_m1185891212(___camera0=<unavailable>) at Bulk_Assembly-CSharp_0.cpp:36630 [opt]
    frame #18: 0x000000010106c4c8 ARpaint`::__41-[UnityARSession session:didUpdateFrame:]_block_invoke(.block_descriptor=<unavailable>) at ARSessionNative.mm:815 [opt]
    frame #19: 0x0000000181684b24 libdispatch.dylib`_dispatch_call_block_and_release + 24
    frame #20: 0x0000000181684ae4 libdispatch.dylib`_dispatch_client_callout + 16
    frame #21: 0x00000001816916e0 libdispatch.dylib`_dispatch_main_queue_callback_4CF$VARIANT$mp + 1012
    frame #22: 0x0000000181d3b070 CoreFoundation`__CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 12
    frame #23: 0x0000000181d38bc8 CoreFoundation`__CFRunLoopRun + 2272
    frame #24: 0x0000000181c58da8 CoreFoundation`CFRunLoopRunSpecific + 552
    frame #25: 0x0000000183c3b020 GraphicsServices`GSEventRunModal + 100
    frame #26: 0x000000018bc3978c UIKit`UIApplicationMain + 236
    frame #27: 0x0000000100add690 ARpaint`main(argc=1, argv=0x000000016f3278d0) at main.mm:33 [opt]
    frame #28: 0x00000001816e9fc0 libdyld.dylib`start + 4
     
    Last edited: Apr 13, 2018
  37. jasonmerinobuild

    jasonmerinobuild

    Joined:
    Dec 20, 2017
    Posts:
    2
    Hi, I am working on getting our Unity app, which uses the ARKit plugin, built into our existing iOS app. I have followed a combination of these tutorials and have had success getting it all working when running on a device:

    https://github.com/blitzagency/ios-unity5
    https://the-nerd.be/2015/08/20/a-better-way-to-integrate-unity3d-within-a-native-ios-application/

    The problem I'm facing is that now that I have integrated the Unity app into the existing iOS app I am no longer able to build the iOS app to the simulator. The ARKit functionality is entirely optional in our app so it makes sense to me that we should be able to circumvent the problematic code in ARSessionNative.mm which uses some Metal related types and functions (CVMetalTextureCacheRef, MTLCreateSystemDefaultDevice, CVMetalTextureCacheCreate) but I'm having trouble getting around this.

    I also noticed that there is an option in Unity to build the Unity project with a target SDK of Device or Simulator, but not both, which seems like what I need.

    Is there some setting or some way to get the Unity project to build out an Xcode project that could run on both device and simulator? Or is there a way to flag ARKit functionality as optional so it builds in a way the simulator can compile the code? Or maybe there is a way to stub out the Metal type definitions on the existing iOS app side of things that would allow the application to compile?

    I'm having quite a time trying to piece all this together. Any help on building this in a way that doesn't break the workflow of our other developers, who work on the app but not in Unity, would be awesome. Thanks!
     
    Burglecut and michaeltostenson like this.
  38. Hazneliel

    Hazneliel

    Joined:
    Nov 14, 2013
    Posts:
    116
    Thanks, I did realize this was the problem and wanted to post an update. In my case, my work computers cannot yet be updated to High Sierra and use the Xcode 9.4 beta, hence I cannot build using the latest version of the ARKit plugin.

    What I'm doing for the time being, if anyone else is in this situation: just pull a tagged release from before February. Then it will work.

    Thanks for your help.
     
  39. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    In Xcode you can choose what compiles for which target, so just don't let ARSessionNative compile for the simulator?
    (Not sure if this will work, but it seems plausible.)
     
    Last edited: Apr 13, 2018
  40. Serhii-Horun

    Serhii-Horun

    Joined:
    Apr 12, 2015
    Posts:
    65
    Hi! We have a bothersome bug in AR mode with ARKit. Can you tell us the possible reason? Big thanks in advance!
    Crash log from Unity Services:

    UNITY VERSION
    u2017.3.1f1

    OS VERSION
    iOS 11.3

    CPU
    arm64

    Thread 0 (crashed)
    0 libsystem_kernel.dylib 0x00000001824ad2ec __pthread_kill
    1 libsystem_c.dylib 0x000000018241bd0c abort
    2 libc++abi.dylib 0x0000000181bb72c8 __cxa_bad_cast
    3 libc++abi.dylib 0x0000000181bb7458 default_terminate_handler()
    4 libobjc.A.dylib 0x0000000181be08e4 _objc_terminate()
    5 libc++abi.dylib 0x0000000181bd137c std::__terminate(void (*)())
    6 libc++abi.dylib 0x0000000181bd0ccc __cxxabiv1::exception_cleanup_func(_Unwind_Reason_Code, _Unwind_Exception*)
    7 ARKit 0x000000019e38791c acv::math::pose<float, (acv::math::CoordinateSystem)0, (acv::math::CoordinateSystem)0>::pose(cva::SE3GroupStorage<float, cva::Matrix<float, 4u, 4u> > const&, float const&, float const&)
    8 ARKit 0x000000019e37f408 SurfaceDetectionSingleShotSurfaces
    9 ARKit 0x000000019e2e2ca0 +[ARPlaneEstimationTechnique _detectPlanesWithDetector:types:camera:featurePoints:inVisionCoordinates:singleShot:]
    10 ARKit 0x000000019e2e27ec +[ARPlaneEstimationTechnique detectPlanes:withFrame:]
    11 ARKit 0x000000019e2baf20 -[ARFrame _hitTestEstimatedPlanesFromOrigin:withDirection:planeAlignment:]
    12 ARKit 0x000000019e2babe4 -[ARFrame _hitTestFromOrigin:withDirection:types:]
    13 viewer 0x0000000101d149c4 HitTest
    14 viewer 0x0000000100fd9a60 UnityARSessionNativeInterface_HitTest_m4212263537
    15 viewer 0x0000000100d906c0 IOSCamera_HitTestWithResultType_m3573744696
    16 viewer 0x0000000100d90560 IOSCamera_Raycast_m4103644143
    ...
     
  41. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    Can you debug it in Xcode and provide us with the errors Xcode gives you?
     
  42. Serhii-Horun

    Serhii-Horun

    Joined:
    Apr 12, 2015
    Posts:
    65
    I fixed it by checking the current AR frame state: if the current AR frame state == NormalTracking, allow the HitTest.
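    Roughly like this - a sketch only, with the enum and event names as I recall them from the plugin, so double-check against your version:

    ```csharp
    // Sketch: cache the tracking state from frame updates and only run
    // hit tests while tracking is Normal, avoiding the native crash.
    using UnityEngine;
    using UnityEngine.XR.iOS;

    public class SafeHitTest : MonoBehaviour
    {
        ARTrackingState trackingState = ARTrackingState.ARTrackingStateNotAvailable;

        void OnEnable()  { UnityARSessionNativeInterface.ARFrameUpdatedEvent += OnFrame; }
        void OnDisable() { UnityARSessionNativeInterface.ARFrameUpdatedEvent -= OnFrame; }

        void OnFrame(UnityARCamera cam) { trackingState = cam.trackingState; }

        public bool TryHitTest(Vector2 screenPos, out Vector3 hitPoint)
        {
            hitPoint = Vector3.zero;
            if (trackingState != ARTrackingState.ARTrackingStateNormal)
                return false; // skip HitTest while tracking is limited/unavailable

            var point = new ARPoint {
                x = screenPos.x / Screen.width,
                y = screenPos.y / Screen.height
            };
            var results = UnityARSessionNativeInterface.GetARSessionNativeInterface()
                .HitTest(point, ARHitTestResultType.ARHitTestResultTypeExistingPlaneUsingExtent);
            if (results.Count == 0) return false;
            hitPoint = UnityARMatrixOps.GetPosition(results[0].worldTransform);
            return true;
        }
    }
    ```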
     
    Burglecut and jimmya like this.
  43. heavy_thebrit

    heavy_thebrit

    Joined:
    Mar 2, 2017
    Posts:
    8
    Looking to enable collisions with the geometry meshes that are created via the new 1.5 update. Any ideas on how to get this working?
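    One thing I was thinking of trying - an untested sketch that shares the plugin-generated plane mesh with a MeshCollider (attach it to the plane prefab whose MeshFilter the plugin fills in):

    ```csharp
    // Sketch: give each generated plane a MeshCollider that shares the mesh
    // ARKit builds from the plane geometry, so physics can collide with it.
    using UnityEngine;

    [RequireComponent(typeof(MeshFilter))]
    public class PlaneCollision : MonoBehaviour
    {
        MeshFilter meshFilter;
        MeshCollider meshCollider;

        void Start()
        {
            meshFilter = GetComponent<MeshFilter>();
            meshCollider = gameObject.AddComponent<MeshCollider>();
        }

        void Update()
        {
            // Re-assign whenever the plugin updates the plane mesh
            if (meshCollider.sharedMesh != meshFilter.sharedMesh)
                meshCollider.sharedMesh = meshFilter.sharedMesh;
        }
    }
    ```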
     
  44. jasonmerinobuild

    jasonmerinobuild

    Joined:
    Dec 20, 2017
    Posts:
    2
    Thanks for the reply, Kevin__h. I tried your suggestion but was unable to get it to ignore the entire folder I was trying to exclude from the compilation step. So that doesn't seem like a usable solution unless I add all the files in that folder individually, which would not be very maintainable when upgrading the ARKit plugin.

    Thanks anyway.
     
  45. Fl0oW

    Fl0oW

    Joined:
    Apr 24, 2017
    Posts:
    10
    Not sure if by "blue hitbox" you mean the standard Unity DebugPlane texture, but I assume you do.
    ARPlaneAnchors (which you can get with unityARAnchorManager.GetCurrentPlaneAnchors()) have a field called "extent" which gives you the size of the plane. See also https://developer.apple.com/documentation/arkit/arplaneanchor

    The y component of an extent is always 0, by the way, no matter whether it's a horizontal or vertical plane. So use the x and z components to calculate your plane's size.
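    A quick sketch of reading those extents (untested; it assumes GetCurrentPlaneAnchors() returns ARPlaneAnchorGameObject entries as in the plugin source):

    ```csharp
    // Sketch: log each detected plane's size from its anchor extent.
    // extent.y is always 0, so x and z give the plane's dimensions in metres.
    using UnityEngine;
    using UnityEngine.XR.iOS;

    public class PlaneSizer : MonoBehaviour
    {
        UnityARAnchorManager anchorManager;

        void Start() { anchorManager = new UnityARAnchorManager(); }

        void Update()
        {
            foreach (ARPlaneAnchorGameObject apag in anchorManager.GetCurrentPlaneAnchors())
            {
                Vector3 extent = apag.planeAnchor.extent;
                Debug.Log(string.Format("plane {0}: {1}m x {2}m",
                    apag.planeAnchor.identifier, extent.x, extent.z));
            }
        }
    }
    ```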
     
    Burglecut likes this.
  46. mlnczk

    mlnczk

    Joined:
    May 12, 2017
    Posts:
    10
    Hi. Is there any way to implement occlusion? I'm working on painting walls, and they keep appearing in front of real-world objects. From what I've researched, ARKit 1.5 doesn't have a depth camera like Tango, but is there any way to get occlusion?
     
  47. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    I don't know if it's an option, but you could add virtual walls on the detected planes; using those virtual walls you can get occlusion.
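    The usual trick for making those virtual walls invisible while still occluding is a depth-only material - a minimal ShaderLab sketch (shader name is made up; put it on the wall geometry):

    ```shaderlab
    Shader "Custom/InvisibleOccluder" {
        SubShader {
            // Render just before regular geometry so it occludes what comes after
            Tags { "Queue" = "Geometry-10" }
            ColorMask 0   // write nothing to the colour buffer
            ZWrite On     // but still write depth
            Pass {}
        }
    }
    ```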
     
  48. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    Have you tried this? https://stackoverflow.com/questions/38477532/xcode-how-to-exclude-folders-from-compilation
     
  49. auroraDev

    auroraDev

    Joined:
    Nov 22, 2017
    Posts:
    3
    Hi Unity,
    I tried the ARKitRemote and it is working. However, the sync performance is so poor that I can't use it.
    I am using an iPhone 8 with a Mac Pro (3.3GHz i5).
     
  50. ZikW

    ZikW

    Joined:
    Aug 18, 2014
    Posts:
    10