Search Unity

Resolved ARKit support for iOS via Unity-ARKit-Plugin

Discussion in 'AR' started by jimmya, Jun 5, 2017.

Thread Status:
Not open for further replies.
  1. quotidianmusic

    quotidianmusic

    Joined:
    May 13, 2015
    Posts:
    2
    I'm trying this the following way:
    Code (CSharp):

    UnityEngine.XR.iOS.ARKitWorldTackingSessionConfiguration config = new UnityEngine.XR.iOS.ARKitWorldTackingSessionConfiguration();

    if (config.IsSupported) // Only allow AR if device is ARKit capable (iOS 11 or above)
    {
    And am getting the following error: EntryPointNotFoundException: IsARKitWorldTrackingSessionConfigurationSupported

    Obviously I'm not supposed to be checking this way. What's the correct way to check if the iOS Device supports ARKit?
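    A reply further down in the thread confirms the native entry point name was mismatched in the plugin at the time. Until such a fix lands, one defensive pattern (a sketch, not an official API recommendation) is to catch the missing entry point and treat the device as unsupported:

```csharp
using UnityEngine;

public class ARKitAvailability : MonoBehaviour
{
    public static bool IsARKitSupported()
    {
        try
        {
            // IsSupported calls into the native plugin; on builds where the
            // native symbol is missing this throws EntryPointNotFoundException.
            var config = new UnityEngine.XR.iOS.ARKitWorldTackingSessionConfiguration();
            return config.IsSupported;
        }
        catch (System.EntryPointNotFoundException)
        {
            // Native entry point missing (plugin/OS mismatch): assume no ARKit.
            return false;
        }
    }
}
```

    This only papers over the broken binding; the real fix is the corrected entry point name in the plugin.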
     
  2. mm_sto

    mm_sto

    Joined:
    Aug 28, 2017
    Posts:
    16
    Ah cool! Did the baked shadow come included as a texture, or did you have a specific process on capturing it?
     
  3. Dennis_eA

    Dennis_eA

    Joined:
    Jan 17, 2011
    Posts:
    380
    Texture. Like I said, just take a look at the asset. :)
     
  4. UCS

    UCS

    Joined:
    Nov 7, 2014
    Posts:
    4
    To rephrase, I am never creating an additional session, I am never destroying the session or pausing the session. The problem occurs when you Restart the session while the session is initializing.
     
  5. SweatyChair

    SweatyChair

    Joined:
    Feb 15, 2016
    Posts:
    140
    I have the same problem and want to run my game on iPhone X... any workaround to disable ARKit in the simulator for now?
     
  6. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    Hmm, looks like a function name might be incorrect. I'll take a look and update you. Thanks for the info.
     
    quotidianmusic likes this.
  7. DeepMotionPhysics

    DeepMotionPhysics

    Joined:
    Jul 31, 2014
    Posts:
    243
    I pulled the HEAD of the ARKit plugin from GitHub and tried to build the default ARKit demo scene using Xcode 9 for iPhone 8 Plus, with the following link error:



    Ld /Users/Kevin/Library/Developer/Xcode/DerivedData/Unity-iPhone-dhinslkzvgvezladxgtdcjbwwccb/Build/Products/ReleaseForRunning-iphoneos/ProductName.app/ProductName normal arm64

    cd /Users/Kevin/Source/Unity-ARKit-Plugin/Build/UnityARKitScene

    export IPHONEOS_DEPLOYMENT_TARGET=6.0

    export PATH="/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin:/Applications/Xcode.app/Contents/Developer/usr/bin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin"

    /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/clang++ -arch arm64 -isysroot /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS11.0.sdk -L/Users/Kevin/Library/Developer/Xcode/DerivedData/Unity-iPhone-dhinslkzvgvezladxgtdcjbwwccb/Build/Products/ReleaseForRunning-iphoneos -L/Users/Kevin/Source/Unity-ARKit-Plugin/Build/UnityARKitScene -L/Users/Kevin/Source/Unity-ARKit-Plugin/Build/UnityARKitScene/Libraries -F/Users/Kevin/Library/Developer/Xcode/DerivedData/Unity-iPhone-dhinslkzvgvezladxgtdcjbwwccb/Build/Products/ReleaseForRunning-iphoneos -filelist /Users/Kevin/Library/Developer/Xcode/DerivedData/Unity-iPhone-dhinslkzvgvezladxgtdcjbwwccb/Build/Intermediates.noindex/Unity-iPhone.build/ReleaseForRunning-iphoneos/Unity-iPhone.build/Objects-normal/arm64/ProductName.LinkFileList -Xlinker -map -Xlinker /Users/Kevin/Library/Developer/Xcode/DerivedData/Unity-iPhone-dhinslkzvgvezladxgtdcjbwwccb/Build/Intermediates.noindex/Unity-iPhone.build/ReleaseForRunning-iphoneos/Unity-iPhone.build/ProductName-LinkMap-normal-arm64.txt -miphoneos-version-min=6.0 -dead_strip -Xlinker -object_path_lto -Xlinker /Users/Kevin/Library/Developer/Xcode/DerivedData/Unity-iPhone-dhinslkzvgvezladxgtdcjbwwccb/Build/Intermediates.noindex/Unity-iPhone.build/ReleaseForRunning-iphoneos/Unity-iPhone.build/Objects-normal/arm64/ProductName_lto.o -fembed-bitcode-marker -stdlib=libc++ -fobjc-arc -fobjc-link-runtime -weak_framework CoreMotion -weak-lSystem -framework MediaToolbox -liPhone-lib -framework CoreText -framework AudioToolbox -weak_framework AVFoundation -framework CFNetwork -framework CoreGraphics -framework CoreLocation -framework CoreMedia -weak_framework CoreMotion -framework CoreVideo -framework Foundation -framework MediaPlayer -framework OpenAL -framework OpenGLES -framework QuartzCore -framework SystemConfiguration -framework UIKit -liconv.2 -lil2cpp -weak_framework Metal -Xlinker -dependency_info -Xlinker 
/Users/Kevin/Library/Developer/Xcode/DerivedData/Unity-iPhone-dhinslkzvgvezladxgtdcjbwwccb/Build/Intermediates.noindex/Unity-iPhone.build/ReleaseForRunning-iphoneos/Unity-iPhone.build/Objects-normal/arm64/ProductName_dependency_info.dat -o /Users/Kevin/Library/Developer/Xcode/DerivedData/Unity-iPhone-dhinslkzvgvezladxgtdcjbwwccb/Build/Products/ReleaseForRunning-iphoneos/ProductName.app/ProductName



    Undefined symbols for architecture arm64:

    "_OBJC_CLASS_$_ARWorldTrackingConfiguration", referenced from:

    objc-class-ref in ARSessionNative.o

    "_OBJC_CLASS_$_AROrientationTrackingConfiguration", referenced from:

    objc-class-ref in ARSessionNative.o

    "_OBJC_CLASS_$_ARAnchor", referenced from:

    objc-class-ref in ARSessionNative.o

    "_OBJC_CLASS_$_ARPlaneAnchor", referenced from:

    objc-class-ref in ARSessionNative.o

    "_OBJC_CLASS_$_ARSession", referenced from:

    objc-class-ref in ARSessionNative.o

    ld: symbol(s) not found for architecture arm64

    clang: error: linker command failed with exit code 1 (use -v to see invocation)
     
  8. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    Which version of Xcode are you on? Sounds like it's out of date.
     
  9. DeepMotionPhysics

    DeepMotionPhysics

    Joined:
    Jul 31, 2014
    Posts:
    243
    Xcode 9.0 on macOS 10.13 (High Sierra)
     
  10. DeepMotionPhysics

    DeepMotionPhysics

    Joined:
    Jul 31, 2014
    Posts:
    243
    The build error went away when I updated the user profile and Bundle ID settings and switched to a "Development" build in Unity Player Settings.

    The UnityARKitScene.unity was launched on the iPhone8+. However it immediately crashed with the following console message:

    2017-10-12 20:02:12.220243-0700 UnityARKitScene[483:161357] [MC] Reading from public effective user settings.

    2017-10-12 20:02:12.248128-0700 UnityARKitScene[483:161234] [access] This app has crashed because it attempted to access privacy-sensitive data without a usage description. The app's Info.plist must contain an NSCameraUsageDescription key with a string value explaining to the user how the app uses this data.
     
  11. DeepMotionPhysics

    DeepMotionPhysics

    Joined:
    Jul 31, 2014
    Posts:
    243
    The crash was fixed after filling in a "camera usage description" in the Unity Editor's Player Settings. Now the demo scene runs well! These little setup steps should go into a HowTo document for the ARKit plugin.
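    For reference, the same setting can also be applied from an editor script, so the generated Xcode project always carries the NSCameraUsageDescription key. A minimal sketch (the class name, menu path, and message string here are made up):

```csharp
#if UNITY_EDITOR
using UnityEditor;

public static class ARKitBuildSettings
{
    // Writes the camera usage description that Unity emits as
    // NSCameraUsageDescription in the generated Info.plist.
    [MenuItem("Tools/Set AR Camera Usage Description")]
    static void SetCameraUsageDescription()
    {
        PlayerSettings.iOS.cameraUsageDescription =
            "The camera is used to render the AR view."; // hypothetical wording
    }
}
#endif
```

    Setting it once in Player Settings by hand works just as well; the script form simply keeps it from being forgotten across projects.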

     
  12. aximsat

    aximsat

    Joined:
    Oct 12, 2017
    Posts:
    8
    Is there a possibility to detect real objects between the camera and AR objects in ARKit?
     
  13. Green-VR

    Green-VR

    Joined:
    Apr 15, 2013
    Posts:
    12
    Hi,
    I have a question regarding ARKit and when you first start the demo app.
    As soon as you start, it sets the camera's point in real space as position (0, 0, 0).
    Normally when you scan the ground around you, its Y position is something like -1.4.
    What I'm looking for is to set the ground's Y position to 0, as if you started from that point. This way your phone's Y position would be 1.4.
    Is there a way to set this without placing your phone as close to the ground as possible?
    I'm working with multiplayer functions, and being able to do this would sort out syncing issues between phones.
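    One way to get this effect without touching the session origin itself is to parent the AR camera and generated planes under a root transform and shift that root once the floor is found. A sketch, assuming the plugin's standard ARAnchorAddedEvent signature; sessionRoot is a hypothetical parent transform:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

public class FloorOriginShifter : MonoBehaviour
{
    public Transform sessionRoot; // hypothetical parent of AR camera + planes

    void Start()
    {
        UnityARSessionNativeInterface.ARAnchorAddedEvent += OnPlaneAdded;
    }

    void OnPlaneAdded(ARPlaneAnchor anchor)
    {
        // The floor sits at e.g. y = -1.4 in session space. Raising the
        // session root by -floorY puts the floor at world y = 0 and the
        // phone at roughly +1.4, as asked above.
        float floorY = UnityARMatrixOps.GetPosition(anchor.transform).y;
        sessionRoot.position = new Vector3(0f, -floorY, 0f);
        UnityARSessionNativeInterface.ARAnchorAddedEvent -= OnPlaneAdded;
    }
}
```

    For multiplayer syncing you would still need to agree on a shared heading between phones; this only normalizes the height.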
     
  14. Green-VR

    Green-VR

    Joined:
    Apr 15, 2013
    Posts:
    12
    Hi All,
    Quick question... what is the best way to sync ARCamera positions for a multiplayer setup?
     
  15. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    There is already an ARKit FAQ thread which answers these questions here :)
     
  16. stuartlangfield

    stuartlangfield

    Joined:
    Jul 21, 2017
    Posts:
    13
    Anyone having issues with scene orientation?

    Building an AR platformer test. Generally when I place the scene onto a real-world surface using HitCube, it's rotated 180 degrees, reversing my character controls, etc.
     
  17. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    The HitCube may be orienting the game objects to the direction of the plane you are placing them on. You will either have to modify that code or write your own placement code based on how you want to orient your content. If you want your content to face the same direction as the camera, you can use the X and Z components of the camera's forward vector to build your content's forward vector.
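    That advice can be sketched as follows (placedContent is a hypothetical transform for the spawned scene):

```csharp
using UnityEngine;

public static class ContentOrientation
{
    // Rotate placedContent so its forward direction matches the camera's
    // forward vector projected onto the horizontal (XZ) plane.
    public static void FaceCameraDirection(Transform placedContent, Camera cam)
    {
        Vector3 fwd = cam.transform.forward;
        fwd.y = 0f; // drop the pitch component so content stays upright
        if (fwd.sqrMagnitude > 0.0001f)
            placedContent.rotation = Quaternion.LookRotation(fwd.normalized);
    }
}
```

    Zeroing the Y component before LookRotation is what keeps the content level even when the phone is tilted down at the surface.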
     
  18. flapy-monkey

    flapy-monkey

    Joined:
    Dec 23, 2014
    Posts:
    7
    It's a really good plugin, good job guys!
    If you could add the directional lighting environment for the detected face, that would be perfect!
    Thanks anyway!
     
  19. badmiral

    badmiral

    Joined:
    Oct 14, 2017
    Posts:
    1
    Awesome plugin and thanks for all the support so far! When building with the stable Xcode 9 release, I keep getting this error: UI API called from background thread: -[UIApplication delegate] must be used from main thread only.

    I haven't implemented anything in my unity project yet other than the AR Camera, just wondering if this is something I should be concerned about before I get too far.
     
  20. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    This is a known issue and shouldn't affect your app. Let me know if you run into a showstopper with it.
     
    badmiral likes this.
  21. phits

    phits

    Joined:
    Aug 31, 2010
    Posts:
    41
    I am going through a class on Udacity and came across the following error in the AR lighting video. Any clues on how to fix it?

    Assets/UnityARKitPlugin/Examples/UnityARKitScene/AREnvironmentLighting.cs(10,33): error CS0117: `UnityEngine.XR.iOS.UnityARSessionNativeInterface' does not contain a definition for `ARFrameUpdateEvent'

    Code (CSharp):
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.iOS;

    public class AREnvironmentLighting : MonoBehaviour {

        // Use this for initialization
        void Start () {
            UnityARSessionNativeInterface.ARFrameUpdateEvent += UpdateAmbientIntensity;
        }

        // Called on each ARKit frame update
        void UpdateAmbientIntensity (UnityARCamera camera) {
            // Convert ARKit intensity to Unity intensity
            // ARKit ambient intensity ranges 0-2000
            // Unity ambient intensity ranges 0-8 (for over-bright lights)
            float newLight = camera.lightEstimation.ambientIntensity / 1000f;
            RenderSettings.ambientLight = Color.white * newLight;
        }
    }
    https://www.udacity.com/course/learn-arkit--nd114

    Thanks,

    Fitz
     
  22. GalxyzStudios

    GalxyzStudios

    Joined:
    Jul 15, 2015
    Posts:
    31
    Hey, I just downloaded the Unity ARKit plugin from the Asset Store. I am getting errors when trying to build the Xcode project. I have set the deployment target to iOS 11. A screenshot of the compile error is attached.
    Environment
    Unity 2017.1.0f3
    Xcode Version 9.1 beta (9B46)
    iPhone 6 Plus
    Screen Shot 2017-10-16 at 12.10.37 PM.png
     
    Last edited: Oct 16, 2017
  23. jimmmmy

    jimmmmy

    Joined:
    Feb 17, 2014
    Posts:
    27
    Wow~ That's really impressive. How did you make the video recorder?
     
  24. mm_sto

    mm_sto

    Joined:
    Aug 28, 2017
    Posts:
    16
  25. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    Looks like a typo: ARFrameUpdatedEvent is the correct event.
     
  26. virgilcwyile

    virgilcwyile

    Joined:
    Jan 31, 2016
    Posts:
    73
    I have a weird issue. All the example scenes are stuck on "Start Remote ARKit Session".
    It shows this from the very start. When I tap it, I can see the camera start to lag on the iPhone. It only ran once, and ever since then I only see this message on top and nothing shows in the scene. I've selected the device many times, but it doesn't work. Any workaround?
     
  27. GalxyzStudios

    GalxyzStudios

    Joined:
    Jul 15, 2015
    Posts:
    31
    @christophergoy do you have any idea about this issue ??
     
  28. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    Make sure you have the correct scene selected in the Build Settings window.
     
  29. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    Hey, I'm not sure. Does the new Beta have a different definition of the Configuration?
     
  30. stuartlangfield

    stuartlangfield

    Joined:
    Jul 21, 2017
    Posts:
    13
    Thanks for the tip. Think I've solved it by forcing the scene to always face the camera when it detects a new tap.

    Code (CSharp):
    // Face the camera (assume it's level and rotate around y-axis only)
    m_HitTransform.LookAt (Camera.main.transform.position);
    m_HitTransform.eulerAngles = new Vector3 (0, m_HitTransform.eulerAngles.y, 0);
     
    sawstrategic likes this.
  31. stuartlangfield

    stuartlangfield

    Joined:
    Jul 21, 2017
    Posts:
    13
    Anyone tried to add pinch scene scaling functionality to any of their tests yet?

    I'm testing with a 3D platformer. Looking to add a UI button or function at the start of the game that allows the user to pinch to scale the scene to a size that better fits the surface they are playing on.
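    A basic pinch gesture can be read directly from the two-touch deltas and applied to a scene root. A sketch under those assumptions (sceneRoot and the clamp range are made up; this scales uniformly around the root's pivot, not around the pinch midpoint):

```csharp
using UnityEngine;

public class PinchScaler : MonoBehaviour
{
    public Transform sceneRoot;               // hypothetical root of the AR scene
    public float minScale = 0.1f, maxScale = 10f;

    void Update()
    {
        if (Input.touchCount != 2) return;

        Touch t0 = Input.GetTouch(0), t1 = Input.GetTouch(1);
        // Pinch distance this frame vs. last frame (reconstructed via deltas).
        float dist = (t0.position - t1.position).magnitude;
        float prevDist = ((t0.position - t0.deltaPosition) -
                          (t1.position - t1.deltaPosition)).magnitude;
        if (prevDist < 1f) return; // avoid dividing by a near-zero distance

        float factor = dist / prevDist;
        float s = Mathf.Clamp(sceneRoot.localScale.x * factor, minScale, maxScale);
        sceneRoot.localScale = Vector3.one * s;
    }
}
```

    Gating this behind a UI button, as described above, is just a matter of toggling the component's enabled flag.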
     
  32. flyingdeutschman

    flyingdeutschman

    Joined:
    Jul 26, 2013
    Posts:
    8
    Hey all,
    Can you recommend a way to increase the light sensitivity? I've seen some apps that seem to be more forgiving of low light than the defaults in the plugin.
    Thanks!
     
  33. chrisvarnsverryvirtualarts

    chrisvarnsverryvirtualarts

    Joined:
    May 18, 2017
    Posts:
    3
    Hi guys,
    I'm looking at the scaled content branches to see if they can help us solve our content scaling issues, but I'm finding it non-obvious how to integrate into our content. The example scene appears to run fine. Are there any instructions for how to set this up in our own scenes?
    Now that I've posted this, guaranteed I will figure out what I'm missing.
    Thanks!
     
  34. nickfourtimes

    nickfourtimes

    Joined:
    Oct 13, 2010
    Posts:
    219
    I've got my main camera (depth: -1) rendering the scene normally, and a second camera (depth: 1) rendering to a rendertexture, which is then displayed on a "TV screen" in the game. This works fine in standard mode, but with the scene in AR, the TV screen only seems to render some background particles, not the scene itself.

    Does the ARKit plugin change camera depths, or otherwise affect layers/rendering order in such a way that would affect rendertextures like this?
     
  35. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Have a look at https://forum.unity.com/threads/scaled-content.500620/ - hopefully that gives you some idea of what it is.
     
  36. smallbit

    smallbit

    Joined:
    Oct 3, 2013
    Posts:
    60
    I did tests with a slider to change scale; I used the alt scale branch. With the slider it works just fine. I am planning to have two settings for that, tabletop and floor, with different scales respectively, which the user can choose from at the start. I'm also looking into automatic detection based on distance, i.e. < 1 m would be a tabletop, etc.
     
  37. smallbit

    smallbit

    Joined:
    Oct 3, 2013
    Posts:
    60
    I used the alt-scaled-content branch. From the example scene I extracted the ContentCameraParent as a prefab. Besides that, I changed the HitTestWithResultType method so that when I place my reference point with m_HitTransform.position = UnityARMatrixOps.GetPosition(hitResult.worldTransform); I also do cameraScaler.scaledObjectOrigin = UnityARMatrixOps.GetPosition(hitResult.worldTransform); for the initial spawning. The rest works fine. Also check the layer numbers they use for ScaledContent and ARKitTrackingData and set them on your objects as in the example (all 3D content you want to scale needs to be in the ScaledContent layer, etc.). The last thing to check, if you just merge it into your scene, is to set the culling masks on both cameras.
     
    Last edited: Oct 18, 2017
  38. TijsVdV

    TijsVdV

    Joined:
    Mar 7, 2017
    Posts:
    32
    Yeah, I know, but I have no clue how to solve it. My issue is that we have a gun attached to the camera, so when we move around in AR the gun follows the camera. On iPad the gun is positioned further back, and on iPhone it is positioned further forward; it isn't the same as in the editor. Both cameras have the same FOV. Is there a way to calculate the same distance on all devices so they all look the same?

    I just came to the conclusion that everything in the editor and on an iPhone is the same. Only on the iPad is it different.
     
    Last edited: Oct 18, 2017
  39. LondonStevey

    LondonStevey

    Joined:
    Jun 26, 2015
    Posts:
    33
    Wow! This plugin blows my mind! It's awesome! I've been playing with the balls scene, walking around the flat into different rooms and placing balls on surfaces, then moving back into the rooms and they're still perfectly in place. It's so good!
    I then go into a room, put the phone in my pocket, and walk through three more rooms, and in a couple of seconds it's recalibrated back to my position again!
    Starting to think of ways you could share point clouds to have multiplayer spatially aware apps.
    One thing I really hope ARKit includes at some point is vertical planes, as the balls in different rooms are still fully visible. I guess there's no way of dealing with that until vertical planes are added? Unless you test whether there's a point from the point cloud between you and the game object, I suppose.
    Great work, thanks! :)
     
    jimmya likes this.
  40. bboydaisuke

    bboydaisuke

    Joined:
    Jun 14, 2014
    Posts:
    67
    Hello,

    I've been trying to make UnityARGeneratePlane generate its debugPlanePrefab, but I can't make it work. I expect to see planes with a blue edge, but I have never seen the edged plane in any of the samples, such as UnityARKitScene, UnityARBalls, and FocusSquareSample.
    ParticlePainter works. PointCloudParticleExample works too. I can connect a remote session with UnityARKitRemote. However, I just can't make UnityARGeneratePlane work.

    Please help me to make it work.

    Environment:
    • macOS Sierra 10.12.6
    • Unity 2017.2.0f3
    • Unity ARKit plugin (cloned default branch, last updated Oct 10, 2017)
    • iPhone 6s
    • iOS 11.0.3
     
  41. Smarnovative

    Smarnovative

    Joined:
    Oct 7, 2016
    Posts:
    2
    Unity ARKit is taking a lot of time for surface detection. Could it be improved?
     
  42. Pilot-365

    Pilot-365

    Joined:
    Aug 9, 2017
    Posts:
    11
    I have UI buttons in the UnityARKitScene, but when I run the app and press the buttons, the touch often also reorients and repositions the HitCubeParent's child, making for an erratic experience when interacting with the UI. Is there a way to restrict the areas of my iPhone screen where touch input affects the HitCubeParent?

    Here's a GIF of the issue. In it, I have the HitCube in the scene but when I go to press on the (+) button it also activates the repositioning of the HitCube which I don't want. https://ibb.co/hJ2Bkm

    SOLVED: Using this YouTube tutorial, I edited the ARHitTest script to read as the following:

    using System;
    using System.Collections.Generic;
    using UnityEngine.EventSystems;

    namespace UnityEngine.XR.iOS
    {
        public class UnityARHitTestExample : MonoBehaviour
        {
            public Transform m_HitTransform;

            bool HitTestWithResultType (ARPoint point, ARHitTestResultType resultTypes)
            {
                List<ARHitTestResult> hitResults = UnityARSessionNativeInterface.GetARSessionNativeInterface ().HitTest (point, resultTypes);
                if (hitResults.Count > 0) {
                    foreach (var hitResult in hitResults) {
                        Debug.Log ("Got hit!");
                        m_HitTransform.position = UnityARMatrixOps.GetPosition (hitResult.worldTransform);
                        m_HitTransform.rotation = UnityARMatrixOps.GetRotation (hitResult.worldTransform);
                        Debug.Log (string.Format ("x:{0:0.######} y:{1:0.######} z:{2:0.######}", m_HitTransform.position.x, m_HitTransform.position.y, m_HitTransform.position.z));
                        return true;
                    }
                }
                return false;
            }

            // Update is called once per frame
            void Update () {
                // Check touchCount before calling GetTouch(0), then skip any
                // touch that is over a UI element.
                if (Input.touchCount > 0 && m_HitTransform != null
                    && !EventSystem.current.IsPointerOverGameObject (Input.GetTouch (0).fingerId)) {
                    var touch = Input.GetTouch (0);
                    if (touch.phase == TouchPhase.Began || touch.phase == TouchPhase.Moved) {
                        var screenPosition = Camera.main.ScreenToViewportPoint (touch.position);
                        ARPoint point = new ARPoint {
                            x = screenPosition.x,
                            y = screenPosition.y
                        };

                        // prioritize result types
                        ARHitTestResultType[] resultTypes = {
                            ARHitTestResultType.ARHitTestResultTypeExistingPlaneUsingExtent,
                            // if you want to use infinite planes use this:
                            //ARHitTestResultType.ARHitTestResultTypeExistingPlane,
                            //ARHitTestResultType.ARHitTestResultTypeHorizontalPlane,
                            //ARHitTestResultType.ARHitTestResultTypeFeaturePoint
                        };

                        foreach (ARHitTestResultType resultType in resultTypes) {
                            if (HitTestWithResultType (point, resultType)) {
                                return;
                            }
                        }
                    }
                }
            }
        }
    }
     
    Last edited: Oct 23, 2017
  43. hungrybelome

    hungrybelome

    Joined:
    Dec 31, 2014
    Posts:
    336
    I noticed that the surface detection in the example ARKit scenes is much slower/less effective than in the ARKit apps released on the App Store (e.g. AR Dragon). Could this be due to those App Store ARKit apps being native instead of Unity? I'm wondering if Unity ARKit's surface detection is inherently less effective than native ARKit at the moment, or if there are parameters that can be tweaked to match the effectiveness of the App Store apps. It seems like UnityARSessionNativeInterface.ARAnchorAddedEvent just takes forever to be called.

    I was thinking about making a thread about it soon, showing example screenshots where the example Unity ARKit scenes fail to detect a surface that other released ARKit apps detect quickly.
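    Before comparing against native apps, it may be worth measuring the actual latency. A sketch that times the first plane callback, assuming the plugin's standard event signature:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

public class PlaneDetectionTimer : MonoBehaviour
{
    float startTime;

    void Start()
    {
        startTime = Time.realtimeSinceStartup;
        UnityARSessionNativeInterface.ARAnchorAddedEvent += OnFirstPlane;
    }

    void OnFirstPlane(ARPlaneAnchor anchor)
    {
        // Log time-to-first-plane, then unsubscribe so we only measure once.
        Debug.Log("First plane detected after " +
                  (Time.realtimeSinceStartup - startTime) + " s");
        UnityARSessionNativeInterface.ARAnchorAddedEvent -= OnFirstPlane;
    }
}
```

    Detection time also varies heavily with lighting and surface texture, so comparisons against other apps should use the same scene and motion.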
     
    mrjman and GeckoTrader like this.
  44. mm_sto

    mm_sto

    Joined:
    Aug 28, 2017
    Posts:
    16
    This is a great example of an alternative solution
     
  45. Pilot-365

    Pilot-365

    Joined:
    Aug 9, 2017
    Posts:
    11
    That's a great tutorial, and it's actually the tutorial I followed to get started on a lot of the scripting behind my UI elements. I'm going to try to alter the HitTest script so that touch input is ignored if a raycast hits my UI elements (if anyone knows how to do that, it'd be a huge help, since I'm new to this stuff). I'll post an update if it works.
     
  46. aschau24

    aschau24

    Joined:
    Sep 1, 2017
    Posts:
    1
    I've been working with the plug-in for the last two weeks, but am constantly running into an issue with the tracking of objects that I'm instantiating into the scene. For some reason, none of the objects in my scene stay in place except for the original planes generated by the ARKit plane generator. When I shift my phone up, all the objects move up along with the camera. I've also tried to generate a large plane as a floor in the same exact location as a plane that ARKit generates, but it seems to always be offset upwards in the Y direction. Any idea as to what may be causing this?
     
  47. altunhans

    altunhans

    Joined:
    Feb 22, 2017
    Posts:
    2
    Hi
    I am working on a new iOS app with the Unity ARKit plugin. I have a problem with touch input and a UI button: whenever I press a button, it also counts the touch as a tap on the detected plane. How can I solve this?
    Thank you for your help.
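    The usual fix, the same approach as the SOLVED post a few messages up, is to ask the EventSystem whether the touch is over UI before running the AR hit test. A sketch (requires an EventSystem and a GraphicRaycaster in the scene):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

public static class TouchUtil
{
    // Returns true if this touch landed on a uGUI element,
    // in which case the AR hit test should be skipped.
    public static bool IsOverUI(Touch touch)
    {
        return EventSystem.current != null &&
               EventSystem.current.IsPointerOverGameObject(touch.fingerId);
    }
}
```

    Call TouchUtil.IsOverUI(touch) at the top of the hit-test Update loop and return early when it is true.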
     
  48. 4wall

    4wall

    Joined:
    Sep 16, 2014
    Posts:
    73
    Hi. Does anyone know if there is a way to test ARKit projects on the desktop? This is easy to do using Vuforia.
     
  49. Pilot-365

    Pilot-365

    Joined:
    Aug 9, 2017
    Posts:
    11
    Not sure of a way to do it on the desktop, but if you're building with ARKit in Unity, try ARKit Remote: it lets you test your AR app and live-edit without having to build and run through Xcode each time.
     
  50. Pilot-365

    Pilot-365

    Joined:
    Aug 9, 2017
    Posts:
    11
    I am having the same issue and posted the problem above. Someone suggested using raycasting, where if a raycast hits a UI element, it voids the touch input within the ARKit hit-test script. If you try this concept and it works for you, I'd be grateful for any assistance.

    SOLVED: Using this tutorial, I edited the ARHitTest script exactly as in my earlier post above.
     
    Last edited: Oct 23, 2017
    marcipw likes this.
Thread Status:
Not open for further replies.