
ARKit 2.0 Beta support

Discussion in 'AR' started by jimmya, Jun 5, 2018.

  1. cutewen

    cutewen

    Joined:
    Jan 14, 2018
    Posts:
    2
    @jimmya
    Hi, I have an issue when running your demo. The Xcode project and Xcode build were both fine, but when I run it on my iPhone something goes wrong. I hope you can give me some suggestions, thanks!

    I'm using an iPhone X (iOS 12.1), Xcode 10.1 beta 3, and Unity 2018.3.0 beta, and the error looks like:

    (error screenshots attached: upload_2018-10-29_22-12-43.png, upload_2018-10-29_22-13-16.png)
     
  2. Aiursrage2k

    Aiursrage2k

    Joined:
    Nov 1, 2009
    Posts:
    4,835
    Has anyone figured out a good solution for handling relocalization? Trying to find the anchor again can be a real pain, and the other problem is that the objects will not actually be placed in the same position as they were when spawned. I was trying to create a kind of persistent running app with a "checkpoint system" along a circular path at the gym, but upon relocalization the points were actually unreachable, inside the wall.
     
  3. tkrikorian

    tkrikorian

    Joined:
    Jan 2, 2018
    Posts:
    4
    @Aiursrage2k I'm in the same boat. I'm saving the anchor's ID in a file along with the corresponding ARWorldMap, but when trying to relocalize, the position is not the same. Anyone got a working example of saving and loading anchors?
     
  4. whogas

    whogas

    Joined:
    Oct 18, 2013
    Posts:
    46
    Has anyone had issues with UnityNativeSessionARInterface.ARAnchorRemovedEvent causing crashes? It seems to happen consistently in my multiplayer project whenever this event is triggered. I have sought out all instances where the event triggers something and have turned them all off, except the one coming from ARKit Remote, which I am not using.

    Any ideas/help?
     
  5. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Do you have the latest version? We may have fixed something along these lines recently. If you do have the latest, could you tell us what the console log or stack trace looks like in Xcode?
     
    whogas likes this.
  6. whogas

    whogas

    Joined:
    Oct 18, 2013
    Posts:
    46
    I went and got the latest, and it sure seems like it fixed the problem I was having with the anchor remove event. Thanks for the response and the update!
     
  7. zwcloud

    zwcloud

    Joined:
    Mar 15, 2016
    Posts:
    377
    UPDATE:
    How do I restore the anchors' positions after exiting the app?
    I tried:
    1. When the world map is "mapped", save the GameObjects' positions to Application.persistentDataPath + "markers.json", and at the same time attach a UnityARUserAnchorComponent to those GameObjects.
    2. At the same time, save the world map to Application.persistentDataPath + "worldmap.dat".
    3. Restart the app.
    4. Read worldmap.dat and scan the scene until it becomes mapped.
    5. Once it is mapped, read markers.json, create GameObjects at the saved positions, and attach a UnityARUserAnchorComponent to each of them.

    But the positions seem to be relative to the initial camera position instead of the world map.

    I'm reading the code of UnityARUserAnchorComponent, and I cannot understand how it keeps the anchor ID persistent after restarting the app, recreating the GameObjects, and attaching a brand-new UnityARUserAnchorComponent.

    I know that UnityARUserAnchorComponent.GameObjectAnchorUpdated adjusts the transform of the GameObject the UnityARUserAnchorComponent is attached to, but it is seldom called, and when it is called, the GameObject is moved to (0,0,0).
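    One possible approach (a hedged sketch, not the plugin's official recipe): instead of trusting positions saved in the previous session's world coordinates, key the recreated GameObjects by the saved anchor identifiers and reposition them when UnityARSessionNativeInterface.ARUserAnchorAddedEvent fires for anchors restored from the world map. Here markersById is a hypothetical dictionary built from markers.json; field and helper names follow the Unity ARKit plugin as I understand it:

    ```csharp
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.iOS;

    // Sketch only: when an anchor from the relocalized world map is added,
    // move the corresponding saved GameObject to the anchor's pose instead
    // of the raw position recorded in the old session's coordinates.
    public class AnchorRestorer : MonoBehaviour
    {
        // Hypothetical: saved anchor identifier -> recreated GameObject.
        public Dictionary<string, GameObject> markersById = new Dictionary<string, GameObject>();

        void OnEnable()
        {
            UnityARSessionNativeInterface.ARUserAnchorAddedEvent += OnUserAnchorAdded;
        }

        void OnDisable()
        {
            UnityARSessionNativeInterface.ARUserAnchorAddedEvent -= OnUserAnchorAdded;
        }

        void OnUserAnchorAdded(ARUserAnchor anchor)
        {
            GameObject marker;
            if (markersById.TryGetValue(anchor.identifier, out marker))
            {
                // The anchor pose is expressed in the *new* session's space.
                marker.transform.position = UnityARMatrixOps.GetPosition(anchor.transform);
                marker.transform.rotation = UnityARMatrixOps.GetRotation(anchor.transform);
            }
        }
    }
    ```

    The idea is that a relocalized map expresses anchor poses in the new session's coordinate system, which would explain why positions saved as raw Vector3s appear relative to the initial camera pose instead.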
     
    Last edited: Nov 27, 2018
  8. dorukeker

    dorukeker

    Joined:
    Dec 6, 2016
    Posts:
    37
    ARKit Camera Feed Very Dark / Low Exposure

    Hello All,

    I am building an ARKit application with the Unity ARKit plugin. However, the camera feed is very dark compared to other ARKit applications (e.g. Measure from Apple).

    Is there a way to make the camera feed brighter / increase its exposure?
    Thanks!
    Doruk
     
  9. zwcloud

    zwcloud

    Joined:
    Mar 15, 2016
    Posts:
    377
    That's exactly what I'm doing.

    There is no way to dump the world map as a string to list the anchors saved in it, so there is no way to inspect whether the world map contains the user anchors.

    @jimmya

    UPDATE:
    I just printed the anchor identifiers for the loaded world map in ARWorldMap.mm:

    Code (ObjC):

    void worldMap_Dump(ARWorldMap* worldMap)
    {
        if (worldMap == nullptr || !worldMap_GetSupported())
            return;

        NSLog(@"Anchors in the worldMap:");
        for (ARAnchor* anchor in worldMap.anchors) {
            // Use an explicit format string; passing a variable as the
            // format argument is a format-string bug.
            NSLog(@"%@", [anchor.identifier UUIDString]);
        }
    }

    void* worldMap_Load(const char* path)
    {
        if (!worldMap_GetSupported())
            return nullptr;

        NSError* error = nil;
        NSURL* url = [[NSURL alloc] initFileURLWithPath:[NSString stringWithUTF8String:path] isDirectory:false];
        NSData* wmdata = [NSData dataWithContentsOfURL:url options:NSDataReadingMappedAlways error:&error];

        if (error)
            NSLog(@"%@", error);

        ARWorldMap* worldMap = [NSKeyedUnarchiver unarchiveObjectWithData:wmdata];

        worldMap_Dump(worldMap); // prints all anchors

        return (__bridge_retained void*)worldMap;
    }
    And I saw the anchors when loading the world map. But UnityARSessionNativeInterface.ARUserAnchorAddedEvent did not get invoked when this world map was loaded. What could the reason be?

    Created an issue for this: https://bitbucket.org/Unity-Technologies/unity-arkit-plugin/issues/91

    UPDATE:
    Solved. Cause: the new "restarted" session for the loaded world map was started before UnityARCameraManager.Start(), which starts the initial session, so the two conflicted.
    @tkrikorian
     
    Last edited: Dec 6, 2018
  10. tkrikorian

    tkrikorian

    Joined:
    Jan 2, 2018
    Posts:
    4
    Having the same issue as @zwcloud. I'm keeping the ID of an anchor, but it either doesn't get created or it is misplaced when loading the world map. I would love an answer on how ARUserAnchor is supposed to work, because so far it doesn't work as intended for me @jimmya.
     
  11. steveEXC

    steveEXC

    Joined:
    Nov 7, 2014
    Posts:
    11
    Hey guys, I'm having an issue where, upon sharing world-map data with a second phone, the second phone crashes if the first phone had scanned a wall. I don't get any log messages in Xcode or any clue as to why this would be happening. I'm still looking into what could be causing it, but I was wondering if anyone else is encountering this kind of behavior?
     
    whogas likes this.
  12. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Are you maybe using a different configuration when saving vs. loading? E.g. one does both vertical and horizontal plane detection and the other does only horizontal?
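    One way to rule that mismatch out (a minimal sketch, assuming the Unity ARKit plugin's ARKitWorldTrackingSessionConfiguration fields; treat the exact names as illustrative): build the configuration in a single helper shared by the save and load paths, so plane detection and the other options cannot drift apart:

    ```csharp
    using UnityEngine.XR.iOS;

    // Sketch: one factory for the session configuration, used both when
    // the world map is saved and when it is restored, so both runs use
    // identical settings.
    public static class SessionConfigFactory
    {
        public static ARKitWorldTrackingSessionConfiguration Make()
        {
            var config = new ARKitWorldTrackingSessionConfiguration();
            config.alignment = UnityARAlignment.UnityARAlignmentGravity;
            config.planeDetection = UnityARPlaneDetection.HorizontalAndVertical;
            config.getPointCloudData = true;
            config.enableLightEstimation = true;
            return config;
        }
    }
    ```

    When restoring, the same configuration would then be run with the loaded map assigned to it (the plugin's world-map example assigns the map to the configuration and reruns the session with reset-tracking / remove-existing-anchors options), so only the worldMap field differs between the two runs.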
     
    okn200 likes this.
  13. okn200

    okn200

    Joined:
    Dec 7, 2015
    Posts:
    3
    I'm having issues with the image anchor. I've made an AR business card with a 3D asset on one side and a video on a plane on the other side. It tracks the card and keeps the assets matched up on it, but when I remove the card from view the assets remain stuck in the scene. Is there a way to have the assets appear only when the image marker is present and then be destroyed when it disappears, like how Vuforia's extended tracking works?

    Thank you!!
     
    Last edited: Dec 6, 2018
  14. dustin_red

    dustin_red

    Joined:
    Feb 7, 2018
    Posts:
    46
    @jimmya Could you explain the ARFaceAnchor.lookAtPoint values? Are they relative to the screen viewport? I would like to map them to a screen point to show where on the screen you are looking, but I'm not sure how these values translate.

    https://forum.unity.com/threads/how-to-convert-arfaceanchor-lookatpoint-to-a-screen-point.594022/
     
  15. tripodsan

    tripodsan

    Joined:
    May 11, 2016
    Posts:
    2
    I found that using .png textures always produced an error reporting that the image does not have a valid physical size; using a .jpg worked.
     
  16. MarioRuiz

    MarioRuiz

    Joined:
    Nov 7, 2009
    Posts:
    161
    Hi guys, I'm a little late to the party and have read the thread, but I still have problems making the remote work with face tracking. I built the face-tracking-enabled remote following the tutorial on the Unity blog. I get the console connection going, but when I click the connect button in the scene it does nothing or crashes, and the iPhone X stays at "waiting for editor connection...".
    This is what Xcode throws at me:
    NullReferenceException: A null value was found where an object instance was required.

    at UnityEngine.XR.iOS.UnityRemoteVideo_OnPreRender ()

    Anybody else got problems connecting the remote for face tracking?

    Edit: The Xcode project showed libiconv.2.dylib in red. After googling around a bit I found that the .dylib file had been replaced by a .tbd file with the same name, so I changed it. Now my only clue is the Xcode null ref:

    NullReferenceException: A null value was found where an object instance was required.

    at UnityEngine.XR.iOS.UnityRemoteVideo_OnPreRender () [0x00000] in <filename unknown>:0

    It still doesn't work

    EDIT 2: Looking at the comments on Bitbucket, it looks like Jimmy is on the case and the remote will work again soon enough. Thanks in advance @jimmya
     
    Last edited: Dec 12, 2018
  17. VeerpalK

    VeerpalK

    Joined:
    Jul 4, 2018
    Posts:
    14
    Hi,
    I am using vertical plane detection in ARKit for Unity. I was facing an issue where my object was placed on the vertical plane inverted on some axis; I fixed that by adjusting my object's rotation so that it is placed on the vertical plane as expected.
    But now I am facing an issue with translating my object on the vertical plane along the X and Y axes.
    I have attached a Rigidbody to my object, along with this translation script:
    Code (CSharp):

    using UnityEngine.Experimental.XR;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine;

    public class TranslateFinal : MonoBehaviour
    {
        Vector3 dist;
        Vector3 startPos;
        float posX;
        float posZ;
        float posY;

        public TrackableType detectedPlane { get; set; }

        void OnMouseDown()
        {
            startPos = transform.position;
            dist = Camera.main.WorldToScreenPoint(transform.position);
            posX = Input.mousePosition.x - dist.x;
            posY = Input.mousePosition.y - dist.y;
            posZ = Input.mousePosition.z - dist.z;
        }

        void OnMouseDrag()
        {
            float disX = Input.mousePosition.x - posX;
            float disY = Input.mousePosition.y - posY;
            float disZ = Input.mousePosition.z - posZ;
            Vector3 lastPos = Camera.main.ScreenToWorldPoint(new Vector3(disX, disY, disZ));
            transform.position = Vector3.Lerp(transform.position, new Vector3(lastPos.x, lastPos.y, transform.position.z), Time.deltaTime * 25f);
        }

        void OnCollisionEnter(Collision col)
        {
            if (col.gameObject.tag == "Plane")
            {
                Vector3 dir = col.contacts[0].point - transform.position;
                var normal = col.contacts[0].normal;
                if (normal.y > 0)
                {
                    return;
                }
                else
                {
                    dir = -dir.normalized;
                    float deltaX = dir.x > 0 ? 0.1f : -0.1f;
                    float deltaZ = dir.z > 0 ? 0.1f : -0.1f;
                    float deltaY = dir.y > 0 ? 0.1f : -0.1f;
                    transform.position = new Vector3(transform.position.x + deltaX, transform.position.y + deltaY, transform.position.z);
                }
            }
        }
    }
     
  18. daviner

    daviner

    Joined:
    Feb 14, 2018
    Posts:
    10
    ARKit has image-recognition callbacks: ARAnchorAdded() when the image is detected, ARAnchorUpdated() each frame while the image is in view, and ARAnchorRemoved() when the image is no longer in the camera view. Try using ARAnchorRemoved(), and look at GenerateImageAnchor.cs in the UnityARImageAnchor example scene. It's in the plugin on GitHub.
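    A rough sketch of that pattern, modeled on GenerateImageAnchor.cs (referenceImage and prefabToSpawn are placeholder fields, and whether the Removed event actually fires when the image merely leaves view depends on the ARKit version):

    ```csharp
    using UnityEngine;
    using UnityEngine.XR.iOS;

    // Sketch: spawn content when the reference image is found, follow it
    // while tracked, and destroy it when the image anchor is removed.
    public class ImageAnchorContent : MonoBehaviour
    {
        public ARReferenceImage referenceImage; // placeholder: your card's reference image
        public GameObject prefabToSpawn;        // placeholder: your 3D asset / video plane
        private GameObject spawned;

        void OnEnable()
        {
            UnityARSessionNativeInterface.ARImageAnchorAddedEvent += OnImageAdded;
            UnityARSessionNativeInterface.ARImageAnchorUpdatedEvent += OnImageUpdated;
            UnityARSessionNativeInterface.ARImageAnchorRemovedEvent += OnImageRemoved;
        }

        void OnDisable()
        {
            UnityARSessionNativeInterface.ARImageAnchorAddedEvent -= OnImageAdded;
            UnityARSessionNativeInterface.ARImageAnchorUpdatedEvent -= OnImageUpdated;
            UnityARSessionNativeInterface.ARImageAnchorRemovedEvent -= OnImageRemoved;
        }

        void OnImageAdded(ARImageAnchor anchor)
        {
            if (anchor.referenceImageName == referenceImage.imageName)
            {
                spawned = Instantiate(prefabToSpawn,
                    UnityARMatrixOps.GetPosition(anchor.transform),
                    UnityARMatrixOps.GetRotation(anchor.transform));
            }
        }

        void OnImageUpdated(ARImageAnchor anchor)
        {
            if (spawned != null && anchor.referenceImageName == referenceImage.imageName)
            {
                spawned.transform.position = UnityARMatrixOps.GetPosition(anchor.transform);
                spawned.transform.rotation = UnityARMatrixOps.GetRotation(anchor.transform);
            }
        }

        void OnImageRemoved(ARImageAnchor anchor)
        {
            if (spawned != null && anchor.referenceImageName == referenceImage.imageName)
            {
                Destroy(spawned); // hide the assets once the card's anchor goes away
                spawned = null;
            }
        }
    }
    ```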
     
  19. daviner

    daviner

    Joined:
    Feb 14, 2018
    Posts:
    10

    Had a similar issue with this. I wanted to have objects fall vertically, so on my Rigidbody I froze rotation on X, Y and Z. I can't remember exactly, but for vertical planes the axes swap (y becomes z and z becomes y), so you may want to test and change to:

    float disX = Input.mousePosition.x - posX;
    float disY = Input.mousePosition.y - posZ;
    float disZ = Input.mousePosition.z - posY;

    It may be different, but as far as I can remember, in Unity a vertical plane lies along the z axis as a plane would, and that's why I used these orientations instead.
     
  20. BShoyer

    BShoyer

    Joined:
    Sep 19, 2017
    Posts:
    3
    I am interested in creating a new ARReferenceImage at runtime, or modifying an existing one using a screenshot. I have checked a few forums and it seems this is possible natively in ARKit, and I found a few hacky ways to make it work with the Unity plugin, but only for 1.5. Is this possible with the current Unity plugin, or is it a goal to incorporate?
    - Thank You
     
    Last edited: Feb 14, 2019
  21. fr_unity

    fr_unity

    Joined:
    Sep 8, 2016
    Posts:
    8
    @jimmya

    Hi, I am fairly new to this kit, but it looks great.
    I am having problems saving objects using the Unity Object Scanner scene.
    I have successfully built and run the app on my iPhone 8; I can add and modify the object, but when I click Save Object it does not save. Also, the ARReferenceObjects folder does not appear under iTunes > iPhone > File Sharing > UnityARKitPlugin when I plug in my iPhone; I can only see a Unity folder.

    Could you help me understand what I am doing wrong?
    Thank you

    Unity 2018.3.12f1
    Mac 10.14.4
    Xcode 12
    iPhone8 12.2

    EDIT:

    It works, I was selecting the wrong object! Thanks anyway.
     
    Last edited: Apr 15, 2019
  22. Finer_Games

    Finer_Games

    Joined:
    Nov 21, 2014
    Posts:
    33
    @noseeevil Having the same issue at a live installation. Were you ever able to resolve this?

    @jimmya I don't fully understand your reasoning for why this is the expected behaviour and why object tracking should stop working after the app regains focus. I'd certainly expect it to keep working. Is this how Apple's implementation works under the hood? I understand the ARKitPlugin is no longer supported as of June 3rd; is this also how it works in ARFoundation?
     
    Last edited: Jun 5, 2019
  23. zyonneo

    zyonneo

    Joined:
    Apr 13, 2018
    Posts:
    386
  24. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    You should probably try out ARKit3 and see if it works better for you: https://blogs.unity3d.com/2019/06/06/ar-foundation-support-for-arkit-3/
     
  25. Finer_Games

    Finer_Games

    Joined:
    Nov 21, 2014
    Posts:
    33
    @jimmya This required a fairly major Unity upgrade, as well as ARKit -> ARFoundation, macOS, Xcode (beta), and iOS (13 beta) upgrades, but object tracking now seems to be working as expected.
     
    ROBYER1 and jimmya like this.
  26. Finer_Games

    Finer_Games

    Joined:
    Nov 21, 2014
    Posts:
    33
    @jimmya Unfortunately, further testing has revealed edge cases where it does not work as expected. It depends on a mix of the screen sleep timer, Guided Access, whether power is being supplied, and how the device is woken up. Sometimes even forcing the AR Session to restart does not fix it. I will try to post precise repro steps in the tracker, but for anyone coming here via search, FYI...
     
  27. Finer_Games

    Finer_Games

    Joined:
    Nov 21, 2014
    Posts:
    33
    Quick update here for people coming via search.

    The issue only seems to happen when using either tap to wake or raise to wake with Guided Access on. We are disabling both tap and raise to wake in our installation as a temporary workaround.

    In the meantime, I'll try to get more info and see if there are any other scenarios where object tracking does not resume as expected when the device is woken up.