
ARKit 2.0 Beta support

Discussion in 'AR' started by jimmya, Jun 5, 2018.

  1. jose999

    jose999

    Joined:
    Oct 20, 2015
    Posts:
    2
Hi again Pilltech101,
I found the problem: give the remote app camera permissions.

Now I only see a single frame from the device camera. Could it be related to the USB 2.0 port?
     
  3. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Hi folks,
Really appreciate the people who are on here helping others with common issues they have dealt with. We're getting a lot of posts that repeat known issues or problems already in the pipeline, and having others answer some of the more common support questions helps us keep providing updates. Also, a reminder that the ARKit plugin is open source, and we are looking for people to contribute improvements or bug fixes to the repo (use a pull request).

    Thanks all!
    Your friendly neighborhood XR developers :)
     
    rob_ice and mkusan like this.
  4. harperAtustwo

    harperAtustwo

    Joined:
    Nov 15, 2016
    Posts:
    25
Is there a way to get a callback for when ARKit has successfully relocalized?
     
  5. harperAtustwo

    harperAtustwo

    Joined:
    Nov 15, 2016
    Posts:
    25
I found this in the Apple docs.
    So I guess I could just check against the previous state.
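A minimal sketch of that check-against-previous-state idea, assuming the plugin's ARFrameUpdatedEvent and the trackingState/trackingReason fields on UnityARCamera, and that the ARKit 2.0 branch exposes an ARTrackingStateReasonRelocalizing value (treat these names as assumptions if your branch differs):

Code (CSharp):
using UnityEngine;
using UnityEngine.XR.iOS;

// Fires a callback when the session leaves the "relocalizing" state,
// by comparing each frame's tracking reason against the previous frame's.
public class RelocalizationWatcher : MonoBehaviour
{
    ARTrackingStateReason m_PreviousReason;

    void Start()
    {
        UnityARSessionNativeInterface.ARFrameUpdatedEvent += OnFrameUpdated;
    }

    void OnDestroy()
    {
        UnityARSessionNativeInterface.ARFrameUpdatedEvent -= OnFrameUpdated;
    }

    void OnFrameUpdated(UnityARCamera camera)
    {
        // Transition out of "relocalizing" back to unrestricted tracking.
        if (m_PreviousReason == ARTrackingStateReason.ARTrackingStateReasonRelocalizing &&
            camera.trackingReason == ARTrackingStateReason.ARTrackingStateReasonNone)
        {
            Debug.Log("Relocalization succeeded");
        }
        m_PreviousReason = camera.trackingReason;
    }
}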
     
  6. harperAtustwo

    harperAtustwo

    Joined:
    Nov 15, 2016
    Posts:
    25
I have a .bytes file saved from scanning and saving an ARWorldMap. Can I deserialize this in the editor so I can look at and visualize the world map data in Unity?
     
  7. HulloImJay

    HulloImJay

    Joined:
    Mar 26, 2012
    Posts:
    89
Hi @jimmya, I'm sorry if this is an odd place to ask this, but can you explain the current situation with the "AR XR Plugin" that is available via the Package Manager, the ARKit plugin on the Asset Store, and the various branches (including ARKit 2.0 beta) on Bitbucket?

    Specifically: What's most current, what's compatible between them (changing the plugin without refactoring our own code), and what should we start using now if we want to keep up to date moving forward?
     
  8. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    265
    I've got 2 questions related to mesh planes:

1. How can I have a mesh collider that matches the generated mesh planes and is still performant? I've got a setup that kinda works right now (by modifying the ARKitPlaneMeshRender script) but it's pretty slow. (A sketch of one approach follows below.)

2. How can I stop the UnityARGeneratePlane script from continuing to generate more planes? I want to have an explicit "scan" step in my app; once you've done that, it just locks all the planes in the scene while you move stuff around in it.
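On question 1, a hedged sketch of keeping the collider cheap: reassign MeshCollider.sharedMesh only when ARKit reports a plane-anchor update, instead of every frame. This assumes the plugin's ARAnchorUpdatedEvent and the identifier field on ARPlaneAnchor; the planeAnchorId wiring is a hypothetical detail left to whatever spawns the plane.

Code (CSharp):
using UnityEngine;
using UnityEngine.XR.iOS;

// Keeps a MeshCollider in sync with a generated plane mesh, but only pays
// the collider re-bake cost when ARKit reports a plane-anchor update.
[RequireComponent(typeof(MeshFilter), typeof(MeshCollider))]
public class PlaneColliderSync : MonoBehaviour
{
    public string planeAnchorId; // hypothetical: set by whatever spawns this plane

    MeshFilter m_Filter;
    MeshCollider m_Collider;

    void Start()
    {
        m_Filter = GetComponent<MeshFilter>();
        m_Collider = GetComponent<MeshCollider>();
        UnityARSessionNativeInterface.ARAnchorUpdatedEvent += OnAnchorUpdated;
    }

    void OnDestroy()
    {
        UnityARSessionNativeInterface.ARAnchorUpdatedEvent -= OnAnchorUpdated;
    }

    void OnAnchorUpdated(ARPlaneAnchor anchor)
    {
        if (anchor.identifier != planeAnchorId)
            return;
        // Null-then-assign forces the collider to re-bake the updated mesh.
        m_Collider.sharedMesh = null;
        m_Collider.sharedMesh = m_Filter.sharedMesh;
    }
}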
     


  9. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Short answer is you should be using ARFoundation as this is what we're going to be moving forward with. Long answer might depend on what features you need right now - we're still implementing some of the later platform features as subsystems now. See https://forum.unity.com/forums/handheld-ar.159/ and in particular this thread: https://forum.unity.com/threads/multi-platform-handheld-ar.536313/
     
    SourceKraut and HulloImJay like this.
  10. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
Theoretically, though we have not implemented the editor-side ARWorldMap object or a way to visualize it.
     
    SourceKraut likes this.
  11. me_Stoffel

    me_Stoffel

    Joined:
    Dec 5, 2014
    Posts:
    7
Hey @jimmya, great work - the package works well so far :) I integrated it into a prototype to test the image referencing and everything was quite easy to set up. However, I'm currently struggling to add new image references loaded from disk/a server to the reference set. What I'm trying to do is copy the ARReferenceImagesSet and ARReferenceImage I have attached to the camera manager, and restart the session with a new configuration referencing these newly generated objects:


    Code (CSharp):
// Load image to texture
var fileData = File.ReadAllBytes(imagePath);
var texture = new Texture2D(2, 2);
texture.LoadImage(fileData);

// Create new reference set and image
var refSet = ScriptableObject.Instantiate(CameraManager.detectionImages);
var refImage = ScriptableObject.Instantiate(CameraManager.detectionImages.referenceImages[0]);
refImage.imageName = "newMarker";
refImage.imageTexture = texture;
refImage.physicalSize = 0.21f;
refSet.referenceImages[0] = refImage;
refSet.resourceGroupName = "newRefGroup";

CameraManager.detectionImages = refSet;
CameraManager.Session.RunWithConfig(CameraManager.sessionConfiguration);
Which ends up telling me in Xcode:

[General] No resource group with name “newRefGroup” found

Which makes sense, I guess, as these images are preprocessed by Xcode when they are part of the Unity build, as far as I understand it.

    What would be the correct way to add or switch image references at runtime with an image loaded from disk (e.g. streaming assets)? Should it be possible in general and is there something available to do it within Unity?
     
    Last edited: Jul 19, 2018
  12. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    It should be possible to do, but we have not added this functionality yet.
     
    SourceKraut likes this.
  13. smallbit

    smallbit

    Joined:
    Oct 3, 2013
    Posts:
    60
Hi, I am trying to implement multiplayer using an ARWorldMap. I played quite a lot with the shared spheres example, and I must say I am a bit disappointed by its quality. The best precision I could ever achieve between two devices was around 40 cm (let's say I drop the ball on the table corner and the other device sees it 40 cm away); most of the time it's much more, and 1 out of 5 times, after a few minutes, it gets completely desynchronized. Is this the best this tech can do, or are there ways to improve the result? If so, what are they? Send the world map periodically? Or maybe this is the fault of AR initialization. Thanks for any pointers.
     
  14. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
Seems like you are not getting the expected results. You can see from the video of the demo that my spheres show up exactly where they were laid down in front of the other device when I create them. I'm not sure what you might be doing differently: you might want to scan the area around which you are tracking for a while before you start.
     
  15. smallbit

    smallbit

    Joined:
    Oct 3, 2013
    Posts:
    60
Actually, the demo video shows the same problem (but at a smaller scale). Check the second ball being dropped (time 0:02) on the iPhone X screen: it should be seen exactly where the other device is, but instead it goes to the left 10-15 cm, which shows even more when the iPhone changes perspective seconds later. The desync can also be seen later in the video. The demo is great, but I was just wondering what kind of precision we can get. I also have a question about the "WaitForLocationSync" state of the demo project: is there any action the user can take to improve its time? Every time I tried, it took a long time to get past, regardless of how long and precisely I scanned. What's the best practice to improve it? Is it better to sync the devices from the same point of view, or does it not matter?

    Thanks for your help.
     
  16. me_Stoffel

    me_Stoffel

    Joined:
    Dec 5, 2014
    Posts:
    7
    rob_ice likes this.
  17. BocataDeClavos

    BocataDeClavos

    Joined:
    May 12, 2016
    Posts:
    1
Hi, I'm having a problem with the UnityARObjectAnchor example.
When I try to run it on my iPad, it crashes at launch. It's weird because I can run other examples without any problem, but this particular scene (and UnityTongueAndEyes) keeps crashing.
    This is the log from Xcode:

2018-07-24 16:30:42.251708+0200 arkitscene[998:104612] [MC] System group container for systemgroup.com.apple.configurationprofiles path is /private/var/containers/Shared/SystemGroup/systemgroup.com.apple.configurationprofiles
2018-07-24 16:30:42.252114+0200 arkitscene[998:104612] [MC] Reading from public effective user settings.
2018-07-24 16:30:43.475772+0200 arkitscene[998:104766] [Technique] ARWorldTrackingTechnique(0x105fad730, 0x10612fc00) error setting options for object detection: DeprecatedData, {
    ObjectsDataIdentifiers = (
        "9A43C2BF-C9D6-7051-F369-905BF91A6582"
    );
    ObjectsDatas = (I omitted a huge amount of hex numbers here) <…>
2018-07-24 16:30:43.490755+0200 arkitscene[998:104766] [Session] Session (0x105f9ee40): did fail with error: Error Domain=com.apple.arkit.error Code=301 "Invalid reference object." UserInfo={NSLocalizedDescription=Invalid reference object., NSLocalizedFailureReason=The reference object data is not in a format supported by this version of ARReferenceObject.}

I'm using Unity version 2018.1.1f1; I also tried a few others, but it didn't make any difference.
I installed iOS 12 and Xcode 10, set the target to iOS 12, etc., but I just can't get this to work.
    I would really appreciate it if someone could give me any ideas, because I don't know what else to try.
    Thanks in advance!
     
  18. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Looks like you might not have the latest version of iOS 12 beta on your device.
     
  19. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
Try this out yourself instead of judging from the video - the perspective might be playing tricks on you, and there might be a slight delay between the creation of the balls and their appearance on the other player's device. BTW, I put in an offset of 2cm from the front of the camera to try to make the ball visible when created - you can change this in ExamplePlayerScript.cs
     
    rob_ice likes this.
  20. necron4a

    necron4a

    Joined:
    Oct 6, 2017
    Posts:
    1
Hi, I started learning this plugin and I have a question. In the example scenes the cube (RandomCube) does not stay fixed; it jitters. Is this normal, or am I doing something wrong? Thanks.

2017.4.3f1
iPad 6
iOS 12 beta 4
Xcode 10 beta

Update: the whole environment is shaking synchronously
     
    Last edited: Jul 25, 2018
  21. zyonneo

    zyonneo

    Joined:
    Apr 13, 2018
    Posts:
    386
Hi. I was looking forward to doing a project in ARKit 2. I want to know how object tracking is done. Is there any tutorial? I don't know what we should provide as object data (for example, CAD data) for our device to recognize while tracking. I did not find any object tracking tutorial.
     
  22. zyonneo

    zyonneo

    Joined:
    Apr 13, 2018
    Posts:
    386
Hey, I tried the UnityARObjectAnchor.unity scene. I made the build on an iPad Pro (2017 model) running iOS version 12.0 (16A5327f), using Unity 2018.1.1f1, ARKit 2.0, and Xcode 10 beta 4. But when I ran the build on the device, after the Made with Unity splash screen a black screen appears and it stops. Below is the Xcode debug error report.

    @jimmya
    Code (CSharp):
2018-07-27 17:08:16.195988+0530 ObjTra[701:87170] [DYMTLInitPlatform] platform initialization successful
2018-07-27 17:08:16.235735+0530 ObjTra[701:87129] Built from '2018.1/release' branch, Version '2018.1.1f1 (b8cbb5de9840)', Build type 'Release', Scripting Backend 'il2cpp'
2018-07-27 17:08:16.238396+0530 ObjTra[701:87129] -> registered mono modules 0x103c2ad70
-> applicationDidFinishLaunching()
2018-07-27 17:08:16.389638+0530 ObjTra[701:87129] Metal GPU Frame Capture Enabled
2018-07-27 17:08:16.390538+0530 ObjTra[701:87129] Metal API Validation Disabled
2018-07-27 17:08:16.451085+0530 ObjTra[701:87129] [Warning] Trying to set delaysTouchesBegan to NO on a system gate gesture recognizer - this is unsupported and will have undesired side effects
-> applicationDidBecomeActive()
GfxDevice: creating device client; threaded=1
Initializing Metal device caps: Apple A10X GPU
Initialize engine version: 2018.1.1f1 (b8cbb5de9840)
UnloadTime: 3.424208 ms
2018-07-27 17:08:19.038964+0530 ObjTra[701:87129] [MC] System group container for systemgroup.com.apple.configurationprofiles path is /private/var/containers/Shared/SystemGroup/systemgroup.com.apple.configurationprofiles
2018-07-27 17:08:19.039200+0530 ObjTra[701:87129] [MC] Reading from public effective user settings.
-> applicationWillResignActive()
2018-07-27 17:08:19.677594+0530 ObjTra[701:87343] [Technique] ARWorldTrackingTechnique(0x12fd3b5a0, 0x130053e00) error setting options for object detection: DeprecatedData, {
    ObjectsDataIdentifiers = (
        "9A43C2BF-C9D6-7051-F369-905BF91A6582"
    );
    ObjectsDatas = (
        <b156494f 44617461 62617365 53616d70 6c650101 01ce5e41 ca040101 99cbbf85 34645a43 755ecbbf efff74c4 b24229cb bf74b6dd 2ef38aca cbbfeffe 18d1ad39 21cb3f85 657aeb09 889ccbbf 934e0322 b32ef2cb 3f935b88 d5f11074 cb3f73e8 f4fc1f3a 80cbbfef fe7078b1 ead993cb 3f90c88a 47ecfe9b cb3fb79f a97e132b 56cb3f33 15867a02 249d01af 50657273 70656374 6976654c 656e731b cb406ff9 f60f74a0 74cb0000 00000000 0000cb40 <>
2018-07-27 17:08:19.680284+0530 ObjTra[701:87343] [Session] Session (0x12ff733f0): did fail with error: Error Domain=com.apple.arkit.error Code=301 "Invalid reference object." UserInfo={NSLocalizedDescription=Invalid reference object., NSLocalizedFailureReason=The reference object data is not in a format supported by this version of ARReferenceObject.}
-> applicationDidBecomeActive()
-> applicationWillResignActive()
-> applicationDidEnterBackground()
     
    esoinila likes this.
  23. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
The object data that is saved in the project is incompatible with the latest ARKit SDK version, which you downloaded when you installed Xcode 10 beta 4. You can create your own new reference object using either Apple's sample 3D scanning app or the UnityObjectScanner example.
     
    zyonneo and SourceKraut like this.
  24. smallbit

    smallbit

    Joined:
    Oct 3, 2013
    Posts:
    60
My original post's concerns were based on trying it myself; then you claimed the video proves it works accurately. This post was just to tell you that's untrue.
     
  25. Vobbler

    Vobbler

    Joined:
    Jan 22, 2018
    Posts:
    1
    rob_ice likes this.
  26. dnoparker

    dnoparker

    Joined:
    Aug 28, 2013
    Posts:
    63
I have scanned an object and used the .arobject file to track it using the Unity example scene. It finds the object pretty well, but there seems to be a performance issue.
The app jitters/freezes up every few seconds. The Xcode debug says:

I haven't made any changes to the scripts provided. I am simply using my own .arobject that I scanned with this app.

    Any ideas?
     
    Last edited: Aug 2, 2018
  27. yesbradd

    yesbradd

    Joined:
    Apr 4, 2018
    Posts:
    7
Hi everyone,

Has anyone had success with persistent AR using the new world map? So far I've got it working by just spawning the saved objects at their old locations, which worked once it had relocalized. That isn't ideal, because if you go to a different place from where the original world map was generated, they will still be there.

So I wanted to switch to anchors, which is what Apple suggests here:

https://developer.apple.com/documentation/arkit/creating_a_multiuser_ar_experience

This seems like the best option. Just wondering if anyone knew how to go about it this way, instead of just saving their positions.

    Cheers
     
    Last edited: Aug 2, 2018
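A hedged sketch of the anchor route, assuming the plugin's user-anchor events work as in its user-anchor example (ARUserAnchorAddedEvent firing both for anchors you add and for anchors restored when a loaded map relocalizes; the field names are assumptions): give each placed object its own user anchor before saving the map, then respawn content when the anchors come back.

Code (CSharp):
using UnityEngine;
using UnityEngine.XR.iOS;

// Respawns content whenever a user anchor is added or restored from a
// loaded world map. Assumes one prefab type; a real app would map
// anchor identifiers to specific content.
public class AnchorRestorer : MonoBehaviour
{
    public GameObject contentPrefab;

    void Start()
    {
        UnityARSessionNativeInterface.ARUserAnchorAddedEvent += OnUserAnchorAdded;
    }

    void OnDestroy()
    {
        UnityARSessionNativeInterface.ARUserAnchorAddedEvent -= OnUserAnchorAdded;
    }

    void OnUserAnchorAdded(ARUserAnchor anchor)
    {
        // Assumes the plugin has already converted the anchor transform to
        // Unity space; column 3 of the matrix holds the position.
        Vector3 position = anchor.transform.GetColumn(3);
        Instantiate(contentPrefab, position, Quaternion.identity);
    }
}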
  28. zulaman

    zulaman

    Joined:
    May 12, 2017
    Posts:
    26
Hi there, I was able to get the UnityObjectScanner to build on my iPhone X, but it doesn't seem to function the same way as in the video. The tracking points don't show up and I can't place the bounding box to scan the object.
First, I have to click the "Detect Objects" button to start the camera. Then I tap to create the box, but it just doesn't line up with the planes (which I assume are not being detected). Also, if I tap with two fingers the app crashes right away.
    Any ideas? Thanks
     
    esoinila likes this.
  29. andwoo1

    andwoo1

    Joined:
    Feb 27, 2017
    Posts:
    6
    Issue
I encountered the same issue as ifeltdave https://forum.unity.com/threads/arkit-2-0-beta-support.534639/#post-3522275 where building an ARKit 1.5 project with Xcode 10.0 beta 5 targeting iOS 11.3 caused iOS 12.0/ARKit 2.0 devices to appear to freeze if the app used image targets. The actual app was not frozen; just the camera feed froze on the first frame, and the debug error log below was printed.

    Code (CSharp):
[Session] Session (0x1140c0790): did fail with error: Error Domain=com.apple.arkit.error Code=300 "Invalid reference image." UserInfo={NSLocalizedFailureReason=One or more reference images have an invalid size: [list of images in set], NSLocalizedRecoverySuggestion=Make sure that all reference images are greater than 100 pixels and have a positive physical size in meters., ARErrorItems=( [list of images in set]), NSLocalizedDescription=Invalid reference image.}
    Root Cause
The error log was printed right after the StartWorldTrackingSessionWithOptions method executed in ARSessionNative.mm. The images in Xcode did have valid physical sizes, so I put some logs into the StartWorldTrackingSessionWithOptions method right after the referenceImages were set.

    Code (CSharp):
for(ARReferenceImage* img in config.detectionImages) {
    NSLog(@"%@ width = %f, height = %f", [img name], [img physicalSize].width, [img physicalSize].height);
}
All my image-set images reported as 0.0 by 0.0:

image_name width = 0.000000, height = 0.000000

This ain't good, but when setting the deployment target to 12.0, the sizes reported correctly on iOS 12.0 devices. I needed to support both 1.5 and 2.0 image detection, so setting the deployment target to 12.0 would not really fix my issue.

    Resolution
Just as a test, I set the physicalSize of the image to some predetermined values to see if it would work, and it did!

    Code (CSharp):
for(ARReferenceImage* img in config.detectionImages) {
    if([img physicalSize].width <= 0 || [img physicalSize].height <= 0) {
        float newWidth = 0.12f;
        float newHeight = 0.18923f;

        CGSize newSize = CGSizeMake(newWidth, newHeight);
        NSValue *rectValue = [NSValue valueWithCGSize:newSize];
        [img setValue:rectValue forKey:@"physicalSize"];
    }
}
Setting positive, non-zero values as the physicalSize unfroze the camera, and image detection functioned as intended. I modified my version of the plugin to take in a backup list of physical sizes to use in case sizes of 0 came up.

I didn't dig deeper into this due to time constraints, so I'm unsure whether it's a bug in the Unity plugin or in Xcode/ARKit.

    Tested On
    Xcode - 9.3, 9.4 and 10.0 beta 5
    Unity - 2017.4.1f1
    iOS - 11.4 and 12.0
     
  30. kengorou2014

    kengorou2014

    Joined:
    Jun 12, 2018
    Posts:
    8
Hello,
I'm looking for a way to save virtual objects that are related to an ARWorldMap.
In my situation, there are several different ARWorldMaps, each with a different arrangement of virtual objects, so each ARWorldMap's virtual objects have to be distinguished from those of the other ARWorldMaps. Can you tell me how to solve this problem? Thank you.
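A minimal sketch of one way to do this, assuming you save your own metadata next to each world map; the plugin does not store your virtual objects for you, so a common approach is one JSON file per map (the SavedObject/MapObjects/MapObjectStore types here are hypothetical helpers, not plugin API):

Code (CSharp):
using System.Collections.Generic;
using System.IO;
using UnityEngine;

// Hypothetical helper: persists the virtual objects for one world map in a
// JSON file sharing the map's name, e.g. "kitchen.worldmap" + "kitchen.json".
[System.Serializable]
public class SavedObject
{
    public string prefabName;
    public Vector3 position;
    public Quaternion rotation;
}

[System.Serializable]
public class MapObjects
{
    public List<SavedObject> objects = new List<SavedObject>();
}

public static class MapObjectStore
{
    static string PathFor(string mapName)
    {
        return Path.Combine(Application.persistentDataPath, mapName + ".json");
    }

    public static void Save(string mapName, MapObjects data)
    {
        File.WriteAllText(PathFor(mapName), JsonUtility.ToJson(data));
    }

    public static MapObjects Load(string mapName)
    {
        string path = PathFor(mapName);
        if (File.Exists(path))
            return JsonUtility.FromJson<MapObjects>(File.ReadAllText(path));
        return new MapObjects();
    }
}

After the session relocalizes against a given map, load the matching JSON and instantiate the listed objects; the saved positions line up again because they are expressed in that map's coordinate space.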
     
  31. andwoo1

    andwoo1

    Joined:
    Feb 27, 2017
    Posts:
    6
Replying to my previous post https://forum.unity.com/threads/arkit-2-0-beta-support.534639/page-3#post-3585982: I think the issue might be Unity-related. I tried a basic Swift image-detection tutorial, set the deployment target to 11.3, deployed to a 12.0 device, and the images had valid sizes and the camera feed didn't freeze.

Do you think there's a difference between the Swift and Objective-C referenceImages methods under the hood?

    swift
    Code (CSharp):
let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "resource_group_name", bundle: nil)
    objective-c
    Code (CSharp):
NSSet<ARReferenceImage *> *referenceImages = [ARReferenceImage referenceImagesInGroupNamed:@"resource_group_name" bundle:nil];
     
  32. TechnicalArtist

    TechnicalArtist

    Joined:
    Jul 9, 2012
    Posts:
    736
    Hey,

When I enable Use HDR in the graphics settings, my app crashes.
     
  33. Finer_Games

    Finer_Games

    Joined:
    Nov 21, 2014
    Posts:
    33
Just a heads up for anyone experiencing no change in the camera feed (freezing, crashing, etc.) after the first frame when using image tracking: you need to make sure your deployment target is set to 12.0 in Xcode. Importantly, setting the target minimum iOS version to 12.0 in Unity (at least in 2018.2.1f1) sets the actual deployment target in Xcode to 11.3. This stumped me for a good day or two, because we have automated Jenkins builds running for all of our apps, and some new 2.0 features -- like saving and loading world maps -- were working even though the deployment target had been 11.3.

    I also have a question for @jimmya re: anchors. Our use case is we want users to be able to scan a room to create a world map, find images within the room to set reference points for key content, and save those points as anchors within the world map for persistence between sessions. I'm noticing I get no callbacks for image anchors when loading a saved world map. Confusingly, the ARAnchorAdded/Updated/Removed events seem to only apply to plane and user anchors, and are the only types that get added/updated callbacks when a world map is loaded. I can't seem to retrieve much data from the world map itself, aside from center and extents, which I believe change throughout the session and seem like an unreliable way of anchoring an object (unless I'm mistaken and the center is fixed?). I don't want to rely on plane anchors either, for much the same reasons.

Are plane and user anchors the only types that get callbacks when loading a saved world map? If so, is there a recommended way, other than managing a store of relationships between image anchors and user anchors, for tracking and restoring image anchors between sessions, e.g. when a world map is retrieved?
     
    Last edited: Aug 10, 2018
    amasinton likes this.
  34. brainbreaker

    brainbreaker

    Joined:
    Jun 25, 2016
    Posts:
    1
@jimmya @all I'm having trouble running ARKit Remote. Not sure if this is the right place to ask, but I would appreciate any input.

I tried to run the EditorTestProject scene after downloading the plugin from the first post of this thread. I am building a debug version with the deployment target set to 12.0, using Xcode 10.0 beta 5 & Unity 2018.2.1f1. My device (iPhone 7 Plus running iOS 12.0 beta) connects successfully, but both the editor and the app freeze after sending the first frame.

    XCode logs are available here - https://gist.github.com/brainbreaker/955e5698492d2e066ca96caa4e09a519
    Unity console doesn't show any relevant error logs.

    My question is also available here - https://answers.unity.com/questions/1541209/arkitremote-freezes-after-sending-first-frame.html
     
    unity_LC68BHbrmVgH5Q likes this.
  35. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    I'll be taking a look at this and other bugs with the plugin this week.
     
    unity_LC68BHbrmVgH5Q likes this.
  36. amasinton

    amasinton

    Joined:
    Aug 12, 2006
    Posts:
    137
    I'd LOVE to have some insight into this, too.

I'm trying to align a virtual building interior to its real-world counterpart, but getting access to common features in the world map point cloud is problematic. For now, I'm trying to use image tracking to set the position/scale/orientation of the model, and the worldmap simply to track the user's position in the space. But it would be nice to be able to register the worldmap point cloud to a model, and then stabilise it with reference to image anchors - all done by the designer beforehand (in the Unity Editor, using a saved worldmap) so that the user just opens the app, relocalizes the device, and the model appears at the correct position, scale, and orientation.

    There must be a way to do this?
     
  37. Finer_Games

    Finer_Games

    Joined:
    Nov 21, 2014
    Posts:
    33
@jimmya cheers for looking into some of this. The commits from the 14th addressed some of the issues we were having restoring user anchors (they were always being set to the world/session origin). Thank you so much!

    Question re: user anchors: When I add a user anchor (right now, with a position we're getting from an image target), then move the user anchor (also, from a position we're getting from an image target), and then load the world map, the user anchors are restored to the position at which they were initially added.

    Is there a way to update the positions of user anchors that have been added? Or, should we instead be removing and re-adding user anchors whenever we want to update their positions (and have them re-added at the new positions when a world map is loaded)?
     
    Last edited: Aug 15, 2018
  38. kengorou2014

    kengorou2014

    Joined:
    Jun 12, 2018
    Posts:
    8
Hello,
I've been using Unity for three months and there is still a lot I don't know, so I have a question.
I am trying to implement multiplayer using a serialized ARWorldMap. I load an ARWorldMap saved on my iPhone beforehand, convert it to byte[], and try to share it with Photon, but the byte[] is different from what was saved on the iPhone. I have confirmed that the byte[] does not change before and after sending/receiving with Photon.
The code goes like this. In this code, no errors occur for remoteworldmap.center or remoteworldmap.extent, but an error occurs after that.
    Code (CSharp):
ARWorldMap localworldMap = ARWorldMap.Load(worldpath);
photonView.RPC("ReceiveWorldMap", PhotonTargets.OthersBuffered, localworldMap.SerializeToByteArray());

[PunRPC]
void ReceiveWorldMap(byte[] world_bytes)
{
    ARWorldMap remoteworldmap = ARWorldMap.SerializeFromByteArray(world_bytes);

    if (remoteworldmap != null)
    {
        m_LoadedMap = remoteworldmap;
        Debug.LogFormat("Map loaded. Center: {0} Extent: {1}", remoteworldmap.center, remoteworldmap.extent);

        UnityARSessionNativeInterface.ARSessionShouldAttemptRelocalization = true;

        var config = m_ARCameraManager.sessionConfiguration;
        config.worldMap = remoteworldmap;
        UnityARSessionRunOption runOption = UnityARSessionRunOption.ARSessionRunOptionRemoveExistingAnchors | UnityARSessionRunOption.ARSessionRunOptionResetTracking;

        Debug.Log("Restarting session with worldMap");
        session.RunWithConfigAndOptions(config, runOption);
    }
}
Please tell me a good way to do this.
     
  39. mkrfft

    mkrfft

    Joined:
    Sep 8, 2017
    Posts:
    22
Since the ARKit 2.0 beta, I don't get any hits by looking just for
    Code (CSharp):
ARHitTestResultType.ARHitTestResultTypeEstimatedVerticalPlane
    Is there a better way to look for only vertical planes?
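A hedged alternative: hit-test against existing planes and keep only results whose plane anchor is vertical. This assumes the plugin's HitTest API, the anchorIdentifier field on ARHitTestResult, and the alignment field on ARPlaneAnchor; the planeAnchors dictionary is bookkeeping you fill from the plugin's anchor events.

Code (CSharp):
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.iOS;

public class VerticalPlaneHitTest : MonoBehaviour
{
    // Bookkeeping of plane anchors, filled from the plugin's anchor events.
    readonly Dictionary<string, ARPlaneAnchor> planeAnchors = new Dictionary<string, ARPlaneAnchor>();

    void Start()
    {
        UnityARSessionNativeInterface.ARAnchorAddedEvent += a => planeAnchors[a.identifier] = a;
        UnityARSessionNativeInterface.ARAnchorUpdatedEvent += a => planeAnchors[a.identifier] = a;
        UnityARSessionNativeInterface.ARAnchorRemovedEvent += a => planeAnchors.Remove(a.identifier);
    }

    // Hit-tests existing planes and returns the first result whose
    // anchor is vertically aligned.
    public bool TryHitVerticalPlane(ARPoint point, out ARHitTestResult verticalHit)
    {
        var results = UnityARSessionNativeInterface.GetARSessionNativeInterface().HitTest(
            point, ARHitTestResultType.ARHitTestResultTypeExistingPlaneUsingExtent);

        foreach (var result in results)
        {
            ARPlaneAnchor anchor;
            if (planeAnchors.TryGetValue(result.anchorIdentifier, out anchor) &&
                anchor.alignment == ARPlaneAnchorAlignment.ARPlaneAnchorAlignmentVertical)
            {
                verticalHit = result;
                return true;
            }
        }
        verticalHit = default(ARHitTestResult);
        return false;
    }
}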
     
  40. brunzero

    brunzero

    Joined:
    Jul 10, 2015
    Posts:
    8
Hey there. Is there any way for me to run an ARImageTrackingConfiguration session?
     
  41. deets

    deets

    Joined:
    Feb 10, 2015
    Posts:
    11
When I try to access worldmap.pointCloud.Points to use them in my visualisation, the app crashes.
I have been breaking my head over this one; it seems so simple, yet it crashes every time I try to access the point cloud.
Can anyone help me out?


_nativeSession.GetCurrentWorldMapAsync(OnWorldMap);

void OnWorldMap(ARWorldMap worldMap)
{
    if (worldMap != null)
    {
        //_WorldMap = worldMap;
        Debug.LogFormat("ARWorldMap Assigned");
        worldMap.Save(Path.Combine(Application.persistentDataPath, "myFirstWorldMap.worldmap"));
        Debug.LogFormat("ARWorldMap SAVED");
        if (worldMap.pointCloud.Points != null)
            OnWorldMapPointCloudUpdated(worldMap.pointCloud.Points);
        else
            Debug.LogFormat("Pointcloud.Points is null");
    }
}
     
    esoinila likes this.
  42. CharlesBarros

    CharlesBarros

    Joined:
    Nov 17, 2011
    Posts:
    61
Is it possible to use the tongue detection (which only works with the front camera) while we have an ARSessionNativeInterface running on the rear camera?
I was trying to use the facial information of the user holding the phone to animate a virtual character on the screen, positioned in the rear camera's scene.

    Thanks!
     
    rob_ice likes this.
  43. amasinton

    amasinton

    Joined:
    Aug 12, 2006
    Posts:
    137
    I've been trying to do this too, and here's what I have discovered over the past week:

    1. You can't get at the worldmap.pointcloud.points. At all. Not even getting the length of the points array. When you try to access the points array (for anything), your app will crash. ARWorldMap is effectively a closed object, so leave it alone (except for saving and loading, obviously).

2. BUT you can get the cloud of feature points from the ARCamera. These feature points are not used in the worldmap itself; they are purely for visual feedback. UnityARCamera.pointCloudData is an array of Vector3 which gets rewritten every frame. If you want to display the pointcloud in a cumulative way as the user moves around a space, you will need to add to a master list of Vector3 every frame (see the sketch after this list). This can get very large very quickly (I've built up about 5 million points during a 5-minute session), so you will want to perform some sort of filtering on your point cloud so that you're dumping most of your data (90% removed seems to be a good rule of thumb), which leaves just enough density for your user to understand the feedback they are getting while not burning up all of your memory or overwhelming the user.

    3. You can also write out the array of camera pointcloud points to a file which you can download from the device and import into various external 3d editing/creation apps as reference to the worldmap you have recorded in a previous session. The camera pointcloud exists in the worldmap's coordinate system, so anything you align to that exported point cloud will be displayed in the correct place in the scene in-device, when a new session is run and the device relocalizes. Which is backward from the way most AR development workflows seem to be designed, but for my project needs, this is exactly what we have to do. We build AR experiences for specific spaces, not AR for any space (which seems to be more standard).

4. The camera pointcloud is very, very noisy. Also, it is clear that sometimes the device misunderstands the shape of the world it is mapping, and so introduces distortions. On a more positive note, the orientation of the pointcloud (and the worldmap) seems to take into account both gravity and compass heading, and the units seem to be in meters (roughly). I've attached an image of the pointcloud generated from a mapping session I did earlier this week in a medieval church. You can see just how noisy the cloud is (but also kind of pretty).
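A minimal sketch of the accumulate-and-filter idea from point 2, assuming UnityARCamera.pointCloudData is a Vector3[] as described above; the keep-every-tenth filter is an arbitrary stand-in for the ~90% cull rule of thumb:

Code (CSharp):
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.iOS;

// Accumulates camera feature points across frames, keeping only a fraction
// of each frame's points so memory doesn't blow up during long sessions.
public class CumulativePointCloud : MonoBehaviour
{
    public int keepEveryNth = 10; // ~90% of points discarded
    readonly List<Vector3> m_AllPoints = new List<Vector3>();

    void Start()
    {
        UnityARSessionNativeInterface.ARFrameUpdatedEvent += OnFrame;
    }

    void OnDestroy()
    {
        UnityARSessionNativeInterface.ARFrameUpdatedEvent -= OnFrame;
    }

    void OnFrame(UnityARCamera camera)
    {
        if (camera.pointCloudData == null)
            return;
        // pointCloudData is rewritten every frame, so copy the survivors out.
        for (int i = 0; i < camera.pointCloudData.Length; i += keepEveryNth)
            m_AllPoints.Add(camera.pointCloudData[i]);
    }
}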

    So, in a way, I have just answered my question from last week - see above.

    I'd be very interested if anyone else has something to share (or correct) in terms of the point clouds.

    I hope this is helpful!

    MLPointCloudSC00Crop.jpg
     
  44. MSFX

    MSFX

    Joined:
    Sep 3, 2009
    Posts:
    116
Is it possible to produce a build that supports ARKit 2.0 but will also fall back and work on iOS 11 with a limited feature set? I'm assuming not, but I have heard rumours otherwise...? Thanks for the great work as always, guys!!
     
  45. aporier

    aporier

    Joined:
    Jun 15, 2016
    Posts:
    5
@jimmya I am trying to create a project using World Mapping in ARKit 2.0 with Unity 2018.2.4f1, Xcode 10 beta, and iOS 12 beta on an iPhone X. Everything builds in Unity and in Xcode, but when I try to use the app, the world map saves but doesn't load. I know it's saving because I can see the .worldmap file in iTunes and in iExplorer under the Documents folder.
Xcode gives me this error when I try to load the map:

    Code (CSharp):
Got hit!
x:0.185453 y:-0.211173 z:0.096843
ARWorldMap saved to /var/mobile/Containers/Data/Application/665C2F78-8AE0-493E-9864-433BFBDAF161/Documents/myFirstWorldMap.worldmap
Loading ARWorldMap /var/mobile/Containers/Data/Application/665C2F78-8AE0-493E-9864-433BFBDAF161/Documents/myFirstWorldMap.worldmap
2018-08-28 17:29:16.506067-0500 WorldMap[629:56626] Error Domain=NSCocoaErrorDomain Code=257 "The file “Documents” couldn’t be opened because you don’t have permission to view it." UserInfo={NSFilePath=/var/mobile/Containers/Data/Application/665C2F78-8AE0-493E-9864-433BFBDAF161/Documents, NSUnderlyingError=0x283ca0db0 {Error Domain=NSPOSIXErrorDomain Code=13 "Permission denied"}}
    I haven't changed the code from the original World Map example. I have only changed the objects under the Hit Cube Parent.
    Thanks!
     
  46. MrMatthias

    MrMatthias

    Joined:
    Sep 18, 2012
    Posts:
    191
    You have to remove and add a new one. The transform of the anchor is read-only and there is no update method:
    https://developer.apple.com/documentation/arkit/aranchor/2867981-transform?language=objc
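A minimal sketch of that remove-and-re-add pattern, assuming the plugin's RemoveUserAnchor and AddUserAnchorFromGameObject methods and the identifierStr field behave as in the plugin's user-anchor example (treat the names as assumptions):

Code (CSharp):
using UnityEngine;
using UnityEngine.XR.iOS;

public static class UserAnchorMover
{
    // "Moves" a user anchor by deleting the old one and creating a fresh
    // anchor at the GameObject's new pose; ARAnchor transforms are read-only.
    public static string MoveAnchor(string oldAnchorId, GameObject anchoredObject, Vector3 newPosition)
    {
        var session = UnityARSessionNativeInterface.GetARSessionNativeInterface();

        session.RemoveUserAnchor(oldAnchorId);
        anchoredObject.transform.position = newPosition;

        // Return the new anchor's identifier so callers can keep tracking it.
        UnityARUserAnchorData newAnchor = session.AddUserAnchorFromGameObject(anchoredObject);
        return newAnchor.identifierStr;
    }
}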
     
    jimmya likes this.
  47. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
Curious what you're using user anchors for. You should not be moving user anchors directly - instead, if you want to move one, you should remove the original and create a new one where you want it. Once you create a user anchor, only ARKit is supposed to move it (based on any changes to the AR data it detects). [edit: just noticed @MrMatthias answered your question - thx]
     
    Last edited: Aug 30, 2018
  48. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
It should already work that way, I believe - there is an IsARKit2_0_Supported method that can help you fall back to simpler cases if needed.
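A minimal sketch of that fallback, using the IsARKit2_0_Supported check jimmya names above (where the method lives and its exact casing are assumptions from the post; SaveWorldMap/SavePositionsOnly are hypothetical helpers):

Code (CSharp):
using UnityEngine;
using UnityEngine.XR.iOS;

public class FeatureGate : MonoBehaviour
{
    void SavePersistentState()
    {
        // Gate ARKit 2.0-only features (e.g. ARWorldMap) behind the runtime
        // check so the same build still runs, reduced, on iOS 11.
        if (UnityARSessionNativeInterface.IsARKit2_0_Supported())
        {
            SaveWorldMap();       // hypothetical helper using ARWorldMap
        }
        else
        {
            SavePositionsOnly();  // hypothetical iOS 11 fallback
        }
    }

    void SaveWorldMap() { /* ... */ }
    void SavePositionsOnly() { /* ... */ }
}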
     
  49. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    No - ARKit limitation.
     
    CharlesBarros likes this.
  50. Finer_Games

    Finer_Games

    Joined:
    Nov 21, 2014
    Posts:
    33
    @jimmya Hey Jimmy, thanks for the reply. I figured this was the case, as it's how Hololens and other XR platforms work.

    We're doing an installation -- ideally, on-site staff could scan image targets once to place the content, save the map, then have the map loaded -- and those anchors restored -- when the app is relaunched, resumed, etc. Additionally, they could periodically tweak the positions of things and update the anchor positions stored accordingly.

It seems that image anchors do not get saved/restored, only user and plane anchors. Our idea for a work-around was to still use image targets to get/set positions, and then add user anchors to save/restore them. The problem here is that there isn't an anchor store we can rely on/query, or a component that manages it for you (eg. Hololens), so we need to roll our own foolproof way of keeping the anchors we think we've saved -- and those that are actually stored in the world map -- in sync.

    If anyone has any clever ideas... else I'll reply here in a couple weeks and let everyone know how we decided to do it. ;)
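In the meantime, one hedged sketch of such a store, assuming the user-anchor events referenced earlier in the thread (every type and helper below is hypothetical, not plugin API): persist a map from content keys to the anchor identifiers you added, then reconcile it against the anchors that actually come back when the world map loads.

Code (CSharp):
using System.Collections.Generic;
using System.IO;
using UnityEngine;
using UnityEngine.XR.iOS;

// Hypothetical anchor store: maps content keys ("poster_left") to the user
// anchor identifiers we added, persisted so it survives app restarts.
public class AnchorStore : MonoBehaviour
{
    [System.Serializable]
    public class Entry { public string contentKey; public string anchorId; }

    [System.Serializable]
    public class Store { public List<Entry> entries = new List<Entry>(); }

    Store m_Store = new Store();
    readonly HashSet<string> m_RestoredAnchors = new HashSet<string>();

    string StorePath
    {
        get { return Path.Combine(Application.persistentDataPath, "anchors.json"); }
    }

    void Start()
    {
        if (File.Exists(StorePath))
            m_Store = JsonUtility.FromJson<Store>(File.ReadAllText(StorePath));
        // Track which anchors actually come back after a world map loads.
        UnityARSessionNativeInterface.ARUserAnchorAddedEvent += a => m_RestoredAnchors.Add(a.identifier);
    }

    public void Remember(string contentKey, string anchorId)
    {
        m_Store.entries.Add(new Entry { contentKey = contentKey, anchorId = anchorId });
        File.WriteAllText(StorePath, JsonUtility.ToJson(m_Store));
    }

    // After loading a world map, call this to find entries whose anchors
    // never came back, so stale records can be pruned or content re-placed.
    public IEnumerable<Entry> MissingEntries()
    {
        foreach (var e in m_Store.entries)
            if (!m_RestoredAnchors.Contains(e.anchorId))
                yield return e;
    }
}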