Unity Using ARKit meshing with AR Foundation 4.0

Discussion in 'AR' started by todds_unity, May 6, 2020.

Thread Status:
Not open for further replies.
  1. todds_unity

    todds_unity

    Unity Technologies

    Joined:
    Aug 1, 2018
    Posts:
    168
    The latest preview versions of AR Foundation / ARKit 4.0 provide support for the scene reconstruction feature that became available in ARKit 3.5 and is enabled on the new iPad Pro with LiDAR scanner. It is available on Unity 2019.3 or later.

    This new mesh functionality also requires Xcode 11.4 or later, and it only works on iOS devices with the LiDAR scanner, such as the new iPad Pro.

    Using the LiDAR sensor, ARKit scene reconstruction scans the environment to create mesh geometry representing the real world environment. Additionally, ARKit provides an optional classification of each triangle in the scanned mesh. The per-triangle classification identifies the type of surface corresponding to the triangle's location in the real world.

    AR Mesh Manager

    To use ARKit meshing with AR Foundation, you need to add the ARMeshManager component to your scene.
    arfoundation-mesh-manager.png
    Mesh Prefab

    You need to set the meshPrefab to a prefab that is instantiated for each scanned mesh. The meshPrefab must contain at least a MeshFilter component.

    If you want to render the scanned meshes, you will need to add a MeshRenderer component with a Material to the meshPrefab's game object.

    If you want virtual content to interact physically with the real-world scanned meshes, you will need to add a MeshCollider component to the meshPrefab's game object.

    arfoundation-mesh-prefab.png

    This image demonstrates a mesh prefab configured with the required MeshFilter component, an optional MeshCollider component to allow for physics interactions, and optional MeshRenderer & Material components to render the mesh.
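    As a rough code-level illustration (hypothetical helper; in practice the prefab is configured in the Inspector and assigned to the ARMeshManager's Mesh Prefab field), the same component set could be assembled like this:

```csharp
using UnityEngine;

// Sketch only: builds a GameObject with the components described above.
public static class MeshPrefabSetup
{
    public static GameObject CreateMeshObject(Material material)
    {
        var go = new GameObject("ScannedMesh");

        // Required: the meshing subsystem writes the scanned geometry here.
        go.AddComponent<MeshFilter>();

        // Optional: needed only if you want to render the scanned mesh.
        var meshRenderer = go.AddComponent<MeshRenderer>();
        meshRenderer.sharedMaterial = material;

        // Optional: needed only for physics interactions with the mesh.
        go.AddComponent<MeshCollider>();

        return go;
    }
}
```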

    Normals

    As ARKit constructs the mesh geometry, the vertex normals for the mesh are calculated. If you do not require the mesh vertex normals, disable this setting to save memory and CPU time.

    Concurrent Queue Size

    To avoid blocking the main thread, the tasks of converting the ARKit mesh into a Unity mesh and creating the physics collision mesh (if the meshPrefab's game object contains a MeshCollider component) are moved into a job queue processed on a background thread. concurrentQueueSize specifies the number of meshes to be processed concurrently.
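    A minimal sketch of configuring these settings at runtime and observing mesh updates (the script and the queue size value of 4 are illustrative; normals, concurrentQueueSize, and the meshesChanged event are ARMeshManager members):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Assumes this component sits on the same GameObject as the ARMeshManager.
[RequireComponent(typeof(ARMeshManager))]
public class MeshingSettings : MonoBehaviour
{
    void Start()
    {
        var meshManager = GetComponent<ARMeshManager>();
        meshManager.normals = false;          // skip vertex normals to save memory/CPU
        meshManager.concurrentQueueSize = 4;  // meshes converted concurrently on the background thread
        meshManager.meshesChanged += OnMeshesChanged;
    }

    void OnMeshesChanged(ARMeshesChangedEventArgs args)
    {
        Debug.Log($"added: {args.added.Count}, updated: {args.updated.Count}, removed: {args.removed.Count}");
    }
}
```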

    Other ARMeshManager settings

    For the ARKit implementation, only these three settings (Mesh Prefab, Normals, and Concurrent Queue Size) affect the performance and output of ARKit meshing.

    Meshing behaviors

    Note that it typically takes about 4 seconds after the meshing subsystem starts before the scanned meshes begin to appear.

    Additionally, the LiDAR scanner alone may produce a slightly uneven mesh on a real-world surface. If you add and enable an ARPlaneManager to your scene, then ARKit considers that plane information when constructing a mesh. ARKit smooths out the mesh where it detects a plane on that surface.
     
    thorikawa, newguy123, yty and 2 others like this.
  2. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    220
    Last edited: May 6, 2020
  3. KingOfSnake

    KingOfSnake

    Joined:
    Feb 24, 2013
    Posts:
    31
    It looks like Cloud Build fails because of missing Xcode 11.4 support, which is needed for ARKit 3.5?
     
    StefanoCecere likes this.
  4. todds_unity

    todds_unity

    Unity Technologies

    Joined:
    Aug 1, 2018
    Posts:
    168
    My understanding is that the Cloud Build team is working on upgrading the Xcode Cloud Build to 11.4.
     
    KingOfSnake likes this.
  5. unity_carlos_vasquez

    unity_carlos_vasquez

    Joined:
    May 24, 2020
    Posts:
    2
    Hello, I wanted to try ARKit 3.5 support, so I tried building with AR Foundation 4.0 Preview 3 + AR Subsystems 4.0 Preview 3 + ARKit XR Plugin 4.0 Preview 3 on Unity 2019.3.14f1, compiled for iOS using Xcode 11.2.1.

    The app builds fine, but AR did not work. The camera permission prompt did not pop up, and when I tried to set it manually, the application was not listed in Settings > Privacy > Camera. I tried forcing the camera permission (Application.RequestUserAuthorization(UserAuthorization.WebCam)), but that did not work either.

    Then I rolled back to 3.1.3 and it worked fine; however, I understand that would not be using ARKit 3.5 and would have no occlusion support.

    Any recommendations?
    Thanks!
     
  6. adrian-taylor09

    adrian-taylor09

    Joined:
    Dec 22, 2016
    Posts:
    40
    Hey guys, great work with Version 4 preview.

    I'm trying to use my own material to render the mesh. However, if I use any kind of texture with my material, the mesh renders as a solid black color.

    This occurs with or without the Texture Coordinates checkbox set on the ARMeshManager.

    Is there something specific I need to do within my shader to make this work?
     
  7. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    220
    I'm having the exact same issue, I made a post about it here: https://forum.unity.com/threads/transparent-shader-not-working-on-ar-mesh-ipadpro.899066/

    I believe it's due to the UVs on any AR mesh not being generated correctly, even if the UV option is toggled on in the ARMeshManager. I've submitted a bug report, but their reply was clueless. I wish there was a better way to get S*** fixed quickly in Unity! Forums and bug reports are both too slow!
     
  8. adrian-taylor09

    adrian-taylor09

    Joined:
    Dec 22, 2016
    Posts:
    40
    That's what I figured. Have you tried saving some of the mesh data to a file somewhere, then debugging in the editor?
    That was going to be my next step.
     
  9. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    220
    No I haven’t, that’s a good idea though.
    The next solution I was gonna try was basically making a shader that uses world space UV’s instead of the real ones. But certain effects are still not possible until the UVs are generated properly. For example faded edges etc...

    What I (and probably 95% of AR devs) want from Unity by default is one shader for AR planes and meshes that can do all of the following at the same time, with each toggled on and off at any time via shader parameters:
    1. render shadows without rendering the surface
    2. block shadows without casting shadows (stop shadows from going through doubled over surfaces aka a table over the floor)
    3. Occlude itself
    4. render a repeating texture but have soft faded edges near the edge (a previous shader in the ARCore package did this)

    the issue is that a lot of AR devs out there don’t have the advanced shader writing skill to make one shader that can do all this performantly. Why make everyone repeat that work or be forced to hire a shader programmer? When they could just be scripting their app.
     
    Christin2015 likes this.
  10. vmnetcetera

    vmnetcetera

    Joined:
    Oct 5, 2018
    Posts:
    3
    I've also run into this issue. Enabling Texture Coordinates (or the rest of the properties) in ARMeshManager doesn't work for me either. Is there any news on this?
     
  11. todds_unity

    todds_unity

    Unity Technologies

    Joined:
    Aug 1, 2018
    Posts:
    168
    As I mentioned in the original post, only the three following ARMeshManager settings affect the ARKit implementation:
    • MeshPrefab
    • Normals
    • ConcurrentQueueSize
    ARKit does not support any of the other possible features exposed by ARMeshManager. ARKit does not provide Texture Coordinates, for example.
     
  12. edwon

    edwon

    Joined:
    Apr 24, 2011
    Posts:
    220
    Wow, ok I missed that detail. In that case, it would be nice to have a little note in the inspector next to each feature that says something like "currently Android/Magic Leap only".

     
    ElasticSea and Blarp like this.
  13. nhan_megatran

    nhan_megatran

    Joined:
    Sep 13, 2016
    Posts:
    1
    eco_bach likes this.
  14. LinaEagle

    LinaEagle

    Joined:
    Mar 15, 2018
    Posts:
    3
    Hello, can we save any scan data or mesh object via ARFoundation?

    Thank you in advance!
     
  15. VictorChow_K

    VictorChow_K

    Joined:
    Jan 16, 2019
    Posts:
    9
    @todds_unity -- I've been testing ARKit / ARMeshManager uptime and found some issues worth noting:
    1. Meshing has a memory growth rate of at least 5MB/minute (typically 10-20MB/minute), even if the device is in a fixed position+orientation viewing an unchanging physical location.
    2. In the same fixed position as (1), the generated meshes tend to grow incrementally larger over time in terms of vertex count. Approximately 1-20 vertices per mesh, per minute.
    3. ARMeshManager.DestroyAllMeshes does not reclaim any memory caused by (1) or reduce vertex count as seen in (2).
    4. Despite the memory growth rate and consuming over 4.3GB RAM, the process memory limit on an iPad Pro 4th Gen has never been hit in testing. After 15-30 minutes, the app will 100% hang in Unity's main thread. The Xcode debugger callstack almost always points to a semaphore wait, often within retrieval of ARSession currentFrame. Example call stack below. Other times it will just be in Unity platform-agnostic API wrapping a semaphore "acquire" implementation.
    5. Issues 1, 2, and 4 do not occur if the ARMeshManager is disabled (no memory growth/leak, no inevitable hang).
    Setup:
    • Unity 2019.4.7f1
    • ARFoundation 4.1.0-preview.6
    • ARKit XR Plugin 4.1.0-preview.6
    • Xcode 12 beta 3
    • iPadOS 14 beta 3
    • iPad Pro 12.9" 4th Generation
    I will test again with the latest iPadOS 14 and Xcode 12 beta release (beta 5 at time of post) but wanted to provide some comprehensive initial feedback.
    Sample native Xcode callstack:

    Code (CSharp):
    Thread 1 Queue : com.apple.main-thread (serial)
    #0    0x000000019839eecc in semaphore_wait_trap ()
    #1    0x0000000198267454 in _dispatch_sema4_wait ()
    #2    0x0000000198267aec in _dispatch_semaphore_wait_slow ()
    #3    0x00000001d8851cb0 in -[ARSession currentFrame] ()
    #4    0x000000010945ba20 in -[SessionProvider afterUpdate] at /Users/todd.stinson/Work/arfoundation/com.unity.xr.arkit/Source~/UnityARKit/SessionProvider.m:291
    #5    0x000000010a50dca0 in VirtActionInvoker2<XRSessionUpdateParams_tAA765EB179BD3BAB22FA143AF178D328B30EAD16, Configuration_t47C9C15657F4C18BE99ACC9F222F85EB9E72BF43>::Invoke(unsigned int, Il2CppObject*, XRSessionUpdateParams_tAA765EB179BD3BAB22FA143AF178D328B30EAD16, Configuration_t47C9C15657F4C18BE99ACC9F222F85EB9E72BF43) [inlined] at <redacted>/Classes/Native/Unity.XR.ARSubsystems1.cpp:116
    #6    0x000000010a50dc80 in ::XRSessionSubsystem_Update_m40F8405ECB47FDC56B0B203F09655E3E5F637EFB(XRSessionSubsystem_t9B9C16B4BDB611559FB6FA728BE399001E47EFF0 *, XRSessionUpdateParams_tAA765EB179BD3BAB22FA143AF178D328B30EAD16, const RuntimeMethod *) at <redacted>/Classes/Native/Unity.XR.ARSubsystems1.cpp:13669
    #7    0x000000010a4ec3e4 in ::ARSession_Update_m5F845E6E9DACEF91167155BA894CBA73AFB5BED6(ARSession_tFD6F1BD76D4C003B8141D9B6255B904D8C5036AB *, const RuntimeMethod *) at <redacted>/Classes/Native/Unity.XR.ARFoundation1.cpp:26165
    #8    0x00000001080bfd60 in RuntimeInvoker_TrueVoid_t22962CB4C05B1D89B55A6E1139F0E87A90987017(void (*)(), MethodInfo const*, void*, void**) at <redacted>/Classes/Native/Il2CppInvokerTable.cpp:55531
    #9    0x00000001094024e8 in il2cpp::vm::Runtime::Invoke(MethodInfo const*, void*, void**, Il2CppException**) at /Users/builduser/buildslave/unity/build/External/il2cpp/il2cpp/libil2cpp/vm/Runtime.cpp:545
    #10    0x0000000108b51ba0 in ::scripting_method_invoke() at /Users/builduser/buildslave/unity/build/Runtime/ScriptingBackend/Il2Cpp/ScriptingApi_Il2Cpp.cpp:285
    #11    0x0000000108b5fcf4 in ::Invoke() at /Users/builduser/buildslave/unity/build/Runtime/Scripting/ScriptingInvocation.cpp:273
    #12    0x0000000108b6dba4 in Invoke [inlined] at /Users/builduser/buildslave/unity/build/Runtime/Scripting/ScriptingInvocation.h:68
    #13    0x0000000108b6db90 in CallMethodIfAvailable [inlined] at /Users/builduser/buildslave/unity/build/Runtime/Mono/MonoBehaviour.cpp:424
    #14    0x0000000108b6db60 in ::CallUpdateMethod() at /Users/builduser/buildslave/unity/build/Runtime/Mono/MonoBehaviour.cpp:528
    #15    0x00000001086f6f18 in UpdateBehaviour [inlined] at /Users/builduser/buildslave/unity/build/Runtime/GameCode/Behaviour.cpp:178
    #16    0x00000001086f6f0c in ::CommonUpdate<BehaviourManager>() at /Users/builduser/buildslave/unity/build/Runtime/GameCode/Behaviour.cpp:156
    #17    0x00000001086f6de8 in ::Update() at /Users/builduser/buildslave/unity/build/Runtime/GameCode/Behaviour.cpp:173
    #18    0x00000001088e34c8 in ::Forward() at /Users/builduser/buildslave/unity/build/Runtime/Misc/Player.cpp:1461
    #19    0x00000001088da4e0 in ::ExecutePlayerLoop() at /Users/builduser/buildslave/unity/build/Runtime/Misc/PlayerLoop
     
  16. VictorChow_K

    VictorChow_K

    Joined:
    Jan 16, 2019
    Posts:
    9
    ^ This still occurs with iPadOS 14 beta 5 + Xcode 12 beta 5.
     
  17. todds_unity

    todds_unity

    Unity Technologies

    Joined:
    Aug 1, 2018
    Posts:
    168
    The 4.1.0-preview.7 set of AR Foundation packages are available and fix the meshing memory leak issue.
     
    fumobox and VictorChow_K like this.
  18. artilla

    artilla

    Joined:
    Jul 8, 2017
    Posts:
    13
    Hello, I'm testing the AR mesh samples and want to change the material of the mesh. When I changed the material of RenderedMeshPrefab to Unity's built-in Default-Material, the mesh rendered black. Is there anything I'm missing?
    Thanks in advance.
     
  19. Misnomer

    Misnomer

    Joined:
    Jul 15, 2013
    Posts:
    20
    @VictorChow_K Hey, thanks a lot for such a detailed bug report. I've been having this issue for quite some time, and in my experience it still persists in the .7 preview. Have you tried testing with the latest AR Foundation version? If so, and it was successful, can you please share which version of Unity you used?
     
  20. VictorChow_K

    VictorChow_K

    Joined:
    Jan 16, 2019
    Posts:
    9
    Hi - I still saw the hang in .7 preview.
     
    unearthly85 and Misnomer like this.
  21. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    556
    Misnomer likes this.
  22. unearthly85

    unearthly85

    Joined:
    Jul 30, 2019
    Posts:
    1
    We have seen it as well, although in our case it seems somewhat different: instead of freezing, the app crashes completely and lands you back on the iOS home screen.
     
  23. todds_unity

    todds_unity

    Unity Technologies

    Joined:
    Aug 1, 2018
    Posts:
    168
    The ARKit mesh hang has been fixed in 4.1.0-preview.9.
     
    Misnomer likes this.
  24. Misnomer

    Misnomer

    Joined:
    Jul 15, 2013
    Posts:
    20
    Thanks a lot! It finally works, and I didn't have any lag scanning a pretty big building (290 m^2 and ~280 meshes). I managed to export the meshes in OBJ format too.
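    For reference, a rough (hypothetical, simplified) sketch of how such an export could look; meshManager.meshes is the list of MeshFilters the ARMeshManager maintains, and vertices are written in world space so the individual mesh chunks line up instead of piling into a blob:

```csharp
using System.IO;
using System.Text;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Minimal Wavefront OBJ dump of all meshes tracked by an ARMeshManager.
// OBJ face indices are 1-based and global across the file, hence indexOffset.
public static class MeshObjExporter
{
    public static void Export(ARMeshManager meshManager, string path)
    {
        var sb = new StringBuilder();
        int indexOffset = 0;

        foreach (var meshFilter in meshManager.meshes)
        {
            var mesh = meshFilter.sharedMesh;
            var t = meshFilter.transform;

            // World-space vertices so separate chunks keep their relative placement.
            foreach (var v in mesh.vertices)
            {
                var w = t.TransformPoint(v);
                sb.AppendLine($"v {w.x} {w.y} {w.z}");
            }

            var triangles = mesh.triangles;
            for (int i = 0; i < triangles.Length; i += 3)
            {
                sb.AppendLine($"f {triangles[i] + 1 + indexOffset} " +
                              $"{triangles[i + 1] + 1 + indexOffset} " +
                              $"{triangles[i + 2] + 1 + indexOffset}");
            }

            indexOffset += mesh.vertexCount;
        }

        File.WriteAllText(path, sb.ToString());
    }
}
```

    Note that Unity uses a left-handed coordinate system while many OBJ viewers assume right-handed; axis flipping is omitted here for brevity.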
     
    Blarp likes this.
  25. Misnomer

    Misnomer

    Joined:
    Jul 15, 2013
    Posts:
    20
    @todds_unity Could you say something about meshing limitations in terms of max number of polygons and / or meshes? At least a rough approximation.
     
  26. todds_unity

    todds_unity

    Unity Technologies

    Joined:
    Aug 1, 2018
    Posts:
    168
  27. pwshaw

    pwshaw

    Joined:
    Sep 16, 2014
    Posts:
    13
    Is there a way to place a prefab onto the generated mesh? Something similar to the "Place on Plane" script that works with Plane Detection? Any suggestions?
     
    Blarp likes this.
  28. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    556
    Just made a quick example script for you :)
    Code (CSharp):
    using System.Collections.Generic;
    using JetBrains.Annotations;
    using UnityEngine;
    using UnityEngine.Assertions;
    using UnityEngine.XR.ARFoundation;


    namespace ARFoundationRemoteExample.Runtime {
        public class PlacePrefabOnMesh : MonoBehaviour {
            [SerializeField] ARMeshManager meshManager = null;
            [SerializeField] ARSessionOrigin origin = null;
            [CanBeNull] [SerializeField] GameObject optionalPointerPrefab = null;
            [SerializeField] bool disableObjectsOnTouchEnd = false;

            readonly Dictionary<int, Transform> pointers = new Dictionary<int, Transform>();


            void Update() {
                for (int i = 0; i < Input.touchCount; i++) {
                    var touch = Input.GetTouch(i);
                    var pointer = getPointer(touch.fingerId);
                    var touchPhase = touch.phase;
                    if (touchPhase == TouchPhase.Ended || touchPhase == TouchPhase.Canceled) {
                        if (disableObjectsOnTouchEnd) {
                            pointer.gameObject.SetActive(false);
                        }
                    } else {
                        var ray = origin.camera.ScreenPointToRay(touch.position);
                        var hasHit = Physics.Raycast(ray, out var hit, float.PositiveInfinity, 1 << meshManager.meshPrefab.gameObject.layer);
                        if (hasHit) {
                            pointer.position = hit.point;
                            pointer.rotation = Quaternion.FromToRotation(Vector3.up, hit.normal);
                        }

                        pointer.gameObject.SetActive(hasHit);
                    }
                }
            }

            Transform getPointer(int fingerId) {
                if (pointers.TryGetValue(fingerId, out var existing)) {
                    return existing;
                } else {
                    var newPointer = createNewPointer();
                    pointers[fingerId] = newPointer;
                    return newPointer;
                }
            }

            Transform createNewPointer() {
                var result = instantiatePointer();
                Assert.AreNotEqual(result.gameObject.layer, meshManager.meshPrefab.gameObject.layer, "Pointer layer should not be the same as the mesh prefab layer");
                result.parent = transform;
                return result;
            }

            Transform instantiatePointer() {
                if (optionalPointerPrefab != null) {
                    return Instantiate(optionalPointerPrefab).transform;
                } else {
                    var result = GameObject.CreatePrimitive(PrimitiveType.Sphere).transform;
                    result.localScale = Vector3.one * 0.05f;
                    result.gameObject.layer = LayerMask.NameToLayer("Ignore Raycast");
                    return result;
                }
            }
        }
    }
     
    Last edited: Oct 20, 2020
  29. pwshaw

    pwshaw

    Joined:
    Sep 16, 2014
    Posts:
    13
    Nice! Thanks KirillKuzyk!
     
    KirillKuzyk likes this.
  30. O_Monteiro

    O_Monteiro

    Joined:
    Jul 7, 2015
    Posts:
    3
    Hey guys! Great job! Been messing with this and it's amazing.

    Any idea how I can combine meshing classification data with planeManager data?
    I want to be able to texture the floor mesh I get from classification the same way I can texture a feathered plane. However, using the feathered plane I have no occlusion, and changing the classification floor mesh prefab I don't get a properly repeated texture. Any thoughts on how to do this?
    Should I mix the occlusion meshing sample with the feathered plane one?
    Thanks for any reply,
    Cheers!
     
  31. mkrfft

    mkrfft

    Joined:
    Sep 8, 2017
    Posts:
    10
    How did you go about exporting the data to OBJ? I have been trying for a few days but only get a random blob in the export.


     
    Blarp likes this.
  32. Blarp

    Blarp

    Joined:
    May 13, 2014
    Posts:
    228
    Just got the new iPhone, looking to export some meshes myself.

    obj, gltf, usdz, colors, erase meshes around an object you intended to capture.

    That would pretty much rock our socks off with the lidar scanner.

    Just scanning and sharing the models will be a new form of social media that will definitely take off. Easy acquisition project.
     
  33. BinaryBanana

    BinaryBanana

    Joined:
    Mar 17, 2014
    Posts:
    72
    I hope this is not too trivial a question, but how can I get colors or a texture for that mesh? Is there a sample for that?

    ARMeshManager has that "colors" option to request a color per vertex. I took the NormalMeshes sample, deselected Normals on the ARMeshManager, and selected colors and textureCoordinates. I swapped the shader to an unlit one (see below), and the mesh is white now :)

    I am not a shader expert, so it's probably me missing something, or I misunderstand that "colors" option on the ARMeshManager.

    My poor shader:
    Code (CSharp):
    Shader "Custom/UnlitVertexColor" {
        Properties{
        }

        SubShader{
            Pass {
                CGPROGRAM

                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc" // needed for UnityObjectToClipPos

                struct IN
                {
                    float4 pos : POSITION;
                    float4 color : COLOR;
                };

                struct v2f
                {
                    float4 pos : SV_POSITION;
                    float4 color : COLOR;
                };

                v2f vert(IN input) {
                    v2f o;
                    o.pos = UnityObjectToClipPos(input.pos);
                    o.color = input.color;
                    return o;
                }

                fixed4 frag(v2f i) : COLOR {
                    return i.color;
                }

                ENDCG
            }
        }
    }
    I am on Unity 2020.1.11
     
  34. holykiller

    holykiller

    Joined:
    Nov 1, 2013
    Posts:
    10
    Hello @todds_unity The generated meshes don't have any color or texture. Any idea when this will be implemented, or is there a workaround? The only option I found was the Vuforia SDK (but they require a scanner...), and I would like to do everything with Unity...
     
    Last edited: Nov 18, 2020
  35. ogoguel

    ogoguel

    Joined:
    Nov 22, 2015
    Posts:
    2
    Found some annoying behaviour in ARPlaneManager: if you register for the planesChanged event, the application can maintain a list in sync with the trackables. Disabling the manager stops the event from being called, as the planes are no longer refreshed. Fine.

    Now, if planes were being processed at the precise time you disable the manager, they are added to the trackables anyway, but the event for that new plane is never triggered. This means the application is out of sync, and there's no way to recover (except by polling the trackables - this is what I do as a temporary hack).

    Any thoughts on this? (BTW, using ARKit 4.1.1 & Unity 2020.1.4f1)
    Olivier
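    The polling hack mentioned above could look roughly like this (illustrative sketch; it reconciles an app-side set of planes against the manager's trackables every frame):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Workaround sketch: instead of relying solely on planesChanged, periodically
// compare a locally tracked set of planes against planeManager.trackables to
// catch planes that were added while the manager was being disabled.
public class PlaneListSync : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager = null;

    readonly HashSet<ARPlane> knownPlanes = new HashSet<ARPlane>();

    void Update()
    {
        foreach (var plane in planeManager.trackables)
        {
            // Add returns true only for planes we have never seen via the event.
            if (knownPlanes.Add(plane))
            {
                Debug.Log($"Plane {plane.trackableId} found in trackables but never reported via planesChanged");
            }
        }
    }
}
```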
     
  36. tdmowrer

    tdmowrer

    Unity Technologies

    Joined:
    Apr 21, 2017
    Posts:
    602
    On iOS, ARKit informs us of new planes in an asynchronous callback, so it is possible for it to detect a new plane after you have disabled the plane manager. However, ARFoundation would not "know" about it until after you re-enabled the plane manager. The logic in the ARPlaneManager that updates the list of trackables happens just before it triggers the planesChanged event (and both happen in its Update method), so there should be no way for a plane to exist in the trackables list without having received an added event. Pseudocode:

    Code (CSharp):
    void Update()
    {
        GetAllChangesSinceTheLastUpdate();
        UpdateTrackablesList();
        if (anythingHasChanged)
        {
            FireEvent();
        }
    }
    If I understand correctly, you are saying that a plane is present in the trackables list, but you are not notified via the planesChanged event until after you re-enable the plane manager -- is that right?

    One thing to consider: if one of the subscribers to planesChanged throws an exception, none of the other subscribers will receive a callback; could this be the case?
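    A sketch of guarding against that failure mode (hypothetical handler; if an earlier planesChanged subscriber throws, subscribers registered after it never run, so wrapping the handler body in try/catch keeps one bad handler from silently hiding events):

```csharp
using System;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class SafePlaneHandler : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager = null;

    void OnEnable()  => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        try
        {
            // ... application logic that might throw ...
        }
        catch (Exception e)
        {
            // Log instead of letting the exception break other subscribers.
            Debug.LogException(e);
        }
    }
}
```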
     
  37. ogoguel

    ogoguel

    Joined:
    Nov 22, 2015
    Posts:
    2
    I was trying to show/hide planes even after having disabled the manager, based on my "event" list, not on the trackables. But some planes could not be hidden, hence my assumption that the two lists were out of sync.

    Obviously, AR Foundation manages that case pretty smoothly, so the error must be on my side.

    I have removed my hack and have not run into the issue again (even though the repro was not 100%, I had always managed to reproduce it before): it is likely that, as you suggested, an exception was thrown in my code.
    I've added a try/catch to log the error: if the problem happens again, I shall have more information to investigate.

    Thanks for your time.
     
    tdmowrer and todds_unity like this.
  38. mfuad

    mfuad

    Unity Technologies

    Joined:
    Jun 12, 2018
    Posts:
    305
    Hey everyone, this thread has aged gracefully so we're going to close it. If you have any additional questions/issue, feel free to post a new thread and the team will be more than happy to take a look. Cheers!
     