
ARKit support for iOS via Unity-ARKit-Plugin

Discussion in 'ARKit' started by jimmya, Jun 5, 2017.

  1. Pilltech101

    Pilltech101

    Joined:
    Jul 25, 2014
    Posts:
    13
    Please, I really need help with this.
     
  2. Ross_S

    Ross_S

    Joined:
    Jul 10, 2012
    Posts:
    29
    Hello... So... I'm running ARKit and it's all working just fine, except that the planes it detects are not sitting right on the surfaces. They seem to be offset up in the air by about half a meter? There's no position data in the plane prefab used by generate planes... just wondered if anybody had a clue? Thanks
    EDIT: My bad... the AR Camera had a translation on it in the scene... duh.
     
    Last edited: Dec 2, 2018
    jimmya likes this.
  3. dorukeker

    dorukeker

    Joined:
    Dec 6, 2016
    Posts:
    30
    Thanks for the suggestions. Here is some more info and also the solution we decided to apply.
    The project is already in the Gamma color space. We ended up modifying the default YUVMaterial's shader and adding a brightness control to it. (The shader is attached for anyone who needs it.)

    On top of this we added PostProcessing effects to the AR Camera to make the overall color brighter and the color temperature warmer.

    A little note on the project: it is made for an exhibition space, meaning the app will be used in the same space and with the same lighting conditions all the time. That makes a solution like the above feasible. Hope this helps someone.
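    For anyone taking a similar approach, the brightness uniform on the modified YUV material could be driven from C# along these lines (a minimal sketch; the _Brightness property name is an assumption and depends on what your shader calls it):

    Code (CSharp):
    using UnityEngine;

    public class CameraBrightness : MonoBehaviour
    {
        // The modified YUVMaterial with a _Brightness property added (hypothetical name).
        public Material yuvMaterial;

        [Range(0.5f, 2f)]
        public float brightness = 1.2f;

        void Update()
        {
            // Push the inspector value into the shader every frame.
            yuvMaterial.SetFloat("_Brightness", brightness);
        }
    }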

    Cheers,
    Doruk
     

    Attached Files:

    jimmya likes this.
  4. karsnen

    karsnen

    Joined:
    Feb 9, 2014
    Posts:
    65
    An issue while trying to reset a session

    Scenario

    If there is any session interruption, I have to restart the session. Hence I destroy/disable the planes and point cloud data. The issue comes in with one event.

    Session Configuration

    Code (CSharp):
    internal ARKitWorldTrackingSessionConfiguration currentConfiguration
    {
        get
        {
            return new ARKitWorldTrackingSessionConfiguration
            {
                planeDetection = UnityARPlaneDetection.Horizontal,
                alignment = UnityARAlignment.UnityARAlignmentGravity,
                getPointCloudData = true,
                enableLightEstimation = true,
                enableAutoFocus = true,
                maximumNumberOfTrackedImages = 0,
                environmentTexturing = UnityAREnvironmentTexturing.UnityAREnvironmentTexturingNone
            };
        }
    }
    The session is reset as follows:

    Code (CSharp):
    UnityARSessionNativeInterface.GetARSessionNativeInterface().Pause();

    // Does nothing but destroy the planes in the LinkedListDictionary
    GameObject.FindObjectOfType<InhouseAR.PlaneController>().DestroyExistingPlanes();

    // Disables the prefabs instantiated in response to the point cloud data on ARFrameUpdatedEvent
    GameObject.FindObjectOfType<PointCloudController>().RestartPointCloudData();

    var _configuration = CurrentSessionConfiguration.currentConfiguration;
    if (_configuration.IsSupported)
    {
        thisWorldMappingSession.RunWithConfig(_configuration);
    }

    Issue

    Code (CSharp):
    public static event ARAnchorAdded ARAnchorAddedEvent;
    public static event ARAnchorUpdated ARAnchorUpdatedEvent;
    public static event ARAnchorRemoved ARAnchorRemovedEvent;

    Question

    In the above events, after I disable the planes I keep getting the anchor-updated event. I would like to know:
    1. How do I start getting the ARAnchorAdded event again once a new anchor has been established?
    2. In what scenario would the ARAnchorRemoved event be called?
    Probable conclusion
    My understanding so far is that it is not possible to control ARAnchorAdded; instead the entire Unity scene has to be restarted for new anchors to be added.
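    One thing I still plan to try: rerunning the session with run options that reset tracking and remove existing anchors, which (if I understand the plugin) should make ARAnchorAddedEvent fire again for newly detected planes. A sketch, assuming the plugin's UnityARSessionRunOption flags:

    Code (CSharp):
    var session = UnityARSessionNativeInterface.GetARSessionNativeInterface();

    // Rerun the current configuration, discarding tracking state and existing anchors,
    // so planes are re-detected and ARAnchorAddedEvent fires for them again.
    session.RunWithConfigAndOptions(
        CurrentSessionConfiguration.currentConfiguration,
        UnityARSessionRunOption.ARSessionRunOptionResetTracking |
        UnityARSessionRunOption.ARSessionRunOptionRemoveExistingAnchors);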
     
  5. mkusan

    mkusan

    Joined:
    Nov 22, 2016
    Posts:
    7
    Hello!

    Did anyone try and succeed in saving the cubemap faces as PNGs? I'd like to extract the cubemap that ARKit generates with ML and save it to the app data. I can only get it to save 6 black photos, and I get "Texture '' has no data" in the console log. I put this code in ReflectionProbeGameObject.cs from the UnityAREnvironmentTexture example.

    Code (CSharp):
    Texture2D tex = new Texture2D(latchedTexture.width, latchedTexture.height, TextureFormat.RGB24, false);
    CubemapFace[] faces = new CubemapFace[] {
            CubemapFace.PositiveX, CubemapFace.NegativeX,
            CubemapFace.PositiveY, CubemapFace.NegativeY,
            CubemapFace.PositiveZ, CubemapFace.NegativeZ };
    try {
        foreach (CubemapFace face in faces) {
            tex.SetPixels(latchedTexture.GetPixels(face));
            tex.Apply();
            byte[] data = tex.EncodeToPNG();
            if (data != null) {
                File.WriteAllBytes(Application.persistentDataPath + "/" + latchedTexture.name + face.ToString() + ".png", data);
            }
        }
    } catch {
        Debug.Log("Error");
    }
     
    Burglecut likes this.
  6. cam415

    cam415

    Joined:
    Mar 26, 2014
    Posts:
    6
    @jimmya - Is it possible to track multiple images simultaneously? In UnityARCameraManager.cs there is only an ARKitWorldTrackingSessionConfig, which according to Apple's documentation can only track one image at a time, whereas ARImageTrackingConfiguration can track up to 2 images simultaneously. Does Unity have a configuration for ARImageTracking instead of ARWorldTracking?
     
  7. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    792
    On the page you linked above, it shows that you can specify in the ARKitWorldTrackingSessionConfig how many simultaneous images you want to track ("maximumNumberOfTrackedImages"). I believe you can specify quite a few and are not limited to 2.
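    A minimal sketch of such a configuration (the field name matches the plugin's ARKitWorldTrackingSessionConfiguration; the value 4 is just an example, not a documented limit):

    Code (CSharp):
    ARKitWorldTrackingSessionConfiguration config = new ARKitWorldTrackingSessionConfiguration
    {
        planeDetection = UnityARPlaneDetection.Horizontal,
        enableLightEstimation = true,
        // Track up to 4 reference images simultaneously.
        maximumNumberOfTrackedImages = 4
    };
    UnityARSessionNativeInterface.GetARSessionNativeInterface().RunWithConfig(config);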
     
  8. Futurristic

    Futurristic

    Joined:
    Jun 21, 2016
    Posts:
    29
    I'm getting this error while doing a build in Xcode:

    Undefined symbols for architecture arm64:
    "_OBJC_CLASS_$_ARWorldMap", referenced from:
    objc-class-ref in UnityARKit.a(WorldMapManager.o)

    Please help. Thanks.
     
  9. dgifgghh

    dgifgghh

    Joined:
    Oct 29, 2018
    Posts:
    2
    Hey.
    Looks like this advice is outdated. After reloading the scene, the previously found "debugPlane" objects are not re-displayed. How do I fix this? I need to reload the ARKit scene and scan again.
     
  10. williamdarma

    williamdarma

    Joined:
    Nov 7, 2015
    Posts:
    2
    Hello guys, this is maybe a stupid question...
    I'm currently working on a project that uses UnityARWorldMap.
    Usually I record the mapping and then save the map.
    Then I tried to load the saved map, and it loaded successfully:
    visually, you can see the recorded blue map appear on your device when it has loaded.
    But is there any indicator in script so we can know when the map has loaded (like a bool or something)?
    Thanks
     
  11. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    792
    You can look for ARWorldMapStatus - it should go from relocalizing to mapped.
     
    Burglecut likes this.
  12. daviner

    daviner

    Joined:
    Feb 14, 2018
    Posts:
    10
    Looking back at your original request, it looks like you wanted to clear existing anchors when reloading the scene. Using RunOptions will remove the anchors, and that is why your debugPlane is not re-displayed. By "reload", do you mean rescanning the ARKit scene to recover the last planes? That would mean saving and loading the map; if that is what you want to do, look at the WorldMaps scenes. Or could you please expand on your point?
     
  13. AmarBagh

    AmarBagh

    Joined:
    Sep 27, 2015
    Posts:
    4
    Hey Guys, I'm working on this simple (for now) AR project. My client wants me to use ARKit. I took a picture of a brick wall with a dress on it and am using it as the image anchor.

    I tested the difference between the Vuforia and ARKit SDKs.

    ARKit has a hard time recognizing it (stuttering, resizing, etc.), but Vuforia seems to work perfectly. I'm using the basic image anchor scene, and a cube as the generated prefab.

    Does anyone have experience with this? Is Vuforia better at image tracking? I know a picture of a space is unconventional, but if it works in Vuforia shouldn't it work in ARKit?
     
  14. daviner

    daviner

    Joined:
    Feb 14, 2018
    Posts:
    10

    ARKit is really good at image tracking, as good if not better than Vuforia, so I'm not sure about your results. This seems pretty weird since you are using ARKit's sample scene. Have you tried it with the sample image anchor texture? It could be your image that is affecting tracking, although I'm not sure why it would.

    For your image, could you check:
    1. Is your reference image added to your reference image set?
    2. Regarding resizing: does your AR reference image have the right physical size?
    3. Regarding stuttering: make sure your ARAnchorUpdate() is working correctly, like in the sample scene.
     
  15. Adam_Chernick

    Adam_Chernick

    Joined:
    Feb 18, 2016
    Posts:
    1
    Do we still handle this the same way? Seems like a common option that will need to be toggled. Thanks for the help!
     
  16. AmarBagh

    AmarBagh

    Joined:
    Sep 27, 2015
    Posts:
    4
    1. My reference image is added to the image set. All I did was swap the white logo for my personalized anchor to keep it easy.
    2. My reference image size is set to 1 in Unity, then the appropriate size in Xcode before I build.
    3. I am using the sample scene without changing ARAnchorUpdate().

    Could it be something with my import settings? I'm using a .jpg, texture type Default, and everything else seems fine in the settings.

    Lastly, my ARImageSet is set up appropriately. It doesn't need to sit in the scene, right?

    Also, my image anchor currently sits in a folder directly under Assets; does it need to be in the UnityARImageAnchor > ReferenceImages folder?

    Thanks so much for the reply.
     
  17. daviner

    daviner

    Joined:
    Feb 14, 2018
    Posts:
    10
    You do not have to set the size in Xcode; you can do so in Unity. Look at the images attached. One shows the ARReferenceSubSet which the config will run; it has the Resource Group Name set and includes 1 element in the Reference Images list, the ImageReferenceImage (bad naming, I know). Inside this you can specify the size of the image. The other attached image shows it has 3 public properties to edit in the inspector: Image Name, Image Texture and Physical Size. Make sure:

    1. The image texture, when imported into Unity, is set as Sprite (2D and UI).
    2. The image name and image texture both have the same name, as this is used for string comparison in ARAnchor.cs. The functionality in your project seems to work, so from a technical point of view, if your project is set up as such, it should work.

    Your tracking problem may be the physical size of the marker being entered incorrectly. I would debug and log the ARAnchorUpdate function to make sure it is logged every frame; by doing so you can see how frequently the image is being tracked. I haven't looked at this in a while, but you could check that UpdateImageAnchor() forces the image to be tracked every frame by adding the ReferenceImage as a public var and changing your UpdateImageAnchor() to...

    void UpdateImageAnchor(ARImageAnchor arImageAnchor)
    {
        if (arImageAnchor.referenceImageName == ReferenceImage.imageName)
        {
            count++;
            Debug.Log("updating image " + count);
            myImageObject.transform.position = UnityARMatrixOps.GetPosition(arImageAnchor.transform);
            myImageObject.transform.rotation = UnityARMatrixOps.GetRotation(arImageAnchor.transform);
        }
    }
     

    Attached Files:

  18. daviner

    daviner

    Joined:
    Feb 14, 2018
    Posts:
    10
    @jimmya I continued working on this and now save my AR map on a server. I do this by sending serializableARWorldMap.arWorldMapData (I made this public in ARWorld.cs so I could access it), as it contains all the information for my AR map. I encode it to a string using Convert.ToBase64String(serializableARWorldMap.arWorldMapData) and it saves perfectly fine. When loading my byte array from the server I parse it using Convert.FromBase64String(encodedMapData).

    My project is split into 2 apps: 1. a creator app (saving to the server) and 2. an end-user app (loading from the server). I load my old map data from the server and assign it to a new map:

    serializableARWorldMap map = new serializableARWorldMap(oldMapData);
    camManager.RunConfigWithMap(map);

    By doing so, many functions in ARWorld.cs are called. I see it working as follows:

    1. My byte array is initialised when I call serializableARWorldMap(oldMapData).
    2. camManager.RunConfigWithMap starts a new session with my new map. It converts my serializableARWorldMap to ARWorldMap using the implicit operator.
    3. This calls ARWorldMap.SerializeFromByteArray(serializableARWorldMap.mapData) to serialize my data into an array and handle memory.
    4. This in turn calls ARWorldMap(newMapPtr), which I do check for null; it is not null and is assigned to the global ptr value.

    I understand how this works, and the issue is that it fails to relocalize. TrackingState goes to relocalizing and fails. I wonder if my map is actually loaded correctly at all. I want to know why it fails to relocalize, as I do not get any errors. Is this why my map Center and Extent values are incorrect, and why no blue planes appear? I tried running the map config with and without plane detection, and still my last map's planes didn't appear.
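    The round trip described above, sketched end to end (camManager, RunConfigWithMap and the public arWorldMapData field follow the modifications described in this post, so treat the names as assumptions):

    Code (CSharp):
    // Creator app: grab the raw world-map bytes and encode them for upload.
    byte[] rawMap = serializableARWorldMap.arWorldMapData;   // field made public
    string encoded = System.Convert.ToBase64String(rawMap);
    // ... upload `encoded` to the server ...

    // End-user app: download `encoded`, decode, and rebuild the map.
    byte[] oldMapData = System.Convert.FromBase64String(encoded);
    serializableARWorldMap map = new serializableARWorldMap(oldMapData);
    camManager.RunConfigWithMap(map);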
     
    Last edited: Jan 17, 2019
  19. daviner

    daviner

    Joined:
    Feb 14, 2018
    Posts:
    10
    Figured it out! My ARWorldMap data was not saving on the server correctly. If you follow these steps it will work.
     
  20. rjvromans

    rjvromans

    Joined:
    Oct 14, 2018
    Posts:
    5
  21. Jelmer123

    Jelmer123

    Joined:
    Feb 11, 2019
    Posts:
    39
    Hi

    When I'm using ARFoundation in a project and want to use an ARKit-specific feature for the iOS builds (AREnvironmentProbeAnchor), what is the right approach?
    I don't see it in the ARKit package, so right now I have also imported the ARKit plugin from Bitbucket into my project. So now I have ARFoundation, the ARKit XR package, the ARCore XR package and the ARKit plugin. This seems wrong :D
     
  22. mavisakal

    mavisakal

    Joined:
    Aug 26, 2017
    Posts:
    15
    Can we mix ARKit and ARKit 2 features in the same scene? I mean, I'm using UnityARKitHitTest in a scene and want to use object detection from ARKit 2 in the same scene. Is it possible?
     
  23. BShoyer

    BShoyer

    Joined:
    Sep 19, 2017
    Posts:
    3
    I am interested in creating a new ARReferenceImage at runtime, or modifying an existing one using a screenshot. I have checked a few forums and it seems this is possible natively in ARKit, and I found a few hacky ways to make it work using the Unity plugin, but only for 1.5. Is this possible to accomplish with the current Unity plugin, or is this a goal to incorporate?
    - Thank You
     
  24. IanSMV

    IanSMV

    Joined:
    May 30, 2015
    Posts:
    1
    Anyone had any luck with this? I've tried everything I can think of.
     
  25. rjvromans

    rjvromans

    Joined:
    Oct 14, 2018
    Posts:
    5
    Hi guys,

    My previous post:
    https://forum.unity.com/threads/ark...nity-arkit-plugin.474385/page-58#post-4209892

    was due to the fact that I had added an assembly definition file to the ARKit plugin.

    After removing it, the examples started working again. Phew :)

    But,

    I still need this assembly definition, since I want to reference the ARKit plugin from my own assembly. How are you guys working around this? o_O

    Another person ran into the same issue:

    https://forum.unity.com/threads/arkit-plugin-assembly-definition-files-broken-xcode-project.541306/

    Please help

    Best regards,

    Ronald
     
    Last edited: Mar 6, 2019
  26. petitbas

    petitbas

    Joined:
    Sep 9, 2014
    Posts:
    4
    I don't have access to ARWorldMapStatus. Do you mean "ARWorldMapRequestStatus"? sessionSubsystem.GetWorldMappingStatus()?
     
  27. zyonneo

    zyonneo

    Joined:
    Apr 13, 2018
    Posts:
    218
  28. daviner

    daviner

    Joined:
    Feb 14, 2018
    Posts:
    10
    No, I think he meant ARWorldMappingStatus, which is an enum that tells you the current status of mapping. It can be: 1. ARWorldMappingStatusNotAvailable 2. ARWorldMappingStatusLimited 3. ARWorldMappingStatusExtending 4. ARWorldMappingStatusMapped. This can be accessed every AR frame to check what your mapping status is; have a look at UpdateWorldMappingStatus.cs.

    For your issue, I would subscribe to the frame update event by calling UnityARSessionNativeInterface.ARFrameUpdatedEvent += CheckWorldMapStatus; when loading and running your map. This way you can see in your code that it has relocalised, as ARWorldMappingStatusMapped will appear. I use a different way: add an ARUserAnchor to your AR object before you save the map, and when you load it call UnityARSessionNativeInterface.ARUserAnchorAddedEvent += MapAnchorAdded; to see in your code that the object has been detected.
     
  29. thorikawa

    thorikawa

    Joined:
    Dec 3, 2013
    Posts:
    6
    I have a question about ARKitRemote. Is there any way to see the *processed* frame on my phone? Currently I only see the raw camera image on my phone while the processed frame appears on the PC (Unity Editor), but I would like to check the actual game scene on my phone to see how users will experience it in a real situation. Is that possible?
     
  30. yuta_yoshida_ts

    yuta_yoshida_ts

    Joined:
    Dec 11, 2017
    Posts:
    1
    Hello

    I have a question about the camera image of the iPhone that can be obtained with ARKit.
    We want to use AR with the camera's viewing angle narrowed a little further. Is it possible, through ARKit, to change the viewing angle of the iPhone's camera to match the viewing angle of Unity's camera?

    Sorry for the inconvenience, but please answer.
     
  31. smeagols

    smeagols

    Joined:
    Oct 5, 2013
    Posts:
    34
    Hi,
    Is there any way to zoom the AR camera?
    I'm using the UnityARImageAnchor scene and I need to detect images at 2 meters, and I don't know if it's possible to zoom in so the image appears bigger and can be tracked.

    Thank you.
     
  32. zyonneo

    zyonneo

    Joined:
    Apr 13, 2018
    Posts:
    218
    Is object recognition supported from an FBX model or CAD drawing in ARKit? Can we convert an FBX or any 3D model into an ARReferenceObject (.arobject)? An .arobject is produced by scanning feature points :( so what if we only have an FBX or CAD drawing?
     
  33. pedrobarrosbrasil

    pedrobarrosbrasil

    Joined:
    Apr 4, 2019
    Posts:
    8
    Hello guys, I think I need some help here...
    The app I'm working on is a three-scene app.
    One is an Animoji-like experience, using the face tracking abilities the plugin offers.
    The second one is an image tracking experience (much like Vuforia's, but using ARKit).
    The third one would be something like a gallery.

    The problem I'm stuck with is changing between the scenes. I want it to be as smooth as possible, but I'm getting crashes switching back and forth between the two AR experiences, and a 100% crash rate while trying to load the "Gallery" scene (which is empty for now).
    I've read a few pages of this thread and saw that other people had similar issues, but never found a solution that worked for me.

    I get this from xCode:

    Unloading 0 Unused Serialized files (Serialized files now loaded: 0)
    UnloadTime: 1.793458 ms
    System memory in use before: 19.0 MB.
    System memory in use after: 18.5 MB.

    Unloading 13 unused Assets to reduce memory usage. Loaded Objects now: 665.
    Total: 1.108208 ms (FindLiveObjects: 0.209708 ms CreateObjectMapping: 0.030208 ms MarkObjects: 0.527375 ms DeleteObjects: 0.339750 ms)

    libc++abi.dylib: terminating with uncaught exception of type Il2CppExceptionWrapper


    Also, I'm new to programming; any feedback is appreciated.
    Unity version: 2018.3.9f1
    Xcode: Version 10.1 (10B61)
    Device: iPhone XR, iOS 12.1.2
    ARKit plugin version: latest from Bitbucket
     
    karsnen and Chris_1001 like this.
  34. Tarrag

    Tarrag

    Joined:
    Nov 7, 2016
    Posts:
    110
    Hi all,

    When running an example with ARKitRemote I'm getting constant stack trace spam. I set Stack Trace Logging to None for all log types in Player Settings, but the messages still show up. Here's one:

    Autoconnected Player 1036800 vs 165077
    UnityEngine.XR.iOS.CompressionHelper:ByteArrayCompress(Byte[])
    UnityEngine.XR.iOS.UnityRemoteVideo:OnPreRender()
    (Filename: ./Runtime/Export/Debug/Debug.bindings.h Line: 48)

    It makes debugging by looking at the console with ARKitRemote on quite painful.

    I've looked throughout the posts but couldn't find a reference to a solution. I hope this is the right place to post this as it relates to ARKitRemote.

    Can someone please shed some light?

    Thank you for your help,

    Sergio
    on latest UnityARKit plugin, Unity 2019.1.0f1
     
    Last edited: Apr 16, 2019
  35. eco_bach

    eco_bach

    Joined:
    Jul 8, 2013
    Posts:
    1,353
    Hi
    Can anyone tell me why the default camera view in my deployed ARKit app looks too light and washed out?
    Is this something I can change in Project Settings?
     
  36. Blarp

    Blarp

    Joined:
    May 13, 2014
    Posts:
    188
    Switch from gamma to linear... or the opposite, I forget.
     
    jimmya likes this.
  37. jimmya

    jimmya

    Unity Technologies

    Joined:
    Nov 15, 2016
    Posts:
    792
  38. eco_bach

    eco_bach

    Joined:
    Jul 8, 2013
    Posts:
    1,353
  39. Chris_1001

    Chris_1001

    Joined:
    Jan 16, 2018
    Posts:
    10
     
  40. Chris_1001

    Chris_1001

    Joined:
    Jan 16, 2018
    Posts:
    10
  41. eco_bach

    eco_bach

    Joined:
    Jul 8, 2013
    Posts:
    1,353
    So what is everyone doing in regards to the Baked Lightmaps in Prefabs issue? There is a LONG thread discussing it here: https://forum.unity.com/threads/problems-with-instantiating-baked-prefabs.324514/page-9

    Since by default ARFoundation expects a prefab to instantiate, I am sure other people must have faced this issue.
    The only workaround I can come up with is to never instantiate prefabs; instead, simply have your prefab hidden and/or invisible when you launch. This requires a simple modification to the PlaceOnPlane script.
     
    Last edited: Apr 22, 2019
  42. Blarp

    Blarp

    Joined:
    May 13, 2014
    Posts:
    188

    I've completely removed the lighting and shadows from any and all AR projects and have the artist bake lighting and shadows into the texture(s). I'll swap materials based on time of day or whatever.

    That's how I dealt with that, lol.

    Lights bleed too much in AR, plus all kinds of annoying issues that I decided to never deal with again. Also, my AR experiences aren't RPG large-scale environments, usually a few models or a small environment.

    I have even put models and lights inside 3D boxes where no light is supposed to escape (shaders and whatnot), but that's not what happens.
     
    Last edited: Apr 22, 2019
  43. eco_bach

    eco_bach

    Joined:
    Jul 8, 2013
    Posts:
    1,353
    Thanks. Not the answer I was hoping for :) I will try my kludge and let you know how it goes.
     
    Blarp likes this.
  44. Blarp

    Blarp

    Joined:
    May 13, 2014
    Posts:
    188
    GODSPEED FELLOW ADVENTURER

     
  45. zyonneo

    zyonneo

    Joined:
    Apr 13, 2018
    Posts:
    218
    How do I stop plane detection and remove detected planes after the world map is loaded in the UnityARWorldMap scene? That is, once relocalized I want to stop plane detection. How will I know whether it has relocalized?

    Code (CSharp):
    void Start()
    {
        UnityARSessionNativeInterface.ARFrameUpdatedEvent += CheckWorldMapStatus;

        // I have subscribed to an event
        UnityARSessionNativeInterface.ARFrameUpdatedEvent += CheckIfRelocalized;
    }

    void CheckWorldMapStatus(UnityARCamera cam)
    {
        text.text = cam.worldMappingStatus.ToString();
        tracking.text = cam.trackingState.ToString() + " " + cam.trackingReason.ToString();
    }

    void CheckIfRelocalized(UnityARCamera cam)
    {
        if (cam.worldMappingStatus == ARWorldMappingStatus.ARWorldMappingStatusMapped && cam.trackingState == ARTrackingState.ARTrackingStateNormal && cam.trackingReason == ARTrackingStateReason.ARTrackingStateReasonNone)
        {
            Debug.Log("Plane Detection Stopped");
            StopPlaneDetection();
        }
    }

    void OnDestroy()
    {
        UnityARSessionNativeInterface.ARFrameUpdatedEvent -= CheckWorldMapStatus;
        UnityARSessionNativeInterface.ARFrameUpdatedEvent -= CheckIfRelocalized;
    }

    public void StopPlaneDetection()
    {
        UnityARSessionNativeInterface session = UnityARSessionNativeInterface.GetARSessionNativeInterface();
        ARKitWorldTrackingSessionConfiguration config = new ARKitWorldTrackingSessionConfiguration();
        config.planeDetection = UnityARPlaneDetection.None;
        config.enableLightEstimation = true;
        //config.enableAutoFocus = true;
        session.RunWithConfig(config);
    }
    When I click load map, the GameObjects appear immediately in front of the screen before they relocalize to their proper places.
     
    Last edited: Apr 25, 2019
  46. eco_bach

    eco_bach

    Joined:
    Jul 8, 2013
    Posts:
    1,353
    Hi Jimmy. Earlier you mentioned "...change the Clear Material on the UnityARVideo component on the camera to YUVMaterialLinear (and leave ColorSpace as Linear)."

    Sorry, but even after a search I cannot find any UnityARVideo component in my ARFoundation project. Was this possibly in an earlier version?
    A search turns it up on GitHub, but that is from a year ago. Will it work with ARFoundation?

    https://github.com/Unity-Technologi...ssets/UnityARKitPlugin/Plugins/iOS/UnityARKit


    I am already in Linear color space. My placed 3D object is rendered fine. It's just the live video from the iPhone's camera that looks too light and washed out in comparison.
     
    Last edited: Apr 24, 2019
    Blarp likes this.
  47. Blarp

    Blarp

    Joined:
    May 13, 2014
    Posts:
    188

    I had the exact same problem. I fixed it this way:

    Open ARKitLWRPShader.shader and go all the way down to the last line. Change it to:

    //gamma
    //return mul(ycbcrToRGBTransform, ycbcr);
    //linear
    return pow(mul(ycbcrToRGBTransform, ycbcr), 2.2);

    Dunno if that's a good idea or not, but it fixed it for me without noticeable repercussions.

    My linear is as crisp as can be now.
     
    Last edited: Apr 25, 2019
  48. zyonneo

    zyonneo

    Joined:
    Apr 13, 2018
    Posts:
    218
    What if I am using an AR world map and, after relocalization, I want to stop plane detection and remove the already-detected planes? When I used this code, the AR models blink in front of the camera.
     
  49. Baraneedharan

    Baraneedharan

    Joined:
    Jan 10, 2019
    Posts:
    4
    Thank you so much :)
     
  50. eco_bach

    eco_bach

    Joined:
    Jul 8, 2013
    Posts:
    1,353

    Thanks! That's the ticket!
     
    Last edited: Apr 25, 2019
    Blarp likes this.