Resolved ARKit support for iOS via Unity-ARKit-Plugin

Discussion in 'AR' started by jimmya, Jun 5, 2017.

Thread Status:
Not open for further replies.
  1. guoyaxin

    guoyaxin

    Joined:
    May 23, 2015
    Posts:
    2
  2. HaimBendanan

    HaimBendanan

    Joined:
    May 10, 2016
    Posts:
    28
    Hey,

    When using the face data, how can I know the absolute rotation of the head? What I get from UnityARMatrixOps.GetRotation(anchorData.transform) is the rotation of the head relative to the position of the phone: if I move the phone, it affects the rotation.
    My use case is that I have a virtual face, and I want to move it based on the user's face movements.
    I don't know if my question is clear enough... hope it is :)
     
  3. brendanluu

    brendanluu

    Joined:
    Jan 5, 2017
    Posts:
    7
    Hello!

    Is it possible to place an object on a plane so that it faces the camera when placed?

    Basically, I want to use an AR hit test to place an object on a detected plane, but instead of having it face forward according to its initialization, have it face the camera at the moment the touch for the hit test is registered on the screen.


     
  4. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    That's what FaceAnchor does - see the FaceAnchorScene example: instead of the AxesPrefab, put your virtual face there and reference it from ARFaceAnchorManager.
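
    (A minimal sketch of that setup, assuming the plugin's ARFaceAnchorUpdatedEvent callback and the usual UnityARMatrixOps conversions; VirtualFaceDriver and its virtualFace field are hypothetical names, not part of the plugin:)

    using UnityEngine;
    using UnityEngine.XR.iOS;

    public class VirtualFaceDriver : MonoBehaviour
    {
        public Transform virtualFace; // your face model, in place of the AxesPrefab

        void Start()
        {
            UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent += FaceUpdated;
        }

        void OnDestroy()
        {
            UnityARSessionNativeInterface.ARFaceAnchorUpdatedEvent -= FaceUpdated;
        }

        void FaceUpdated(ARFaceAnchor anchorData)
        {
            // anchorData.transform is in ARKit session space;
            // UnityARMatrixOps converts it to Unity world space.
            virtualFace.position = UnityARMatrixOps.GetPosition(anchorData.transform);
            virtualFace.rotation = UnityARMatrixOps.GetRotation(anchorData.transform);
        }
    }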
     
  5. tonOnWu

    tonOnWu

    Joined:
    Jun 23, 2017
    Posts:
    83
    Hi. I'm having this same problem right now, and I'm checking what you suggested. But I don't understand when you say "your assets using said light". What do you mean by "said light"? Thanks.
     
  6. brendanluu

    brendanluu

    Joined:
    Jan 5, 2017
    Posts:
    7
    Figured it out - just added the code below to UnityARHitTestExample.cs right after m_HitTransform.rotation is set (keeping only the yaw of m_HitTransform itself so the object stays upright):

    // Point the object at the camera, then zero out pitch and roll:
    m_HitTransform.LookAt(Camera.main.transform.position);
    m_HitTransform.eulerAngles = new Vector3(0, m_HitTransform.eulerAngles.y, 0);
     
  7. stephencoorey

    stephencoorey

    Joined:
    Mar 29, 2017
    Posts:
    10
    Does anyone know how to get access to the ARCamera struct (ARCamera.cs in Plugins/iOS/UnityARKit/NativeInterface)? It is used in ARFrame.cs, but I can't see either being used in UnityARSessionNativeInterface.cs at all.

    I want to use these 3 vectors in ARCamera.cs to calibrate my camera matrix for marker recognition in OpenCV/ArUco detection:
    public Vector3 intrinsics_row1;
    public Vector3 intrinsics_row2;
    public Vector3 intrinsics_row3;

    Alternatively, has anyone done camera calibration in ARKit or ARCore?

    Update: I got the intrinsics on ARKit using this code: https://bitbucket.org/Unity-Technologies/unity-arkit-plugin/pull-requests/24/hotfix/diff

    I don't think ARCore exposes them yet, so I will debug on ARKit for now.
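
    (For anyone else mapping those values to OpenCV: a sketch of the conversion, assuming the three vectors hold ARKit's 3x3 intrinsics matrix with fx/fy on the diagonal and the principal point in the last column/row - verify the row-vs-column convention against the pull request above before trusting it:)

    // Hypothetical helper: extract fx, fy, cx, cy for OpenCV's camera matrix
    // K = [fx 0 cx; 0 fy cy; 0 0 1] from ARCamera's three Vector3 fields.
    // NOTE: ARKit's native intrinsics are column-major, so these "rows" may
    // actually be columns - sanity-check against known device values.
    static Vector4 GetPinholeParams(UnityEngine.XR.iOS.ARCamera cam)
    {
        float fx = cam.intrinsics_row1.x;
        float fy = cam.intrinsics_row2.y;
        float cx = cam.intrinsics_row1.z; // or cam.intrinsics_row3.x if column-major
        float cy = cam.intrinsics_row2.z; // or cam.intrinsics_row3.y if column-major
        return new Vector4(fx, fy, cx, cy);
    }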
     
    Last edited: Mar 6, 2018
  8. demchukma

    demchukma

    Joined:
    Feb 3, 2017
    Posts:
    1
    I have a problem with detecting the same plane again.
    I turn on AR and detect a plane.
    Then I turn it off with this code:

    UnityARSessionNativeInterface.GetARSessionNativeInterface().Pause();

    Then I want to turn AR on again and see the same behavior as at first start. So I do:

    ARKitWorldTrackingSessionConfiguration config = new ARKitWorldTrackingSessionConfiguration(
        UnityARAlignment.UnityARAlignmentGravity,
        UnityARPlaneDetection.Horizontal,
        false, true);
    UnityARSessionNativeInterface.GetARSessionNativeInterface().RunWithConfigAndOptions(
        config,
        UnityARSessionRunOption.ARSessionRunOptionResetTracking | UnityARSessionRunOption.ARSessionRunOptionRemoveExistingAnchors);

    AR works and can detect new planes, but it doesn't detect a plane at the same place where the plane was at first start.
    What am I doing wrong?
     
  9. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,981
    @jimmya Nothing here or anywhere mentions how to get the actual centre of a plane anchor.

    When I use the planeAnchor.center vector to place my object, instead of appearing at the centre of the plane it appears at the corner (not even on the plane, just off it). It is the same each time, so it thinks this is the centre.

    So how does one use the planeAnchor data to get the actual centre of the plane in world space? It would be nice to be able to loop through tracked planes, use their actual world-space centre to position an object, and use their extent to determine the bounds so it can move around on the surface.

    I feel like working this out is the last thing stopping me from really using this properly. It would be great to be able to work out actual world-space bounds and positions so I can make things walk around on surfaces.

    Any help will be greatly appreciated!

    EDIT: or should I be using the plane anchor game object rather than the plane anchor to position the object? If so, are you getting the bounds of that simply by checking a collider, or is there some XR.iOS function other than the matrix ops needed to get the bounds of an AR-anchor-bound gameobject?
     
  10. mantekkerz

    mantekkerz

    Joined:
    Dec 24, 2013
    Posts:
    111
    @Daemonhahn Not sure if it's any help, but to move characters around on generated surfaces I've been using Unity's runtime NavMesh components (on GitHub); it works pretty well as long as the area is big enough.
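
    (For anyone wanting to try the same: a rough sketch using the NavMeshSurface component from Unity's NavMeshComponents repo on GitHub - the component and enum names come from that repo, not from the ARKit plugin; PlaneNavMeshBaker is a hypothetical name:)

    using UnityEngine;
    using UnityEngine.AI; // NavMeshSurface lives here in the NavMeshComponents repo

    public class PlaneNavMeshBaker : MonoBehaviour
    {
        // Call after a generated plane has its mesh/collider set up.
        public void BakeOn(GameObject planeGO)
        {
            var surface = planeGO.GetComponent<NavMeshSurface>();
            if (surface == null)
                surface = planeGO.AddComponent<NavMeshSurface>();

            surface.collectObjects = CollectObjects.Children; // only bake this plane
            surface.BuildNavMesh(); // runtime bake; agents can then walk the surface
        }
    }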
     
    MadeFromPolygons likes this.
  11. MadeFromPolygons

    MadeFromPolygons

    Joined:
    Oct 5, 2013
    Posts:
    3,981
    Thanks @MattMurphy. Unfortunately, I am fine with moving stuff in Unity - I'm actually a professional VR/AR developer in my day job. It's just the actual conversion between AR "space" and world space that seems to be going a bit wrong.

    Either way, I found out 4 different ways to do this (and not to do it) and posted my findings here for anyone that needs them:

    https://forum.unity.com/threads/pla...to-get-centre-in-world-space-of-plane.520721/
     
  12. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    What I meant is that you may need to take the ambient light estimate into account when shading your assets.
     
  13. rpasquini

    rpasquini

    Joined:
    Feb 1, 2018
    Posts:
    5
    Are the camera intrinsics still not exposed? ARKit has a way to get them- can it please be added to the plugin?
     
  14. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Read this first: https://developer.apple.com/documentation/arkit/arplaneanchor/2882056-center?language=objc . Then look at UnityARUtility.cs to see how the debug planes are created and placed, and notice the conversions from ARKit space to Unity space.
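
    (A minimal sketch of that conversion, following UnityARUtility.cs; arPlaneAnchor here is the ARPlaneAnchor from the plugin's anchor added/updated events:)

    // World-space centre of an ARPlaneAnchor, per the conversions in UnityARUtility.cs.
    Vector3 PlaneCenterWorld(ARPlaneAnchor arPlaneAnchor)
    {
        Vector3 anchorPos = UnityARMatrixOps.GetPosition(arPlaneAnchor.transform);
        Quaternion anchorRot = UnityARMatrixOps.GetRotation(arPlaneAnchor.transform);

        // center is an offset in the anchor's own space; its z flips sign
        // going from ARKit's right-handed space to Unity's left-handed one.
        Vector3 offset = new Vector3(arPlaneAnchor.center.x,
                                     arPlaneAnchor.center.y,
                                     -arPlaneAnchor.center.z);
        // arPlaneAnchor.extent.x / .z give the plane's size in meters for bounds checks.
        return anchorPos + anchorRot * offset;
    }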
     
  15. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
  16. norahx3x

    norahx3x

    Joined:
    Dec 2, 2017
    Posts:
    5
    Did you find an answer?
     
  17. rpasquini

    rpasquini

    Joined:
    Feb 1, 2018
    Posts:
    5
  18. erikwp

    erikwp

    Joined:
    Jan 16, 2018
    Posts:
    25
    Regarding the UnityARImageAnchor example - is there some reason reference images have to be square? How would I set up a non-square image when the ARReferenceImage size has only one dimension?
     
  19. Neadrim

    Neadrim

    Joined:
    Sep 26, 2017
    Posts:
    6
    There's a weird issue I have not been able to track down yet.

    Current Versions:
    - Unity 2017.3.1p1
    - UnityARKitPlugin 1.0.14
    - Any supported devices
    - Any OS from iOS 11.0 to 11.2.x so far
    * Note that it happened with previous versions of Unity and ARKit plugin as well.

    At a very low repro rate, the UnityARCamera gets stuck on ARTrackingStateLimited + ARTrackingStateReasonInitializing.
    When that happens, it breaks ARKit for every app on that device: they all become unable to start tracking, and the only way to fix it is to restart the device.

    I haven't noticed any specific pattern yet; it's not happening after anything unusual like a crash or errors.

    Has anyone else seen this issue?

    Thanks.
     
  20. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    It infers the other dimension (larger) using the aspect ratio of your png or jpg file.
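
    (Concretely: the ARReferenceImage asset takes a single physical size in meters; if your source image is, say, 600x300 pixels and you enter 0.1, the other dimension is inferred from the 2:1 aspect ratio, giving a 0.2 m x 0.1 m marker. That's my reading of "infers the other dimension (larger)" above - verify against your own markers.)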
     
  21. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    This can happen for a variety of reasons; you can detect it using OnTrackingChangeEvent. If it persists for a while (10s?), you should reset your session.
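
    (A rough sketch of that pattern, using the plugin's ARFrameUpdatedEvent - which passes the UnityARCamera with the trackingState/trackingReason fields mentioned above - plus the RunWithConfigAndOptions reset shown earlier in the thread; the 10-second threshold and the TrackingWatchdog name are my own:)

    using UnityEngine;
    using UnityEngine.XR.iOS;

    public class TrackingWatchdog : MonoBehaviour
    {
        public ARKitWorldTrackingSessionConfiguration config; // same config the session started with
        float limitedSince = -1f;

        void Start()
        {
            UnityARSessionNativeInterface.ARFrameUpdatedEvent += OnFrame;
        }

        void OnFrame(UnityARCamera cam)
        {
            bool stuck = cam.trackingState == ARTrackingState.ARTrackingStateLimited &&
                         cam.trackingReason == ARTrackingStateReason.ARTrackingStateReasonInitializing;
            if (!stuck)
            {
                limitedSince = -1f;
                return;
            }

            if (limitedSince < 0f)
            {
                limitedSince = Time.time; // just entered the limited state
            }
            else if (Time.time - limitedSince > 10f) // stuck for ~10s: reset the session
            {
                UnityARSessionNativeInterface.GetARSessionNativeInterface().RunWithConfigAndOptions(
                    config,
                    UnityARSessionRunOption.ARSessionRunOptionResetTracking |
                    UnityARSessionRunOption.ARSessionRunOptionRemoveExistingAnchors);
                limitedSince = -1f;
            }
        }
    }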
     
  22. Fl0oW

    Fl0oW

    Joined:
    Apr 24, 2017
    Posts:
    21
    I would like to get the ARPlaneAnchorGameObject of the plane I am currently looking at, but somehow the identifier from my hit result doesn't seem to match any of the identifiers of the planes I have already detected. Is there an easy way to get the ARPlaneAnchorGameObject for one of the new irregularly shaped planes through a hit test? My current solution is just to find the closest anchor to the hit point, but I can think of plenty of situations where this wouldn't yield the result I want. Thanks in advance for any ideas!
     
  23. rpasquini

    rpasquini

    Joined:
    Feb 1, 2018
    Posts:
    5
    Is there any way to get a native ARKit ARSession handle - is the handle created via unity_CreateNativeARSession something that can be passed into native iOS code and used? Asking for a friend. :)
     
  24. Elliott-Mitchell

    Elliott-Mitchell

    Joined:
    Oct 8, 2015
    Posts:
    88
    I'm in the process of implementing ARKit 1.5 and have a dilemma: how do we support backward compatibility with devices that don't support ARKit 1.5?

    Do we get the iOS version of the device and add a ton of conditions in the UnityARCameraManager?

    Wondering how to deal with this, given that I have a popular AR game on the App Store and I don't want to force my players onto iOS 11.3.

    Thanks!
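
    (One stopgap while waiting for an official answer: gate the 1.5-only features on the OS version via UnityEngine.iOS.Device.systemVersion. The parsing below is naive - treat it as a sketch, not the supported way:)

    // Hypothetical check: only enable ARKit 1.5 features (vertical planes,
    // image anchors, video formats, ...) on iOS 11.3 and up.
    bool SupportsARKit15()
    {
        string[] parts = UnityEngine.iOS.Device.systemVersion.Split('.');
        int major, minor = 0;
        if (!int.TryParse(parts[0], out major)) return false;
        if (parts.Length > 1) int.TryParse(parts[1], out minor);
        return major > 11 || (major == 11 && minor >= 3);
    }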
     
    Last edited: Mar 15, 2018
  25. ZhengzhongSun

    ZhengzhongSun

    Joined:
    Oct 12, 2016
    Posts:
    20
    I recently found a problem with the Unity ARKit SDK. It seems to occur in newer iOS versions, since iOS 11.2, but I didn't test every iOS version. The test results are as follows:
    1. If I build an ARKit app with Unity, it runs well.
    2. But if I connect my iPhone to a monitor over HDMI and then start the ARKit app, it crashes immediately.
    3. If I open the app in advance or put it in the background, and then connect to the monitor, it doesn't crash.
    4. An ARKit app built with the native iOS ARKit doesn't have this problem.
    How can I solve this?
    Any advice will be appreciated. Thank you very much!
     
  26. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    We are working on a solution to this - please bear with us.
     
    Elliott-Mitchell likes this.
  27. Elliott-Mitchell

    Elliott-Mitchell

    Joined:
    Oct 8, 2015
    Posts:
    88
    Thanks Jimmya!
     
  28. kushG

    kushG

    Joined:
    May 25, 2017
    Posts:
    22
    I'm seeing some problems with the texture I get from ARKit compared to Unity's default WebCamTexture: ARKit's texture appears a little blurry and less sharp. If you look at the two images below carefully you'll notice the difference. I'm using the UnityARVideo script to render the camera texture for ARKit, without any changes to the script.
     

    Attached Files:

    Last edited: Mar 19, 2018
  29. HStuart

    HStuart

    Joined:
    Jul 30, 2015
    Posts:
    7
    Hi guys,

    I am relatively new to this AR stuff - quite familiar with Unity, but I have been struggling with something. There are all these awesome tutorials on plane detection and placing objects on planes, etc. I was wondering, in terms of "anchors": is it possible to create an anchor at the position where the player touches the screen? For example, the player's phone is looking at something that is neither horizontal nor vertical (if it were, plane detection would be easiest), and when they tap on a real-world object, an anchor is produced at that position and a gameobject is created at that anchor.

    If this is possible, how do I go about creating the anchor and placing a gameobject at it? I have found the available resources not descriptive or detailed enough.

    Thanks, Harry
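
    (A minimal sketch of the usual approach, modeled on the plugin's UnityARHitTestExample.cs: run a feature-point hit test at the touch position, then place your object at the hit pose. TapToPlace and myObject are hypothetical names:)

    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.XR.iOS;

    public class TapToPlace : MonoBehaviour
    {
        public Transform myObject; // the gameobject to place

        void Update()
        {
            if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
                return;

            // Hit-test against feature points so it also works off-plane.
            Vector3 vp = Camera.main.ScreenToViewportPoint(Input.GetTouch(0).position);
            ARPoint point = new ARPoint { x = vp.x, y = vp.y };

            List<ARHitTestResult> results =
                UnityARSessionNativeInterface.GetARSessionNativeInterface().HitTest(
                    point, ARHitTestResultType.ARHitTestResultTypeFeaturePoint);

            if (results.Count > 0)
            {
                myObject.position = UnityARMatrixOps.GetPosition(results[0].worldTransform);
                myObject.rotation = UnityARMatrixOps.GetRotation(results[0].worldTransform);
            }
        }
    }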
     
  30. TheFullMetalAlex

    TheFullMetalAlex

    Joined:
    Jul 20, 2017
    Posts:
    1
    Does anyone know why I'd be getting this error when attempting to use the ARKit Remote?


    SerializationException: Unable to find assembly 'Assembly-CSharp, Version=0.0.0.0, Culture=, PublicKeyToken=null'.
    System.Runtime.Serialization.Formatters.Binary.BinaryAssemblyInfo.GetAssembly () (at /Users/builduser/buildslave/mono/build/mcs/class/referencesource/mscorlib/system/runtime/serialization/formatters/binary/binarycommonclasses.cs:408)
    System.Runtime.Serialization.Formatters.Binary.ObjectReader.GetType (System.Runtime.Serialization.Formatters.Binary.BinaryAssemblyInfo assemblyInfo, System.String name) (at /Users/builduser/buildslave/mono/build/mcs/class/referencesource/mscorlib/system/runtime/serialization/formatters/binary/binaryobjectreader.cs:1489)
    System.Runtime.Serialization.Formatters.Binary.ObjectMap..ctor (System.String objectName, System.String[] memberNames, System.Runtime.Serialization.Formatters.Binary.BinaryTypeEnum[] binaryTypeEnumA, System.Object[] typeInformationA, System.Int32[] memberAssemIds, System.Runtime.Serialization.Formatters.Binary.ObjectReader objectReader, System.Int32 objectId, System.Runtime.Serialization.Formatters.Binary.BinaryAssemblyInfo assemblyInfo, System.Runtime.Serialization.Formatters.Binary.SizedArray assemIdToAssemblyTable) (at /Users/builduser/buildslave/mono/build/mcs/class/referencesource/mscorlib/system/runtime/serialization/formatters/binary/binarycommonclasses.cs:2153)
    System.Runtime.Serialization.Formatters.Binary.ObjectMap.Create (System.String name, System.String[] memberNames, System.Runtime.Serialization.Formatters.Binary.BinaryTypeEnum[] binaryTypeEnumA, System.Object[] typeInformationA, System.Int32[] memberAssemIds, System.Runtime.Serialization.Formatters.Binary.ObjectReader objectReader, System.Int32 objectId, System.Runtime.Serialization.Formatters.Binary.BinaryAssemblyInfo assemblyInfo, System.Runtime.Serialization.Formatters.Binary.SizedArray assemIdToAssemblyTable) (at /Users/builduser/buildslave/mono/build/mcs/class/referencesource/mscorlib/system/runtime/serialization/formatters/binary/binarycommonclasses.cs:2204)
    System.Runtime.Serialization.Formatters.Binary.__BinaryParser.ReadObjectWithMapTyped (System.Runtime.Serialization.Formatters.Binary.BinaryObjectWithMapTyped record) (at /Users/builduser/buildslave/mono/build/mcs/class/referencesource/mscorlib/system/runtime/serialization/formatters/binary/binaryparser.cs:683)
    System.Runtime.Serialization.Formatters.Binary.__BinaryParser.ReadObjectWithMapTyped (System.Runtime.Serialization.Formatters.Binary.BinaryHeaderEnum binaryHeaderEnum) (at /Users/builduser/buildslave/mono/build/mcs/class/referencesource/mscorlib/system/runtime/serialization/formatters/binary/binaryparser.cs:656)
    System.Runtime.Serialization.Formatters.Binary.__BinaryParser.Run () (at /Users/builduser/buildslave/mono/build/mcs/class/referencesource/mscorlib/system/runtime/serialization/formatters/binary/binaryparser.cs:146)
    System.Runtime.Serialization.Formatters.Binary.ObjectReader.Deserialize (System.Runtime.Remoting.Messaging.HeaderHandler handler, System.Runtime.Serialization.Formatters.Binary.__BinaryParser serParser, System.Boolean fCheck, System.Boolean isCrossAppDomain, System.Runtime.Remoting.Messaging.IMethodCallMessage methodCallMessage) (at /Users/builduser/buildslave/mono/build/mcs/class/referencesource/mscorlib/system/runtime/serialization/formatters/binary/binaryobjectreader.cs:174)
    System.Runtime.Serialization.Formatters.Binary.BinaryFormatter.Deserialize (System.IO.Stream serializationStream, System.Runtime.Remoting.Messaging.HeaderHandler handler, System.Boolean fCheck, System.Boolean isCrossAppDomain, System.Runtime.Remoting.Messaging.IMethodCallMessage methodCallMessage) (at /Users/builduser/buildslave/mono/build/mcs/class/referencesource/mscorlib/system/runtime/serialization/formatters/binary/binaryformatter.cs:197)
    System.Runtime.Serialization.Formatters.Binary.BinaryFormatter.Deserialize (System.IO.Stream serializationStream, System.Runtime.Remoting.Messaging.HeaderHandler handler, System.Boolean fCheck, System.Runtime.Remoting.Messaging.IMethodCallMessage methodCallMessage) (at /Users/builduser/buildslave/mono/build/mcs/class/referencesource/mscorlib/system/runtime/serialization/formatters/binary/binaryformatter.cs:173)
    System.Runtime.Serialization.Formatters.Binary.BinaryFormatter.Deserialize (System.IO.Stream serializationStream, System.Runtime.Remoting.Messaging.HeaderHandler handler, System.Boolean fCheck) (at /Users/builduser/buildslave/mono/build/mcs/class/referencesource/mscorlib/system/runtime/serialization/formatters/binary/binaryformatter.cs:118)
    System.Runtime.Serialization.Formatters.Binary.BinaryFormatter.Deserialize (System.IO.Stream serializationStream, System.Runtime.Remoting.Messaging.HeaderHandler handler) (at /Users/builduser/buildslave/mono/build/mcs/class/referencesource/mscorlib/system/runtime/serialization/formatters/binary/binaryformatter.cs:149)
    System.Runtime.Serialization.Formatters.Binary.BinaryFormatter.Deserialize (System.IO.Stream serializationStream) (at /Users/builduser/buildslave/mono/build/mcs/class/referencesource/mscorlib/system/runtime/serialization/formatters/binary/binaryformatter.cs:111)
    Utils.ObjectSerializationExtension.Deserialize[T] (System.Byte[] byteArray) (at Assets/UnityARKitPlugin/ARKitRemote/ObjectSerializationExtension.cs:39)
    UnityEngine.XR.iOS.ARKitRemoteConnection.UpdateCameraFrame (UnityEngine.Networking.PlayerConnection.MessageEventArgs mea) (at Assets/UnityARKitPlugin/ARKitRemote/ARKitRemoteConnection.cs:124)
    UnityEngine.Events.InvokableCall`1[T1].Invoke (T1 args0) (at /Users/builduser/buildslave/unity/build/Runtime/Export/UnityEvent.cs:206)
    UnityEngine.Events.UnityEvent`1[T0].Invoke (T0 arg0) (at /Users/builduser/buildslave/unity/build/Runtime/Export/UnityEvent_1.cs:58)
    UnityEngine.Networking.PlayerConnection.PlayerEditorConnectionEvents.InvokeMessageIdSubscribers (System.Guid messageId, System.Byte[] data, System.Int32 playerId) (at /Users/builduser/buildslave/unity/build/Runtime/Export/Networking/PlayerConnection/PlayerEditorConnectionEvents.cs:66)
    UnityEditor.Networking.PlayerConnection.EditorConnection.MessageCallbackInternal (System.IntPtr data, System.UInt64 size, System.UInt64 guid, System.String messageId) (at /Users/builduser/buildslave/unity/build/Editor/Mono/Networking/PlayerConnection/EditorConnection.cs:164)

    I get this error every frame when using the remote. My iPhone X's camera still enables in the app, but the editor doesn't seem to recognize it at all (camera pos/rot is not updated and the game view doesn't show my phone's camera perspective, just a green screen).

    It's been persistent for months, stopping me from doing anything through the ARKit Remote app at all and slowing down dev a lot. I tried rebuilding the remote scene, making sure my Unity editor version matched the version I originally built on, switching between Xcode 9.2 stable and Xcode 9.3 beta, trying both old and completely new scenes for the remote connection, and making sure my build settings were set to debug/development build. I'm kind of out of ideas at this point.

    I'm using Unity 2017.3.0f3, but this problem has persisted since 2017.2, when I first began the project. My iPhone is on the 11.3 beta and is up to date. I also made sure I'm a trusted developer and that the remote app has camera permissions on my phone. Any thoughts? I've seen others have this issue and tried some fixes (my list of stuff above), but they don't seem to work consistently for everyone. I'm not sure if Unity's remote app is just very unstable or if I'm doing something wrong.

    UPDATE: I've tried multiple Macs on High Sierra and Sierra, but that hasn't made a difference either.
     

    Attached Files:

    Last edited: Mar 20, 2018
  31. inihility

    inihility

    Joined:
    Oct 17, 2014
    Posts:
    7
    conradHink likes this.
  32. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    This is the default resolution of the image used by ARKit, which might be lower than the camera resolution. In ARKit 1.5, they have increased the default resolution and allow the developer to change to other video formats if available.
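
    (If you want to opt into a higher-resolution format in 1.5, the plugin ships a UnityARVideoFormat helper - this is a sketch from memory, so check UnityARVideoFormat.cs for the exact field names before relying on it:)

    // Assumed API per the 1.5 plugin's UnityARVideoFormat.cs; verify locally.
    var formats = UnityARVideoFormat.SupportedVideoFormats();
    foreach (var f in formats)
        Debug.Log(f.imageResolutionWidth + "x" + f.imageResolutionHeight + " @ " + f.framesPerSecond + " fps");

    // Hand the chosen format to the session configuration before running it:
    config.videoFormat = formats[0].videoFormatPtr;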
     
  33. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    The "SerializationException" means that the format of the data being serialized on one end is not recognized on the other end (the ends being the editor and the remote device). Most likely you're using a version of the remote app built with a different version of Unity than the one you're running the Editor with.
     
  34. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
     
  35. jameskanestl

    jameskanestl

    Joined:
    Nov 3, 2017
    Posts:
    12
    I have a question re: Unity/ARKit as someone who is not familiar w/ iOS/Swift dev at all.

    I need a bluetooth peripheral to work with my ARKit experience, and I want to build the experience in Unity. The bluetooth device has both a Unity SDK & an iOS SDK, but, ya know, they're different... If I import the bluetooth peripheral's Unity SDK into the Unity project, along with the ARKit plugin, can I expect the peripheral to work as expected once exported to the Xcode project & then built to the device? Part of me feels like this is a fool's hope, but I'm going to be trying in a few days. Any tips would be helpful! Thanks, all.
     
    Last edited: Mar 21, 2018
  36. ftarnogol

    ftarnogol

    Joined:
    Feb 2, 2013
    Posts:
    5
    Has anyone found a way to update a pre-1.5 project to the new SDK? I am having a tough time. I've tried deleting the old plugin and re-arranging files in the plugin folder to match the previous project structure, but I keep getting console errors. I'd hate to have to redo the whole project just to implement 1.5, or to wait for the asset to be updated on the Asset Store, since Apple is pressing us to update before WWDC.
     
  37. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    It should update pretty cleanly - it seems like you might be doing something wrong. Are you writing your code within the plugin folders? You should probably not do that.
     
  38. ftarnogol

    ftarnogol

    Joined:
    Feb 2, 2013
    Posts:
    5
    Hey jimmya, do you mean I should just drop the UnityARKitPlugin folder into the Assets folder? I was dropping the folder within Plugins/iOS/.

    If I place the Bitbucket code into the Assets folder (having previously deleted the old version of the UnityARKitPlugin folder), I get the console errors below. Still better than the "asset not found" errors I was getting previously.
     

    Attached Files:

  39. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
  40. ftarnogol

    ftarnogol

    Joined:
    Feb 2, 2013
    Posts:
    5
    Thank you! I deleted the old one before importing. I'll go through the project to see if I have any duplicates creating conflicts :)

    Update: Solved - there was a stray ARAmbient.cs file in the Assets root creating the conflict (I don't know why that file was there). Thanks for the bearings.
     
    Last edited: Mar 21, 2018
  41. kushG

    kushG

    Joined:
    May 25, 2017
    Posts:
    22
    Thanks Jimmya for your response. I tried ARKit 1.5 with the maximum resolution possible, but it still appears a little blurry compared to the camera app on the phone. It seems like ARKit applies some kind of noise/processing to the camera texture. I know this is not something Unity has control over, but I'd appreciate any information on what processing is being applied, so that I can try adding a post-processing step to undo it - or a way to get the raw camera texture when I click a button. Basically, I need the texture for some image-processing experiments we are working on at our company, but our networks are getting confused by the ARKit textures because of the blurriness.

    Attached are images taken with ARKit 1.5 and the regular camera app.
     

    Attached Files:

  42. ccklokwerks

    ccklokwerks

    Joined:
    Oct 21, 2014
    Posts:
    62
    You want to make sure that the Unity SDK for the bluetooth device supports iOS. Beyond that, no clue here, sorry.
     
  43. ccklokwerks

    ccklokwerks

    Joined:
    Oct 21, 2014
    Posts:
    62
    I feel very, very stupid, but I can't get any of the demos to work. Where am I going wrong?

    I downloaded the plugin via the repository after the version on the Asset Store gave me some message spam. To give an example, I tried to build the AddRemoveAnchorExample (if there is supposed to be a central first scene, I am unclear on which one it should be). It builds to Xcode fine. I run it via the debugger on an iPhone 6s, and I just see a black screen with what appears to be the word "MAKE" in tiny letters in the upper right(!?). Debug log:

    Initializing Metal device caps: Apple A8 GPU

    Initialize engine version: 2017.3.1f1 (fc1d3344e6ea)

    WARNING: Shader Unsupported: 'Hidden/BlitToDepth' - Pass '' has no vertex shader

    WARNING: Shader Unsupported: 'Hidden/BlitToDepth' - Setting to default shader.

    WARNING: Shader Unsupported: 'Hidden/BlitToDepth_MSAA' - Pass '' has no vertex shader

    WARNING: Shader Unsupported: 'Hidden/BlitToDepth_MSAA' - Setting to default shader.

    Setting up 1 worker threads for Enlighten.

    Thread -> id: 170567000 -> priority: 1

    UnloadTime: 7.868625 ms

    2018-03-21 16:49:21.677021-0400 arkittest[465:75630] [BoringSSL] Function boringssl_session_errorlog: line 2871 [boringssl_session_read] SSL_ERROR_ZERO_RETURN(6): operation failed because the connection was cleanly shut down with a close_notify alert

    2018-03-21 16:49:21.677374-0400 arkittest[465:75630] [BoringSSL] Function boringssl_session_errorlog: line 2871 [boringssl_session_read] SSL_ERROR_ZERO_RETURN(6): operation failed because the connection was cleanly shut down with a close_notify alert

    2018-03-21 16:49:21.677721-0400 arkittest[465:75630] [BoringSSL] Function boringssl_session_errorlog: line 2871 [boringssl_session_read] SSL_ERROR_ZERO_RETURN(6): operation failed because the connection was cleanly shut down with a close_notify alert​

    And that's it!

    This was using Unity 2017.1.3, Xcode 9.2, and an iPhone 6s running 11.2.6.
     
  44. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    This should give you a clue: "Initializing Metal device caps: Apple A8 GPU"
    This means you do not have an iPhone 6s - that has an A9 GPU. It might be an iPhone 6.
     
  45. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
  46. ccklokwerks

    ccklokwerks

    Joined:
    Oct 21, 2014
    Posts:
    62
    Thank you, Jimmya. (It's an employer-provided phone...)
     
  47. g-augview

    g-augview

    Joined:
    Mar 28, 2017
    Posts:
    13
    Hi all,

    I am using Unity 2017.3, with Vuforia for devices without ARKit and Unity's ARKit plugin otherwise. My code works with Vuforia, but I can't make it work with Unity's plugin (the first image is the desired effect, where I can see the assets underground; the second is with ARKit, where I can't see the assets underground).
    correct_effect.png wrong_effect.png
    I have 2 cameras in my scene, one for the video only and one for the effect. I am trying to render an effect with 2 planes on top of the video feed. The video camera has a depth of -1 and the main camera 1 (this is the same for Vuforia and Unity's plugin).

    Here is how I render the effect, which I call in an Update():



    public void RenderVideoClipping(Texture videoTexture)
    {
        // Layers definition:
        //  8 - Over
        //  9 - Under
        // 10 - Over Under
        // 11 - Clip
        // 12 - Over Clip
        // 13 - Under Clip
        // 14 - Over Under Clip

        videoMaterial.mainTexture = videoTexture;

        /**************** Render the clip geometries to the clip texture *************/

        // Clear the current frame
        RenderTexture.active = clipTexture;
        Camera.main.targetTexture = clipTexture;

        Camera.main.backgroundColor = new Color(UndergroundOpacity, UndergroundOpacity, UndergroundOpacity, UndergroundOpacity);
        GL.Clear(false, true, Camera.main.backgroundColor);

        // Render the clip geometries
        Camera.main.cullingMask = (1 << 11) | (1 << 12) | (1 << 13) | (1 << 14);
        Camera.main.RenderWithShader(AVResources.Instance.TransparentWhiteShader, "");
        Camera.main.ResetReplacementShader();

        /****************************/

        /****** Render the clipped video ***********/
        // This is done by rendering the clip geometry transparently but with depth.
        // Then the video plane is rendered at 1000m in the background.
        // The areas where the transparent clip geometry was rendered will prevent
        // the video from rendering because of the z depth test.

        // Render the video plane
        RenderTexture.active = undergroundTexture;
        Camera.main.targetTexture = undergroundTexture;

        Camera.main.backgroundColor = Color.black;
        GL.Clear(true, true, Camera.main.backgroundColor);

        // Configure the underground clipping plane

        // Render the clip geometry transparently but with depth
        Camera.main.cullingMask = (1 << 11) | (1 << 12) | (1 << 13) | (1 << 14);
        Camera.main.RenderWithShader(AVResources.Instance.TransparentBlackDepthShader, "");
        Camera.main.ResetReplacementShader();

        // Render the video
        Camera.main.clearFlags = CameraClearFlags.Nothing;
        Camera.main.cullingMask = (1 << 31);
        Camera.main.Render();
        /**********************/

        /********** Render the underground geometries on top of the clipped video. **********/
        Camera.main.depthTextureMode = DepthTextureMode.None;

        // Clear the depth buffer
        Camera.main.clearFlags = CameraClearFlags.Depth;
        GL.Clear(true, false, Camera.main.backgroundColor);

        // Render the underground geometries
        Camera.main.cullingMask = (1 << 9) | (1 << 10) | (1 << 13) | (1 << 14);
        Camera.main.Render();

        /********************/

        /********** Render the video plane, then the underground objects plane, then the overground objects. **********/
        // The render order is handled in shaders.
        RenderTexture.active = null;
        Camera.main.targetTexture = null;

        // Clear the current frame
        Camera.main.clearFlags = CameraClearFlags.Depth;
        /********************/

        // Render the overground geometries and rendering planes (layers 30, 31)
        Camera.main.cullingMask = (1 << 8) | (1 << 10) | (1 << 12) | (1 << 14) | ((1 << 30) | (1 << 31));
    }



    This is how I get the Vuforia video texture (which I feed to the RenderVideoClipping method in Update()):


    public override Texture GetVideoTexture()
    {
        if (videoTexture == null || videoTexture.width != Screen.width || videoTexture.height != Screen.height)
        {
            GameObject.Destroy(videoTexture);
            videoTexture = new RenderTexture(Screen.width, Screen.height, 32);
            videoTexture.useMipMap = false;
            videoTexture.filterMode = FilterMode.Point;
        }

        if (VuforiaRenderer.Instance.IsVideoBackgroundInfoAvailable())
        {
            // Render the video texture into the video plane
            RenderTexture.active = videoTexture;
            Camera vuforiaCamera = GetCameraParent().GetComponentInChildren<Camera>();
            vuforiaCamera.targetTexture = videoTexture;

            vuforiaCamera.backgroundColor = Color.black;
            GL.Clear(true, true, vuforiaCamera.backgroundColor);

            vuforiaCamera.clearFlags = CameraClearFlags.Nothing;
            vuforiaCamera.Render();
            vuforiaCamera.targetTexture = null;
        }

        return videoTexture;
    }


    And here is how I get the video texture with the ARKit plugin:
    I simply modified the UnityARVideo behaviour slightly by adding a CommandBuffer to blit into a RenderTexture, and I pass that render texture to RenderVideoClipping() above.


    void InitializeCommandBuffer()
    {
        m_VideoCommandBuffer1 = new CommandBuffer();
        m_VideoCommandBuffer1.Blit(null, VideoTexture, m_ClearMaterial);
        GetComponent<Camera>().AddCommandBuffer(CameraEvent.BeforeForwardOpaque, m_VideoCommandBuffer1);

        m_VideoCommandBuffer2 = new CommandBuffer();
        m_VideoCommandBuffer2.Blit(null, BuiltinRenderTextureType.CameraTarget, m_ClearMaterial);
        GetComponent<Camera>().AddCommandBuffer(CameraEvent.BeforeForwardOpaque, m_VideoCommandBuffer2);

        bCommandBufferInitialized = true;
    }

    I can't understand why I can't see the effect planes in the case of ARKit; they are visible in the Scene tab, but my camera does not render them (I checked the culling mask; it is correct).
    Thanks!
     
  48. ina

    ina

    Joined:
    Nov 15, 2010
    Posts:
    1,085
    Just checking back again... FaceRemoved is still not triggering in the ARKitFace* demos - in the latest March 23 release on Bitbucket:

    UnityARSessionNativeInterface.ARFaceAnchorRemovedEvent += FaceRemoved;

    Any update / insight here?
     
  49. Play_Edu

    Play_Edu

    Joined:
    Jun 10, 2012
    Posts:
    722
    Hi, is there any way to detect the user's finger covering the device's physical camera in ARKit?
     
  50. Kevin__h

    Kevin__h

    Joined:
    Nov 16, 2015
    Posts:
    38
    Hi, I ran into a "small" problem. It's somewhat fixable by making a lot of iterations until I get it correct, but that's not how it's supposed to work. When I use the ImageAnchor from the 1.5 version, my prefab spawns at a weird angle. Sometimes it spawns right, but that takes a lot of updates; also, 180 degrees in a Unity project seems to be around 30-45 degrees in real life once the app is built? It's really weird. And both Unity and ARKit use 1 unit for 1 meter, but I'm trying to map a building of 20x20 m over the real-life version, and setting the scale to 20 is still too small, even though my real-life tracker is bigger than the physical size given in the build. Does anyone else have experience with this?
     
Thread Status:
Not open for further replies.