
Resolved ARKit support for iOS via Unity-ARKit-Plugin

Discussion in 'AR' started by jimmya, Jun 5, 2017.

Thread Status:
Not open for further replies.
  1. HulloImJay

    HulloImJay

    Joined:
    Mar 26, 2012
    Posts:
    89
    I have not! However, I did manage to add an offset. I simply had to modify the helpers to accommodate it. I guess it would be better for me to use them only as a reference and write my own scripts that do offsetting as needed...


    I do have another question: Will this SDK be included in future versions of Unity with an integrated API for ARKit and ARCore, etc.? In the spirit of "build once, deploy everywhere"? And if so, is there a roadmap for such a thing?
     
  2. ph_

    ph_

    Joined:
    Sep 5, 2013
    Posts:
    232
    In iOS 11.0 and on some dual-camera devices (namely the iPhone 7S), Apple exposed access to the camera-generated depth texture in its API (https://developer.apple.com/documentation/avfoundation/avdepthdata).

    Do you know when this will be accessible in Unity? (Even in beta, etc.)
    It would be interesting to use this depth buffer to help with AR occlusion.

    Thanks !
     
  3. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    update your plugin
     
  4. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    This is not part of ARKit and I'm not sure if they work together. Have you tried it yourself at all with ARKit running? There is no roadmap for this in Unity at the moment.
     
  5. Inboxninja

    Inboxninja

    Joined:
    Feb 19, 2014
    Posts:
    14
    It's looking pretty nice so far... and it's lots of fun as well. Two minor gripes:

    1) unityARAnchorManager in the UnityARGeneratePlane class is now private... that was a handy accessor for determining whether you had any playing fields detected yet. Not a show-stopper, but having it public was nice. I guess I'll just do something on the plane prefab itself.

    2) When updating the package from the Asset Store it nukes the project settings. I get why it wants to do this, but it's still a pain to have to reset all the things. Is there any way it could make only the changes it needs rather than resetting everything? Or did I miss an import option to import everything but the project settings?


    p.s. love your work!
     
  6. pixelsteam

    pixelsteam

    Joined:
    May 1, 2009
    Posts:
    924
    Hi Chris, I was able to solve it. It was basically a pushback that will not allow a beta to be put onto the App Store.
     
  7. DavidErosa

    DavidErosa

    Joined:
    Aug 30, 2012
    Posts:
    41
    Thanks! That seems to be the problem I faced some time ago in this same thread.
     
  8. wrutkowski

    wrutkowski

    Joined:
    Aug 9, 2017
    Posts:
    8
    Hi, a question on terrain occlusion. I applied the MobileOcclusion shader to a material, and the material to a cube, to cut out the bridge prefab and the sample cube I was testing with (as seen in the screenshot).
    I have a problem with terrain occlusion though. I tried DepthMask (http://wiki.unity3d.com/index.php?title=DepthMask) but it doesn't work with ARKit - the terrain and objects are not cut out.
    Any directions on how to solve it would be much appreciated. My idea is to simulate the "water" with real-world surfaces.
     

    Attached Files:

  9. Elliott-Mitchell

    Elliott-Mitchell

    Joined:
    Oct 8, 2015
    Posts:
    88
    I was under the assumption that the ScaledContentExample branch logic used camera trickery to render large assets at a smaller scale. It seems that the ScaledContentExample actually scales both the target object(s) and the camera parent to achieve the goal of bringing large assets into a scene at a smaller scale?
     
  10. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    1. AFAIK unityARAnchorManager has always been private. You could make your own copy of UnityARGeneratePlane and add this method to it:

    public List<ARPlaneAnchorGameObject> GetCurrentPlaneAnchors()
    {
        return unityARAnchorManager.GetCurrentPlaneAnchors();
    }

    2. When importing the package into your project, you can uncheck whichever files you don't want to update. You may have problems with some examples, as we may depend on some settings for them.
     
  11. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    I haven't worked too closely with it - @christophergoy is your man - but the way I understood it, it is supposed to solve the problem that you cannot put a scale on the root of your target object(s) (or physics and other stuff gets messed up), so he is doing some camera tricks on the second camera to achieve the effect.
     
    Elliott-Mitchell likes this.
  12. Elliott-Mitchell

    Elliott-Mitchell

    Joined:
    Oct 8, 2015
    Posts:
    88
    OK, I can't scale a lot of these assets because of physics and a host of other reasons. Looking forward to hearing from @christophergoy!

    Camera tricks == AwesomeSauce! I'll keep plugging away for a little while longer. I'll need to stop soon and make a trailer that's due tomorrow night. If I can't get this working by then, I'll do some trickery in the trailer to give the illusion I'm after in real gameplay.

    Thanks
    -Elliott
     
  13. wrutkowski

    wrutkowski

    Joined:
    Aug 9, 2017
    Posts:
    8
    I've solved the same problem in my game. I attached the camera to a parent object and I change the scale of that parent object (the camera rig) to adjust the "scale" of the world. All physics, particles and terrain work correctly with this solution, as their scale is untouched.
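    The camera-rig approach described above can be sketched as a small MonoBehaviour. This is a hypothetical illustration (the class and field names are my own, not part of the plugin):

```csharp
using UnityEngine;

// Hypothetical sketch of the camera-rig approach: scale the parent of the
// AR camera instead of the content, so physics, particles and terrain keep
// their real scale. Names are illustrative only.
public class CameraRigScaler : MonoBehaviour
{
    // Uniform local scale applied to the rig; values > 1 make the virtual
    // content appear smaller relative to the real world.
    public float worldScale = 10f;

    void Update()
    {
        // The ARKit-driven camera sits under this rig with its own
        // transform untouched, so tracking keeps working.
        transform.localScale = Vector3.one * worldScale;
    }
}
```

    Attach it to the camera's parent object; the content stays at scale 1 while the camera effectively moves through a "larger" world.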
     
    Elliott-Mitchell likes this.
  14. Elliott-Mitchell

    Elliott-Mitchell

    Joined:
    Oct 8, 2015
    Posts:
    88
    That’s interesting - I’m struggling to get this approach to work. Maybe I’m missing something?

    Are you scaling the camera parent via its local scale? I’m away from my computer for a couple of hours, but what I was doing was scaling the camera parent via the transform gizmo. Perhaps that’s the issue?

    Thanks for chiming in. I’m probably doing something stupid due to being exhausted, I hope :p
     
  15. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Did you not get the scaled content branch working for you?
     
  16. Elliott-Mitchell

    Elliott-Mitchell

    Joined:
    Oct 8, 2015
    Posts:
    88
    No, not in my specific use case. I don’t want to instantiate the scaled object based on a hit in the scene. I couldn’t get it to work at all in my project.

    To be fair, it’s an older, complex project that is not straightforward.

    Basically, I just want the camera to scale the objects when the game starts.
     
  17. Elliott-Mitchell

    Elliott-Mitchell

    Joined:
    Oct 8, 2015
    Posts:
    88
     
  18. Elliott-Mitchell

    Elliott-Mitchell

    Joined:
    Oct 8, 2015
    Posts:
    88
    I also locally merged changes from the default branch into the scaled camera branch. Maybe that’s problematic too?
     
  19. wrutkowski

    wrutkowski

    Joined:
    Aug 9, 2017
    Posts:
    8
    I used the local scale of the parent object.
     
  20. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Check the changes that were done in the Scaled Content branch - the main ones are the two cameras, the scale set on the second camera, and the two tags used (one for ARKitTrackingData and the other for ScaledContent) - and make the top level of your objects have the ScaledContent tag.
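    As a small illustration of that last step, tagging the top-level objects could look like this (a hypothetical helper; it assumes the "ScaledContent" tag already exists in the project's Tag Manager, as in the branch):

```csharp
using UnityEngine;

// Hypothetical helper: give every top-level content root the ScaledContent
// tag so the second (scaled) camera renders it. Assumes the tag is already
// defined in the Tag Manager.
public class TagScaledContent : MonoBehaviour
{
    public GameObject[] contentRoots;

    void Awake()
    {
        foreach (GameObject root in contentRoots)
        {
            root.tag = "ScaledContent"; // only the top level needs the tag
        }
    }
}
```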
     
    Elliott-Mitchell likes this.
  21. MaxXR

    MaxXR

    Joined:
    Jun 18, 2017
    Posts:
    67
    Hey, has anyone got this working?

    My goal is: an ARKit tap on the screen starts a Tilt Brush animation when I click on the mapped plane. So far I'm just trying to get ARKit Remote to work. It doesn't seem to be responding properly (see attached).

    Using 2017.2.05b + today's version of the ARKit plugin. I have built both ARKitScene and ARKitRemote (selected the Developer checkbox) and neither works in real time with the editor - it just gets stuck on the green screen in Unity. As for the iPad Pro app, it conjured the box on the table, but it doesn't update and conjure a sphere if I change it (and port over the scripts on the HitTest cube).

    Any ideas how to resolve this and get it working? Thanks
     

    Attached Files:

  22. ThatOdieGuy

    ThatOdieGuy

    Joined:
    Mar 13, 2016
    Posts:
    7
    I've had some issues with the ARKitRemote. It has difficulty connecting sometimes. I typically kill the app on the phone and stop Play mode in Unity. Then I relaunch the app on my phone, re-enter Play mode in Unity, go to the Console, select my phone, and hit the start button in the scene. Sometimes I need to repeat this process a few times.
     
    KwahuNashoba likes this.
  23. ThatOdieGuy

    ThatOdieGuy

    Joined:
    Mar 13, 2016
    Posts:
    7
    What's the status of the scaled branch? Is that something that's intended to be merged into the master branch?

    And thank you for this plugin! It has been exciting to play with. When a cube in my project randomly ended up in my kitchen, this story became way too real:
    https://qntm.org/responsibility
     
    christophergoy likes this.
  24. Gharry

    Gharry

    Joined:
    Jan 7, 2016
    Posts:
    3
    Hey, so I've been delaying updating everything for a while since I ran into some problems last time.

    If I were to update everything today (Xcode, Unity, and the ARKit plugin), should everything run smoothly? Or would I need to wait for another plugin update before I'm able to get my project to run?

    Edit: I went ahead and updated everything. So far everything seems to be working.
     
    Last edited: Sep 10, 2017
    HaimBendanan likes this.
  25. HaimBendanan

    HaimBendanan

    Joined:
    May 10, 2016
    Posts:
    28
    Hey, how is deployment to the App Store going to work? Can I post my app using the ARKit plugin and expect it to work with iOS 11 (non-beta)? Or should we wait for an updated version of the plugin?
     
  26. TobiasW

    TobiasW

    Joined:
    Jun 18, 2011
    Posts:
    91
    Hey, I have the same problem. Did you manage to fix it?
     
  27. showmevirtual

    showmevirtual

    Joined:
    Sep 10, 2017
    Posts:
    3
    Getting this error using iOS Project Builder, Unity 2017.1.1f1, and Xcode 9 beta 6. Any ideas what I'm missing? Thank you.


    Libraries/Plugins/iOS/UnityARKit/NativeInterface/ARSessionNative.mm(217,134): error: unknown type name 'ARWorldTrackingConfiguration'; did you mean 'ARWorldTrackingSessionConfiguration'?
    inline void GetARSessionConfigurationFromARKitWorldTrackingSessionConfiguration(ARKitWorldTrackingSessionConfiguration& unityConfig, ARWorldTrackingConfiguration* appleConfig)
    ^~~~~~~~~~~~~~~~~~~~~~~~~~~~
    ARWorldTrackingSessionConfiguration
    C:/Users/ShowMeVirtual/iOS Project Builder for Unity/SDK/System/Library/Frameworks/ARKit.framework/Headers/ARSessionConfiguration.h(78,12): note: 'ARWorldTrackingSessionConfiguration' declared here
    @interface ARWorldTrackingSessionConfiguration : ARSessionConfiguration
    ^
    Libraries/Plugins/iOS/UnityARKit/NativeInterface/ARSessionNative.mm(224,108): error: unknown type name 'ARConfiguration'
    inline void GetARSessionConfigurationFromARKitSessionConfiguration(ARKitSessionConfiguration& unityConfig, ARConfiguration* appleConfig)
    ^
    Libraries/Plugins/iOS/UnityARKit/NativeInterface/ARSessionNative.mm(261,48): error: no visible @interface for 'ARCamera' declares the selector 'projectionMatrixForOrientation:viewportSize:zNear:zFar:'
    matrix_float4x4 projectionMatrix = [camera projectionMatrixForOrientation:[[UIApplication sharedApplication] statusBarOrientation] viewportSize:nativeSize zNear:(CGFloat)unityCameraNearZ zFar:(CGFloat)unityCameraFarZ];
    ~~~~~~ ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    Libraries/Plugins/iOS/UnityARKit/NativeInterface/ARSessionNative.mm(423,35): error: no visible @interface for 'ARFrame' declares the selector 'displayTransformForOrientation:viewportSize:'
    s_CurAffineTransform = [frame displayTransformForOrientation:[[UIApplication sharedApplication] statusBarOrientation] viewportSize:nativeSize];
    ~~~~~ ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    Libraries/Plugins/iOS/UnityARKit/NativeInterface/ARSessionNative.mm(441,81): error: property 'ambientColorTemperature' not found on object of type 'ARLightEstimate *'
    unityARCamera.lightEstimation.ambientColorTemperature = frame.lightEstimate.ambientColorTemperature;
    ^
    Libraries/Plugins/iOS/UnityARKit/NativeInterface/ARSessionNative.mm(694,5): error: unknown type name 'ARWorldTrackingConfiguration'
    ARWorldTrackingConfiguration* config = [ARWorldTrackingConfiguration new];
    ^
    Libraries/Plugins/iOS/UnityARKit/NativeInterface/ARSessionNative.mm(694,45): error: use of undeclared identifier 'ARWorldTrackingConfiguration'
    ARWorldTrackingConfiguration* config = [ARWorldTrackingConfiguration new];
    ^
    Libraries/Plugins/iOS/UnityARKit/NativeInterface/ARSessionNative.mm(707,5): error: unknown type name 'ARWorldTrackingConfiguration'
    ARWorldTrackingConfiguration* config = [ARWorldTrackingConfiguration new];
    ^
    Libraries/Plugins/iOS/UnityARKit/NativeInterface/ARSessionNative.mm(707,45): error: use of undeclared identifier 'ARWorldTrackingConfiguration'
    ARWorldTrackingConfiguration* config = [ARWorldTrackingConfiguration new];
    ^
    Libraries/Plugins/iOS/UnityARKit/NativeInterface/ARSessionNative.mm(717,5): error: unknown type name 'ARConfiguration'
    ARConfiguration* config = [AROrientationTrackingConfiguration new];
    ^
    Libraries/Plugins/iOS/UnityARKit/NativeInterface/ARSessionNative.mm(717,32): error: use of undeclared identifier 'AROrientationTrackingConfiguration'
    ARConfiguration* config = [AROrientationTrackingConfiguration new];
    ^
    Libraries/Plugins/iOS/UnityARKit/NativeInterface/ARSessionNative.mm(730,5): error: unknown type name 'ARConfiguration'
    ARConfiguration* config = [AROrientationTrackingConfiguration new];
    ^
    Libraries/Plugins/iOS/UnityARKit/NativeInterface/ARSessionNative.mm(730,32): error: use of undeclared identifier 'AROrientationTrackingConfiguration'
    ARConfiguration* config = [AROrientationTrackingConfiguration new];
    ^
    Libraries/Plugins/iOS/UnityARKit/NativeInterface/ARSessionNative.mm(891,12): error: use of undeclared identifier 'ARWorldTrackingConfiguration'
    return ARWorldTrackingConfiguration.isSupported;
    ^
    Libraries/Plugins/iOS/UnityARKit/NativeInterface/ARSessionNative.mm(896,12): error: use of undeclared identifier 'AROrientationTrackingConfiguration'
    return AROrientationTrackingConfiguration.isSupported;
    ^
    15 errors generated.
    + [arm64] Linking arkitscene...
    ld: file not found: build\Default\arm64\Libraries/Plugins/iOS/UnityARKit/NativeInterface/ARSessionNative.mm.obj
    Press any key to continue . . .
     
  28. SkaZonic

    SkaZonic

    Joined:
    Jun 20, 2017
    Posts:
    23
    Hey guys, how's it going? I have a few questions...

    1) When you start an AR session, place objects at various positions, and then end that session, you have to start the next session again at that same location. What sort of workaround could make it so you don't have to start at that location? I was thinking about the app constantly running in the background to determine your position, with the AR session also running in the background, so that when you 'reopen' the app (let's say it's built into the phone to do this all the time ;) ) you would still have an accurate anchor for the AR scene. Would that work, and if so, how would the code for something like this look? If it doesn't work (or there's a better way), what other workarounds are there?

    2) Also, I want to use the phone's tracking and display that in augmented reality, like this... hqdefault-1.jpg

    It seems pretty simple, but can someone direct me to how something like this can be accomplished? (The code for it would be appreciated as well, but I don't think it will be that hard of an implementation ;) )

    P.S. The second question describes the sort of AR session I will be running for question 1.

    Thank you so much to anyone who can help me out with this! You will definitely not regret it ;)
     
  29. TobiasW

    TobiasW

    Joined:
    Jun 18, 2011
    Posts:
    91
    Fixed it!

    Search for the YUVShader.shader file and replace it with this:

    Code (CSharp):
    Shader "Unlit/ARCameraShader"
    {
        Properties
        {
            _textureY ("TextureY", 2D) = "white" {}
            _textureCbCr ("TextureCbCr", 2D) = "black" {}
            _texCoordScale ("Texture Coordinate Scale", float) = 1.0
            _isPortrait ("Device Orientation", Int) = 0
            _pow ("Adjusting power value", Float) = 1
        }
        SubShader
        {
            Tags { "RenderType"="Opaque" }
            LOD 100

            Pass
            {
                ZWrite Off
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag

                #include "UnityCG.cginc"

                uniform float _texCoordScale;
                uniform int _isPortrait;
                float4x4 _TextureRotation;
                float _pow;

                struct Vertex
                {
                    float4 position : POSITION;
                    float2 texcoord : TEXCOORD0;
                };

                struct TexCoordInOut
                {
                    float4 position : SV_POSITION;
                    float2 texcoord : TEXCOORD0;
                };

                TexCoordInOut vert (Vertex vertex)
                {
                    TexCoordInOut o;
                    o.position = UnityObjectToClipPos(vertex.position);
                    if (_isPortrait == 1)
                    {
                        o.texcoord = float2(vertex.texcoord.x, -(vertex.texcoord.y - 0.5f) * _texCoordScale + 0.5f);
                    }
                    else
                    {
                        o.texcoord = float2((vertex.texcoord.x - 0.5f) * _texCoordScale + 0.5f, -vertex.texcoord.y);
                    }
                    o.texcoord = mul(_TextureRotation, float4(o.texcoord, 0, 1)).xy;

                    return o;
                }

                // samplers
                sampler2D _textureY;
                sampler2D _textureCbCr;

                fixed4 frag (TexCoordInOut i) : SV_Target
                {
                    // sample the texture
                    float2 texcoord = i.texcoord;
                    float y = tex2D(_textureY, texcoord).r;
                    float4 ycbcr = float4(y, tex2D(_textureCbCr, texcoord).rg, 1.0);

                    const float4x4 ycbcrToRGBTransform = float4x4(
                            float4(1.0, +0.0000, +1.4020, -0.7010),
                            float4(1.0, -0.3441, -0.7141, +0.5291),
                            float4(1.0, +1.7720, +0.0000, -0.8860),
                            float4(0.0, +0.0000, +0.0000, +1.0000)
                        );

                    return pow(mul(ycbcrToRGBTransform, ycbcr), _pow);
                }
                ENDCG
            }
        }
    }
    Now search for the material you're using in UnityARVideo (on the camera). It's probably YUVMaterial. On that material, set "Adjusting power value" to 2.2 if you're using the Linear color space.

    I wonder if there's a way to automatically detect that in a shader?
     
    jimmya likes this.
  30. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Good one @TobiasW! I'll put this change in the repo. I think we can test PlayerSettings.colorSpace to figure out what value you should put there.
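    A sketch of that check (illustrative, not the committed change) could set the material value at startup, using QualitySettings.activeColorSpace as the runtime counterpart of PlayerSettings.colorSpace:

```csharp
using UnityEngine;

// Hypothetical sketch: pick the "_pow" value for the YUV clear material
// based on the project's color space, instead of setting it by hand.
// The class name and material reference are assumptions.
public class ARVideoColorSpaceFix : MonoBehaviour
{
    public Material clearMaterial; // e.g. the YUVMaterial used by UnityARVideo

    void Start()
    {
        // Linear color space needs the extra pow(..., 2.2) in the shader;
        // Gamma should leave the converted color untouched (pow of 1).
        float p = (QualitySettings.activeColorSpace == ColorSpace.Linear) ? 2.2f : 1.0f;
        clearMaterial.SetFloat("_pow", p);
    }
}
```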
     
  31. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Make sure you have the latest plugin code, and get rid of any extra code from older versions of the plugin.
     
  32. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Please read the instructions in ARKITREMOTE.txt and follow the steps if you want to get the Remote working. To get taps working in the Editor, look at EditorHitTest.cs.
     
  33. TobiasW

    TobiasW

    Joined:
    Jun 18, 2011
    Posts:
    91
    Glad you like it! And yeah, that's a good idea - you could automatically refresh the material value from UnityARVideo.cs.

    I wonder if it would be more performant to have two static shader files (one without pow, and one with a fixed pow of 2.2), though. Or shader variants. (I'm rather new to shaders.)
     
  34. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    @TobiasW you're right - I'll make two shader variants. Also, I realized this is not a common or dynamic operation - I'll just provide the new shader and material and let the developer choose which material will be used to clear.

    [update: changes are in BitBucket]
     
    Last edited: Sep 11, 2017
  35. wyattdesouza

    wyattdesouza

    Joined:
    Aug 14, 2015
    Posts:
    4
    I can't get the remote to work:
    I've built the Remote scene to my iPhone as a development build.
    I'm running the EditorTestScene with the ARKitRemoteConnection prefab in it.
    The iPhone is plugged in and I am able to connect to iPhonePlayer(iPhone):56000 or iPhone 7 Plus(60ea02).
    The ARKit scene is running on the plugged-in phone.

    If I connect to the iPhonePlayer I get:
    iPhonePlayer(iPhone):56000 connected
    UnityEngine.XR.iOS.ConnectToEditor:EditorConnected(Int32)
    UnityEngine.Events.InvokableCallList:Invoke(Object[])
    UnityEngine.Events.UnityEvent`1:Invoke(T0)

    (Filename: /Users/builduser/buildslave/unity/build/artifacts/generated/common/runtime/DebugBindings.gen.cpp Line: 51)

    Now when I click 'Start ARKit Remote Session' nothing seems to happen at all.

    Anyone have a solution for this?

    EDIT: I hadn't updated to the latest iOS 11 beta.
     
    Last edited: Sep 12, 2017
  36. kennyallau

    kennyallau

    Joined:
    Aug 5, 2014
    Posts:
    8
    Basically I just used YByteArrayForFrame(1 - currentFrameIndex). It's quite straightforward really, but not so obvious at the beginning. And yes, I would love to contribute to the plugin as well. How should I proceed with this?
     
    jomom likes this.
  37. skdev3

    skdev3

    Joined:
    Jul 15, 2015
    Posts:
    64
    Arvin6 likes this.
  38. TECNOLOGIA

    TECNOLOGIA

    Joined:
    Jan 23, 2015
    Posts:
    8
    Hi TobiasW, I solved it the same way, thanks for the answer. It would be good to detect the color space and automatically change the pow value. I think this can be done using two shaders and a script that swaps the material's shader to match the color space selected in the editor.
    Best regards,
    Jesus.
     
    TobiasW likes this.
  39. skdev3

    skdev3

    Joined:
    Jul 15, 2015
    Posts:
    64
  40. anil02maths

    anil02maths

    Joined:
    Mar 15, 2016
    Posts:
    1
    Hi, we used ARKit plugin version 1.8 - it's a good plugin for AR. But with it we sometimes get a flicker issue with the main camera: it jumps backward and forward randomly in position, and the generated planes and point-cloud particles start moving uncertainly in any direction. We updated to the ARKit 1.9 Unity package, but the issue got worse. We made the application for iOS. The flicker rate is higher on iPhones than on the iPad Pro.
    We used:
    1) Unity 2017.1.0f3 (64-bit)
    2) Xcode 9 beta 6
    3) iOS 11 beta for testing
    So please provide an appropriate solution for it. Looking forward to your reply.
    Thanks in advance
     
  41. John1515

    John1515

    Joined:
    Nov 29, 2012
    Posts:
    248
    Made some updates to the Model3 AR app.

    The video doesn't really show the differences from the previous version, but the app now has a GPS-based daylight system that drops some direct shadows (besides the baked projector shadow). The direct shadows are not shown in the movie clip, as my A9-core iPad couldn't do screen recording AND direct shadows together smoothly.. :oops:

    Another thing to consider is that the UI is super traditional. I should have made it at least partly in 3D, in context. Maybe some day...
     
  42. HulloImJay

    HulloImJay

    Joined:
    Mar 26, 2012
    Posts:
    89
    Can anyone shed some light on using a depth mask to punch a virtual hole in the world camera view?

    I've set it up so far with two cameras, the first rendering the virtual geometry and the second the real world. A depth-mask object (just a cube for now) is set to render on the "real world" camera. The real camera is also set to clear depth only.

    This arrangement works if I'm rendering geometry to the real-world camera (for example, a plane with a texture fills the view except for the depth-masked cube, which shows the virtual world beyond). It also works if I reverse it (making a portal from the virtual world to the real). However, with the plugin as provided, the real-world image is rendered in a different way I'm afraid I don't quite grasp, and the depth mask does not work. I just get the camera image/real world drawn 100% on top of the virtual.
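    For reference, the two-camera arrangement I describe might be configured like this (a sketch with illustrative names, not plugin code):

```csharp
using UnityEngine;

// Hypothetical sketch of the two-camera "portal" setup described above.
public class PortalCameraSetup : MonoBehaviour
{
    public Camera virtualCamera;   // renders the virtual geometry first
    public Camera realWorldCamera; // renders the "real world" pass on top

    void Start()
    {
        virtualCamera.depth = 0;
        realWorldCamera.depth = 1; // draws after the virtual camera

        // Clear only depth so the virtual image survives underneath;
        // the depth-mask cube then punches a hole in the real-world pass.
        realWorldCamera.clearFlags = CameraClearFlags.Depth;
    }
}
```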

    I'm using pixelplacement's depth mask shader which looks like this:
    Code (CSharp):
    // pixelplacement
    Shader "DepthMask"
    {
        SubShader
        {
            Tags {"Queue" = "Transparent" "RenderType"="Transparent" }
            Blend SrcAlpha OneMinusSrcAlpha
            Lighting Off
            ZWrite On
            ZTest Always
            Pass
            {
                Color(0,0,0,0)
            }
        }
    }
    Any clarification the devs could provide would be much appreciated. Especially if there is an easy solution available in the plugin :p
     
  43. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Make a fork/clone of latest on Bitbucket, make your changes to that fork, and submit a PR with your changes.
     
    kennyallau likes this.
  44. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Are you using the default UnityARKitScene? I have not seen this reported before. If you have changed the offset of the camera, then something like this might happen. Or are you talking about loss of tracking due to fast movement or low light?
     
    Last edited: Sep 11, 2017
  45. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    Did you make sure the terrain was on the correct layer?
     
  46. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Maybe this project will give you some ideas: https://github.com/craigkj312/unity-arkit-portal or this one: https://blog.jayway.com/2017/08/04/arkit-and-unity/
     
    Last edited: Sep 11, 2017
    HulloImJay likes this.
  47. HaimBendanan

    HaimBendanan

    Joined:
    May 10, 2016
    Posts:
    28
    please someone :)?
     
  48. HulloImJay

    HulloImJay

    Joined:
    Mar 26, 2012
    Posts:
    89
  49. Aaron-Meyers

    Aaron-Meyers

    Joined:
    Dec 8, 2009
    Posts:
    305
    You can't submit an iOS 11 app to the App Store until Apple releases the GM version of Xcode 9, which will probably happen tomorrow after the event.
     
  50. Aaron-Meyers

    Aaron-Meyers

    Joined:
    Dec 8, 2009
    Posts:
    305
    When a running ARKit app goes into the background, what happens with ARKit tracking? My experience is that when returning from the background, objects placed based on the previous tracking data can be in totally different positions, because the tracking has drifted while the app was in the background.

    Am I missing any key features of ARKit that would help with this issue? Sometimes when the app returns from the background the drift is minimal, while other times it's unusable.

    Also, as far as I can tell, an iOS system dialog ("your battery is at 20%") can report the app as having gone into the background, so it's hard to distinguish actual backgrounding, because the app gets OnApplicationDidBecomeActive() in either situation.

    -A
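    One way to probe the distinction (a sketch based on my reading of Unity's lifecycle callbacks; the behaviour varies by iOS version and should be verified on device) is to compare the pause and focus events:

```csharp
using UnityEngine;

// Hypothetical sketch: try to tell a real trip to the background apart from
// a transient system overlay. The assumption (verify on device!) is that
// OnApplicationPause fires for genuine backgrounding, while an overlay such
// as the low-battery alert may only toggle OnApplicationFocus.
public class BackgroundWatcher : MonoBehaviour
{
    public bool WasBackgrounded { get; private set; }

    void OnApplicationPause(bool paused)
    {
        if (paused) WasBackgrounded = true;
    }

    void OnApplicationFocus(bool focused)
    {
        if (focused && WasBackgrounded)
        {
            // Tracking has likely drifted while paused; this is the point
            // to reset the ARKit session or re-anchor placed objects.
            WasBackgrounded = false;
        }
    }
}
```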
     
    DerekLerner likes this.