
Resolved ARKit support for iOS via Unity-ARKit-Plugin

Discussion in 'AR' started by jimmya, Jun 5, 2017.

Thread Status:
Not open for further replies.
  1. idspe

    idspe

    Joined:
    Apr 12, 2016
    Posts:
    3
    Hi! How can I combine debugPlanePrefab + shadowPlanePrefab + occlusionPlanePrefab in one GO ?
     
  2. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Please read previous items in the forum that might have a solution (e.g. the camera usage description, etc). Then check the Xcode console log to see if there was a problem during the run. If you still can't solve it, please post your console log here and I can take a look.
     
  3. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Please check previous forum posts about Post Processing.
     
  4. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Please read TUTORIAL.txt on how the different portions work together. AddAnchor here is a delegate that is called from ARKit whenever a new plane anchor is found. Adding user anchors is not currently supported via plugin, but is in the works. You should really think hard about why you need user anchors in the ARKit scenario.
     
  5. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Check this post.
     
  6. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    This is just to make life easier on the C# side. Certainly you could optimize the identifier to be a GUID or some smaller data, but it will not help you much: a plane is found, updated, or removed maybe once every few thousand frames.

    [edit: I'll look into this optimization anyway]
     
    Last edited: Jul 13, 2017
    Alex_curiscope likes this.
  7. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    You could have the planePrefab mesh contain multiple materials. You may have to adjust the render queue of the shadow material so it renders last.

    You could optimize that by combining the shaders for the regular and shadow rendering as that can be done in one pass.
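    The render-queue tweak mentioned above can be done from a small script. This is a minimal sketch, not plugin code; the `shadowMaterial` field is an assumption you would assign yourself in the Inspector:

```csharp
using UnityEngine;

// Hypothetical helper: push a shadow-receiving material to render after
// opaque geometry, so its multiply blend sees the camera feed already
// in the framebuffer.
public class ShadowQueueAdjuster : MonoBehaviour
{
    public Material shadowMaterial; // assign the shadow plane's material in the Inspector

    void Start()
    {
        // Built-in queues: Geometry = 2000, Transparent = 3000.
        // Rendering at 3000 places the shadow pass after regular geometry.
        shadowMaterial.renderQueue = 3000;
    }
}
```

    The same value can also be set on the material asset directly via the Inspector's Render Queue field.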
     
  8. idorurez

    idorurez

    Joined:
    Dec 28, 2015
    Posts:
    13
    Hi Jimmy,

    The shader breaks as soon as I add certain game objects, like a volumetric capture. It doesn't seem to obey the clear flags supplied by the camera. On most occasions the shadow appears, but it repeatedly draws both the shadow and the camera feed on the shadow prefab plane, like stutter drawing. At other times it shades properly, but the camera has trouble keeping up with camera movement and massive stuttering occurs until I touch the screen again.

    Hard to explain, but I hope others can chime in.
     
  9. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Do shadows work for your "volumetric capture" in a regular unity scene? The shadow shader is not doing anything different from that - it just renders the result on an "invisible" mesh, multiplying by the contents of the framebuffer (which is the background video in this case).
     
  10. idorurez

    idorurez

    Joined:
    Dec 28, 2015
    Posts:
    13
    Shadows do work in a regular setting for my volumetric capture. The volumetric capture is using a standard material shader with only the albedo having a map. It's basically an animated mesh. Things work fine as long as I only include either this volumetric capture or the shadow plane, but not both. If both are included, the shadow works as long as the entire scene is encapsulated in an object (sphere, or a room), but everything outside has the stutter-render (ie not the YUVmaterial, just black).

    I've played with the render queues, and size of the plane to be sure, but no luck.

    I'll keep digging, and thank you for your quick response!
     
  11. idorurez

    idorurez

    Joined:
    Dec 28, 2015
    Posts:
    13
    Fixed the shadow problem!

    Realized that somehow the framebuffer wasn't properly getting filled with the YUV shader with all of my GOs in the scene, so I made sure to render it once more in a second camera:

    1. Create a Layer, name it "No Objects"
    2. Duplicate the Main Camera that has all of the ARKit components on it.
    3. Make that duplicate camera a child of the Main Camera
    4. Set the duplicate camera's depth to -2.
    5. Assign the Culling Mask the "No Objects" Layer.

    Not sure if this is required, but:
    6. Make sure the render queues for all of your objects are set appropriately.

    Seems to work for me now.
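    The manual steps above can also be sketched in code. This is a rough, untested outline under the same assumptions as the steps (a "No Objects" layer created in the Editor beforehand; the duplicated camera keeps the ARKit video components so it fills the framebuffer first):

```csharp
using UnityEngine;

// Sketch of the fix described above: a second camera that renders before
// the main one, culling everything, so the framebuffer is filled with the
// YUV camera feed before the scene draws. Names here are illustrative.
public class VideoOnlyCameraSetup : MonoBehaviour
{
    void Start()
    {
        Camera main = Camera.main;

        // Steps 2-3: duplicate the main camera (with its ARKit components)
        // and parent it under the original.
        GameObject dup = Instantiate(main.gameObject, main.transform);
        dup.name = "VideoOnlyCamera";

        // Avoid two active AudioListeners in the scene.
        AudioListener listener = dup.GetComponent<AudioListener>();
        if (listener != null) Destroy(listener);

        Camera cam = dup.GetComponent<Camera>();
        cam.depth = -2; // step 4: render before the main camera

        // Steps 1 and 5: cull everything except the empty "No Objects"
        // layer, so this camera draws only the background video.
        cam.cullingMask = LayerMask.GetMask("No Objects");
    }
}
```

    Note that `LayerMask.GetMask` returns 0 if the "No Objects" layer was never created in the Editor, which would make the duplicate render nothing at all, so step 1 still has to be done by hand.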
     
  12. Juan_Viewport

    Juan_Viewport

    Joined:
    Dec 7, 2016
    Posts:
    5
    I've found the solution, I hadn't updated to the iOS 11 Beta.
     
  13. willartii

    willartii

    Joined:
    Jul 9, 2017
    Posts:
    1
    Jimmy,

    I am a total beginner, and struggling with a basic concept. There is an asset for finger gestures in the Unity Store. see here... https://www.assetstore.unity3d.com/en/#!/content/41076

    My question is: can you write Swift for it? If not, will ARKit recognize the "fingers" asset instead of the "tap gesture" in ARKit? I would love clarity on this topic. Thanks for reading.
     
  14. xpkoalz

    xpkoalz

    Joined:
    May 31, 2014
    Posts:
    11
    Hi Chris,
    Thanks for replying. I used the Asset Store version and updated Unity with the recommended patch, Unity 5.6.1p1.
     
  15. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    You should start with some unity beginner tutorials to get a feel for how to create with unity, maybe something like the ones here: https://unity3d.com/learn/tutorials/topics/interface-essentials
     
  16. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    As noted in earlier posts, we had some issues with Xcode 9 Beta 3, so if you get the latest Asset Store version (v1.0.1), it should be fixed. Let us know if you still encounter problems with the latest stuff.
     
  17. Nitrousek

    Nitrousek

    Joined:
    Jan 31, 2016
    Posts:
    39
    I have read all the previous pages and did not find a single post about this. Also, this issue is not related to iOS itself, as the Post Processing Stack and others do not cause issues in normal iOS builds.

    So I renew my question: is it possible to get real-time post-processing like ambient occlusion, color grading, etc. without huge FPS drops (5 fps with just ambient occlusion)?
     
  18. wdrescher

    wdrescher

    Joined:
    Jun 29, 2017
    Posts:
    2
    @jimmya Thanks for all your previous help with shadows. I was able to take the test scene and apply it to anything in the parent just fine...

    I am now working on a mini RC car in a scene. I got the controls working and the car drives just fine. The problem I am running into is that when I drag the collisionPlanePrefab into my scene and stretch it out large, the car still falls off the plane. Once the camera detects a plane, the car can only drive on the plane-detection box and falls off the sides. Is there a way to detect the plane and extend its edges all the way out so the car can be driven around the whole floor and not just in the detection box?
     
  19. Hazneliel

    Hazneliel

    Joined:
    Nov 14, 2013
    Posts:
    308
    I can confirm: the only previous posts about post-processing are from people asking why they are getting frame drops when using it. There is still no solution for how to correctly use camera post-processing.

    Has anybody figured out this problem?
     
  20. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    When ARKit detects the first plane, you would create a plane gameObject with the same transform as the anchor point found, but make it large enough to cover your floor. You can do this by hooking into the UnityARSessionNativeInterface.ARAnchorAddedEvent like it does in UnityAnchorManager.cs
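    The suggestion above might look roughly like this. A minimal sketch, assuming the plugin's static `ARAnchorAddedEvent` and `UnityARMatrixOps` helpers as used in UnityAnchorManager.cs; the `floorPrefab` and `floorSize` fields are illustrative, not part of the plugin:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

// Sketch: on the first detected plane anchor, spawn one large floor plane
// at the anchor's pose so the whole room is drivable, not just the
// detection box. Field names are assumptions for this example.
public class BigFloorOnFirstPlane : MonoBehaviour
{
    public GameObject floorPrefab;  // your own large-plane prefab
    public float floorSize = 20f;   // meters; pick something that covers the floor
    private bool placed;

    void Start()
    {
        UnityARSessionNativeInterface.ARAnchorAddedEvent += OnAnchorAdded;
    }

    void OnAnchorAdded(ARPlaneAnchor anchor)
    {
        if (placed) return;
        placed = true;

        // Use the anchor's transform for position/orientation, then scale
        // the plane up well beyond the detected extent.
        Vector3 pos = UnityARMatrixOps.GetPosition(anchor.transform);
        Quaternion rot = UnityARMatrixOps.GetRotation(anchor.transform);
        GameObject floor = Instantiate(floorPrefab, pos, rot);
        floor.transform.localScale = new Vector3(floorSize, 1f, floorSize);
    }

    void OnDestroy()
    {
        UnityARSessionNativeInterface.ARAnchorAddedEvent -= OnAnchorAdded;
    }
}
```

    Unsubscribing in OnDestroy avoids the delegate firing on a destroyed object if the scene is unloaded.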
     
  21. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    From what I can gather, Post Processing Stack has not been tested or optimized for mobile, and is really meant for cinematic effects.

    Post Processing Stack uses multiple full screen quads with expensive shaders on them to implement their functionality. ARKit uses a full-screen blit to render the video for the background. All these take a toll on the fillrate of the GPU, which on mobile is not that great. If you are able to run a GPU profiler and figure out a subset of the stack that you can use with mobile, that might be your best bet.
     
  22. rms80

    rms80

    Joined:
    Aug 2, 2015
    Posts:
    8
    Hi, I am trying to use ios 11 screen recorder to record some of the demo scenes included in the plugin, but the result is just a black video for the portion where the unity app is running. It works with other non-ARKit unity apps, so it is something about these specific demos. Any suggestions about why this might be happening and/or how to fix?

    I am using xcode 9 beta 3 and the most recent bitbucket source.
     
  23. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    There are lots of demos using the plugin that have been recorded via the screen recorder (see https://twitter.com/jimmy_jam_jam) - seems like you may have something different with your settings? Is anyone else seeing this?
     
  24. poolplayer32285

    poolplayer32285

    Joined:
    Jun 7, 2017
    Posts:
    2
    Jimmya,

    I'm running Beta 3 with the latest from unity asset store.

    I'm getting failed builds when I get to Xcode.

    I'm running just the stock scenes and still getting this error. Please see the screenshots.

    Can you help?
     

    Attached Files:

  25. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    It appears that you're running Xcode 9 Beta 2 - you will need to upgrade to Beta 3.
     
  26. rms80

    rms80

    Joined:
    Aug 2, 2015
    Posts:
    8
    I normally use my phone in "Zoomed" mode. When I turn this off the screen recorder works. So, I guess that's the problem, not sure if it is on Apple side or Unity side. Screen recorder works w/ other non-ARKit Unity apps in Zoomed mode, but I don't have any non-Unity ARKit apps to test with.
     
  27. JCG2

    JCG2

    Joined:
    Jul 11, 2017
    Posts:
    1
    An ARKit build problem:

    I'm building the minimal ARKitScene in Unity 2017.1.0b8 with the default player settings (but using the Simulator SDK) in Xcode 9.0b3 trying both the iPhone 7 (11.0) and iPhone 7 (10.3.1) simulators but get 3 errors:

    1) /Users/jackgray/ArTest/ArBuild/Classes/Unity/UnityMetalSupport.h:20:20: Typedef redefinition with different types ('NSUInteger' (aka 'unsigned long') vs 'enum MTLPixelFormat')
    2) /Users/jackgray/ArTest/ArBuild/Classes/Unity/UnityMetalSupport.h:23:5: Redefinition of enumerator 'MTLPixelFormatBGRA8Unorm'
    3) /Users/jackgray/ArTest/ArBuild/Classes/Unity/UnityMetalSupport.h:24:5: Redefinition of enumerator 'MTLPixelFormatBGRA8Unorm_sRGB'

    It looks like the errors are because the directive UNITY_CAN_USE_METAL is not defined, but when I add the directive to the AOT Compiler Options I still get the same errors.

    If I build for the Device SDK, Xcode succeeds with or without UNITY_CAN_USE_METAL, and the resulting app loads onto my iPhone 7 (10.3.2). Not surprisingly, on iOS 10.3.2 the app just shows a gray screen and generates this trace when I tap it:

    2017-07-14 21:37:21.434915-0400 arkitscene[3656:1080228] CGAffineTransformInvert: singular matrix.
    Jul 14 21:37:21 arkitscene[3656] <Error>: CGAffineTransformInvert: singular matrix.
    HitTest results: 0
    UnityEngine.XR.iOS.UnityARSessionNativeInterface:HitTest(ARPoint, ARHitTestResultType)
    UnityEngine.XR.iOS.UnityARHitTestExample:HitTestWithResultType(ARPoint, ARHitTestResultType)
    UnityEngine.XR.iOS.UnityARHitTestExample:Update()
    (Filename: /Users/builduser/buildslave/unity/build/artifacts/generated/common/runtime/DebugBindings.gen.cpp Line: 51)




     
  28. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Is there a question in there? Please look at the requirements for building and running ARKit. You have not satisfied the requirements, and therefore it will not work.
     
  29. Gamedevsss

    Gamedevsss

    Joined:
    Jul 15, 2017
    Posts:
    3
    Hi Jimmya,
    I just tried to build the sample project, but I just get a black screen after the Unity logo - please see the screen grab from Xcode. Please give me step-by-step build settings and camera settings. Thanks in advance.
     
  31. Grislymanor

    Grislymanor

    Joined:
    Aug 15, 2014
    Posts:
    23
    I have a problem with screen recording in which the recorded video is stretched and turned 90 degrees. I select "Landscape Left" in Unity, then in Xcode it shows "Landscape Right" (which I correct to left before compiling)... Has anybody else experienced this with Unity ARKit projects?
     
  32. Gamedevsss

    Gamedevsss

    Joined:
    Jul 15, 2017
    Posts:
    3
    Same here with Xcode 9 beta 3 - the camera is not working on my Pro (2nd generation).
     
  33. lingoded

    lingoded

    Joined:
    Aug 22, 2015
    Posts:
    8
    I made some portals



     
    Last edited: Jul 15, 2017
    efge, JoeStrout, tannerhearne and 2 others like this.
  34. rms80

    rms80

    Joined:
    Aug 2, 2015
    Posts:
    8
    I don't think the screen recorder works at all in landscape. I have similar problem with a non-ARKit Unity app that is landscape-only...
     
  35. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Good stuff!
     
    lingoded likes this.
  36. tannerhearne

    tannerhearne

    Joined:
    Feb 13, 2015
    Posts:
    9
    Hey @jimmya — I am seeing a failed Xcode build with the following error in ARSessionNative.mm line 146 when I build in Xcode Version 9.0 beta 2 (9M137d):



    Commenting out lines 146-147 allows Xcode to successfully build. I think ARTrackingStateReasonInitializing may have been removed from Apple's ARKit Framework. Looking at the definitions within iOS 11.0 > Frameworks > ARKit > ARCamera.h starting on line 37 the following are listed:
    • ARTrackingStateReasonNone
    • ARTrackingStateReasonExcessiveMotion
    • ARTrackingStateReasonInsufficientFeatures


    There is no reference in ARCamera.h to ARTrackingStateReasonInitializing.

    Thanks,
    Tanner
     
  37. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Please read the previous posts - you will need to upgrade to Xcode 9 Beta 3.
     
  38. jessevan

    jessevan

    Joined:
    Jan 3, 2017
    Posts:
    35
    Has anyone else noticed issues with the `gravityAndHeading` option? When I turn it on, I get the behavior demonstrated in the video below. The planes appear to flicker into and out of existence, even though there does seem to be some consistency between the planes when they are visible. Can anyone replicate this?

     
    Last edited: Jul 17, 2017
  39. Blarp

    Blarp

    Joined:
    May 13, 2014
    Posts:
    270
    Last edited: Jul 16, 2017
  40. jessevan

    jessevan

    Joined:
    Jan 3, 2017
    Posts:
    35
    That video appears to show that there is no camera tracking, which would be an issue even if you didn't have a portal effect. It's not even doing 3DOF tracking. Are you able to get something simpler working in your scene - a cube stuck to the ground, for example?
     
  41. Scorpio1990

    Scorpio1990

    Joined:
    Aug 31, 2013
    Posts:
    4
    Can the ARKit plugin track shadows?
     
  42. jessevan

    jessevan

    Joined:
    Jan 3, 2017
    Posts:
    35
    A follow-up on my earlier post. I've made a new video with the example cube in the shot. It's interesting to note that the cube appears steady while the plane anchors appear to spin. I've also noticed that the plane anchors stop spinning while the phone is moving; when the phone comes to rest, they begin to spin again. I can't quite put my finger on it, but it sure feels like it has something to do with ARKit trying to adjust for the heading.

    Here is the new video:

     
  43. tannerhearne

    tannerhearne

    Joined:
    Feb 13, 2015
    Posts:
    9
    Thanks @jimmya — that fixed it!
     
  44. oliver-jones

    oliver-jones

    Joined:
    Nov 8, 2010
    Posts:
    25
    Is anyone having issues with the camera's far clipping plane? I believe it's this matrix call on Update:

    projectionMatrix = m_session.GetCameraProjection ();

    It makes the camera behave as if the far clipping plane is rather low - it doesn't render anything in the midground or background.

    Any suggestions on how to fix?
     
  45. lusas

    lusas

    Joined:
    Mar 18, 2015
    Posts:
    6
  46. Inderdeep

    Inderdeep

    Joined:
    Sep 9, 2014
    Posts:
    2
    Hi @jimmya
    I have a terrain set up in my scene, and since terrain can't be rotated, the only option is to rotate the camera and position things relative to it. Is there any way to set a custom camera rotation and position in the scene? I basically want my terrain and the whole scene in front of me, wherever I am facing, at the press of a button. I have lightmaps baked into the terrain and objects. Initially I thought of setting everything non-static and rotating everything, including the light, through script, but the terrain is unaffected by rotation even when it is not static.
     
  47. amb3317

    amb3317

    Joined:
    Jun 24, 2016
    Posts:
    4
    Hey @jimmya,

    I had a Unity Project which was running smoothly on iOS (iPhone 7+). I then imported the ARKit package from the asset store, and have been running into strange bugs when I'm running the app on the iPhone, but not in the editor.

    Specifically, I'm receiving null reference exceptions when instantiating an object.

    I have the code:
    GameObject go = Resources.Load("World") as GameObject;
    Debug.Log(go.name);
    GameObject.Instantiate(go);

    The name of the gameObject will print correctly. But, Xcode reports a null reference error on the GameObject.Instantiate line. What's more, it's reporting the error as coming from a function which is never called in the codebase (this function is in the World.cs script attached to the World.prefab).

    The World.prefab is in a Resources folder, and everything works fine when I'm in the Editor (also when I'm using Unity Remote).

    Any help would be very much appreciated!

    Thank you!
     
  48. mimminito

    mimminito

    Joined:
    Feb 10, 2010
    Posts:
    780
    jimmya likes this.
  49. Aiursrage2k

    Aiursrage2k

    Joined:
    Nov 1, 2009
    Posts:
    4,835
    Okay how did you do that?
     
  50. tannerhearne

    tannerhearne

    Joined:
    Feb 13, 2015
    Posts:
    9
    Hey @lingoded — when you picked up the key, how did you tie the key to the camera's position and rotation? I'm trying to figure out how to have an object lock on to the camera's movements for a short time like that.
     
Thread Status:
Not open for further replies.