
Resolved ARKit support for iOS via Unity-ARKit-Plugin

Discussion in 'AR' started by jimmya, Jun 5, 2017.

Thread Status:
Not open for further replies.
  1. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    There are optimizations coming that will, hopefully, reduce the power usage. This is a pretty computationally heavy operation, though, so high heat/power usage on the device is expected.
     
    4NDDYYYY likes this.
  2. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    Indoor navigation is a topic that is being heavily researched by large companies and, while AR seems like the most likely medium to visualize this, it is out of the scope of my knowledge. The only location based APIs that are available from Apple at the moment are the Core Location APIs.

    This doesn't mean you can't map out a place with your own data and do something that works. I imagine you could use ARKit as your mapping tool. Identify locations, and distances between them, save that data out. Then either in the same app or another, you could read that data, identify where you are, and then say you want to go somewhere else. Then you could draw a virtual line from where you have identified yourself to be, to where you want to go.

    This sounds simple, but I imagine it is a lot of work.
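    As a rough sketch of the mapping idea described above (all class and file names here are made up for illustration): record named waypoints in the AR session's world space, serialize them, and reload them later to compute routes.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using UnityEngine;

[Serializable]
public class Waypoint
{
    public string name;
    public Vector3 position; // position in the AR session's world space
}

[Serializable]
public class WaypointMap
{
    public List<Waypoint> waypoints = new List<Waypoint>();

    // Persist the map with JsonUtility so it survives between sessions.
    public void Save(string path)
    {
        File.WriteAllText(path, JsonUtility.ToJson(this));
    }

    public static WaypointMap Load(string path)
    {
        return JsonUtility.FromJson<WaypointMap>(File.ReadAllText(path));
    }

    // Straight-line distance between two named waypoints.
    public float Distance(string from, string to)
    {
        Vector3 a = waypoints.Find(w => w.name == from).position;
        Vector3 b = waypoints.Find(w => w.name == to).position;
        return Vector3.Distance(a, b);
    }
}
```

    With something like this, the "mapping" app writes the file and the "navigation" app reads it back and draws a line from the user's identified position to the chosen waypoint.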
     
  3. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    Hi, without knowing your code, or having a video for reference, it is hard to know what is wrong with what you are doing. Could you provide more information?
     
  4. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    Hey, which shaders are on the materials that you are using? Are you sure you're changing the color on the correct game object? Can you provide more information and, possibly, code to show us what you are doing? Without any of this information it is hard for us to diagnose the issue and help you.
    Cheers,
    Chris
     
  5. macdude2

    macdude2

    Joined:
    Sep 22, 2010
    Posts:
    686
    @christophergoy Hey, not sure if this has been answered already, but it seems with the arkit remote, the touch input does not transfer over to the computer as well. Is this intended behavior or did I mess something up?
     
  6. christophergoy

    christophergoy

    Joined:
    Sep 16, 2015
    Posts:
    735
    You haven’t messed anything up :) the ARKit Remote doesn’t transfer touches or hit tests at the moment.
     
    MrThee likes this.
  7. macdude2

    macdude2

    Joined:
    Sep 22, 2010
    Posts:
    686
    Hmmm, so then it's recommended to make a PC-based input controller to test code? Wait, I also just saw that hit test is unsupported as well? So I guess we'll just have to raycast on the planes then?
     
    Last edited: Aug 3, 2017
  8. caglarmehmetmetin

    caglarmehmetmetin

    Joined:
    Aug 3, 2017
    Posts:
    4
    Hello guys! I recently started using this plug-in and I appreciate it. Well done!!
    I have a question: I could not find how to get the config variable of a session.
    I understand how to create a config and run it with ARSession. However, after I run the config, I want to change it later, and I couldn't find how. Basically, after some point I want to disable plane detection, but I cannot get the config variable, which holds the planeDetection info, from ARSessionNativeInterface. Any help would be appreciated. Thanks a lot!
     
  9. r3000

    r3000

    Joined:
    Aug 2, 2017
    Posts:
    12
    Thanks. This advice enabled progress!
    Updated to 2017.2, imported the latest version of the plugin. It compiles and runs! We even have a "Start Remote ARSession" button. But clicking doesn't trigger the real start (no green / live feed from the device)...

    How do I solve the Player Connection error?
    Searching by the error code turns up a fix involving editing the manifest, but I am unable to navigate to those variables through (my understanding of) the Unity editor.

    I imagine this will make the development cycle much faster for testing things like how well a model spawns into the local dev environment.

    PS. Is anyone good with snow?
    I've been trying to tweak the particle editor so that my snow falls more like the snow here – i.e. straight down in rays rather than in flurries like the Vimeo video.
     
    Last edited: Aug 4, 2017
  10. r3000

    r3000

    Joined:
    Aug 2, 2017
    Posts:
    12
    Does anyone know how to solve this Player Connection issue?
    I'm trying to set up a basic 'launch animation into session' app. That seems like the coolest thing ARKit does – loading animations into your scene. In this case I'm testing by trying to load a waterfall (particle effect) onto my coffee table. However, it does not connect (see vid). How does one go about getting ARKit to load in an animation like a water effect like this?



    This is the log:

    Autoconnected Player TypeLoadException: A type load exception has occurred.
    at System.Runtime.Serialization.Formatters.Binary.ObjectReader.GetDeserializationType (Int64 assemblyId, System.String className) [0x00000] in <filename unknown>:0
    at System.Runtime.Serialization.Formatters.Binary.ObjectReader.ReadTypeMetadata (System.IO.BinaryReader reader, Boolean isRuntimeObject, Boolean hasTypeInfo) [0x00000] in <filename unknown>:0
    at System.Runtime.Serialization.Formatters.Binary.ObjectReader.ReadObjectInstance (System.IO.BinaryReader reader, Boolean isRuntimeObject, Boolean hasTypeInfo, Int64& objectId, System.Object& value, System.Runtime.Serialization.SerializationInfo& info) [0x00000] in <filename unknown>:0
    at System.Runtime.Serialization.Formatters.Binary.ObjectReader.ReadNextObject (System.IO.BinaryReader reader) [0x00000] in <filename unknown>:0
    at System.Runtime.Serialization.Formatters.Binary.ObjectReader.ReadObjectGraph (BinaryElement elem, System.IO.BinaryReader reader, Boolean readHeaders, System.Object& result, System.Runtime.Remoting.Messaging.Header[]& headers) [0x00000] in <filename unknown>:0
    at System.Runtime.Serialization.Formatters.Binary.BinaryFormatter.NoCheckDeserialize (System.IO.Stream serializationStream, System.Runtime.Remoting.Messaging.HeaderHandler handler) [0x00000] in <filename unknown>:0
    at Utils.ObjectSerializationExtension.Deserialize[T] (System.Byte[] byteArray) [0x00000] in <filename unknown>:0
    at UnityEngine.XR.iOS.ConnectToEditor.HandleEditorMessage (UnityEngine.Networking.PlayerConnection.MessageEventArgs mea) [0x00000] in <filename unknown>:0
    at UnityEngine.Events.InvokableCall`1[T1].Invoke (System.Object[] args) [0x00000] in <filename unknown>:0
    at UnityEngine.Events.InvokableCallList.Invoke (System.Object[] parameters) [0x00000] in <filename unknown>:0
    at UnityEngine.Networking.PlayerConnection.PlayerEditorConnectionEvents.InvokeMessageIdSubscribers (Guid messageId, System.Byte[] data, Int32 playerId) [0x00000] in <filename unknown>:0
    at UnityEngine.Networking.PlayerConnection.PlayerConnection.MessageCallbackInternal (IntPtr data, UInt64 size, UInt64 guid, System.String messageId) [0x00000] in <filename unknown>:0

    (Filename: currently not available on il2cpp Line: -1)
     
    Last edited: Aug 4, 2017
  11. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    Did you build the UnityARKitRemote scene to your device from the latest project? It appears that the Remote side is expecting a different format of data from the one you're sending.
     
  12. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    You can call RunWithConfig (or RunWithConfigAndOptions) again with a new session config and/or runoptions, and it will restart the AR Session with the new config/options.
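    Since the plugin doesn't expose the running config, one approach is to keep your own copy and re-run whenever you change it. This is just a sketch; the struct and enum names are per my reading of the plugin, so check them against your version:

```csharp
using UnityEngine;
using UnityEngine.XR.iOS;

public class SessionConfigHolder : MonoBehaviour
{
    // Keep our own copy of the config, since the native interface
    // doesn't let us read back the one currently in use.
    private ARKitWorldTrackingSessionConfiguration config;

    void Start()
    {
        config = new ARKitWorldTrackingSessionConfiguration();
        config.alignment = UnityARAlignment.UnityARAlignmentGravity;
        config.planeDetection = UnityARPlaneDetection.Horizontal;
        config.enableLightEstimation = true;
        UnityARSessionNativeInterface.GetARSessionNativeInterface()
            .RunWithConfig(config);
    }

    public void SetPlaneDetection(bool on)
    {
        config.planeDetection = on ? UnityARPlaneDetection.Horizontal
                                   : UnityARPlaneDetection.None;
        // Re-running with the modified config restarts the session.
        UnityARSessionNativeInterface.GetARSessionNativeInterface()
            .RunWithConfig(config);
    }
}
```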
     
  13. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    One thing you could do is use a cross-platform touch input framework like TouchScript (written by our very own Valentin Simonov), or use mouse input which iOS simulates via touch. This way your code will work in Editor as well as device.
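    A minimal sketch of the mouse-input approach: `Input.GetMouseButtonDown(0)` also fires for the first touch on iOS, so the same code runs in the Editor (with a mouse) and on the device (with a finger). The `marker` field is a hypothetical object you'd assign in the Inspector.

```csharp
using UnityEngine;

public class TapToPlace : MonoBehaviour
{
    public GameObject marker; // hypothetical: assigned in the Inspector

    void Update()
    {
        // Mouse button 0 is simulated by touch on iOS, so this works
        // both in the Editor and on the device.
        if (Input.GetMouseButtonDown(0))
        {
            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            RaycastHit hit;
            // Raycast against colliders on the generated plane objects.
            if (Physics.Raycast(ray, out hit))
            {
                marker.transform.position = hit.point;
            }
        }
    }
}
```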
     
  14. r3000

    r3000

    Joined:
    Aug 2, 2017
    Posts:
    12
    Nope – (or at least I don't think so). I built the UnityARKitRemote scene to my device from a prior project. So I went onto my iOS device and launched the ARKitRemote app. Then I opened the default waterfall scene (which I labelled ARKit Waterfall 3D) and imported the ARKit plugin into that scene.
     
  15. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    You misunderstood me. Both the project you are trying to connect to Remote and the Remote itself should be built with the same version of Unity ARKit Plugin. Otherwise the data structures will be out of sync.
     
  16. caglarmehmetmetin

    caglarmehmetmetin

    Joined:
    Aug 3, 2017
    Posts:
    4
    Hello, thanks for the answer. When I restart the AR Session, will there be some information loss? I mean anchors, planes, etc.?

    I have one more question. When I do a HitTest, it returns false until the first plane is created, which is good. But after there are some planes, wherever I touch, HitTest returns true. Is there a reason for this behaviour?

    Thanks a lot!
     
  17. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    You can control what information is reset or not by using RunOptions. See the description of the kinds of HitTest you can query for (you may be using a test for an infinite plane?)
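    A sketch of querying a specific hit-test type, patterned on the plugin's HitTestExample: `ARHitTestResultTypeExistingPlaneUsingExtent` restricts hits to the detected extent of a plane rather than its infinite extension, which would explain hits "wherever I touch". Verify the enum and helper names against your plugin version.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.iOS;

public class BoundedHitTest : MonoBehaviour
{
    // Returns true only when the screen position hits a detected
    // plane within its tracked extent (not the infinite plane).
    bool TryHitPlane(Vector2 screenPos, out Vector3 worldPos)
    {
        Vector3 viewportPoint = Camera.main.ScreenToViewportPoint(screenPos);
        ARPoint point = new ARPoint { x = viewportPoint.x, y = viewportPoint.y };

        List<ARHitTestResult> results =
            UnityARSessionNativeInterface.GetARSessionNativeInterface().HitTest(
                point,
                ARHitTestResultType.ARHitTestResultTypeExistingPlaneUsingExtent);

        foreach (ARHitTestResult result in results)
        {
            worldPos = UnityARMatrixOps.GetPosition(result.worldTransform);
            return true;
        }
        worldPos = Vector3.zero;
        return false;
    }
}
```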
     
  18. r3000

    r3000

    Joined:
    Aug 2, 2017
    Posts:
    12
    Thanks. I started a new project and imported the latest ARKit plugin from the Asset Store. Built ARKitRemote to my iPhone 6s. Created a new scene. Inserted ARKitRemoteConnection into the root Assets folder. Imported the snow package and pulled the prefab into the scene. Clicked Play... the Start ARKit Session button appears. It asks for camera access on my phone. But then nothing happens. It continues to show 'Waiting for editor connection' on the phone.

    I understand that my above workflow should avoid any issues around using different plugin versions. Or am I missing something?
    Any ideas why it's not working / how to get it working so that I can tap a plane through the phone to launch a waterfall (using the Unity dev environment)?
     

    Attached Files:

  19. infocom_lab

    infocom_lab

    Joined:
    May 20, 2015
    Posts:
    1
    Hi

    UnityAppController.h seems to use a UIView instead of an ARSCNView.
    How do I convert RawFeaturePoints to a CGPoint, the way ARSCNView's projectPoint does?

    Thanks.
     
  20. caglarmehmetmetin

    caglarmehmetmetin

    Joined:
    Aug 3, 2017
    Posts:
    4
    Hello again! I really appreciate your help, sir! It has been very useful.
    I have one more question :)

    I use the HitTestExample with the given small cube. It works perfectly. When I walk around that small grey cube, it remains at the same position perfectly.

    I added another model, one unit to the left of this cube. When I touch, both are placed at the touched coordinates perfectly, but when I move around, the model I use is not stable anymore. The cube still remains perfectly at the touched position. But the model moves when I change direction or move a bit. Sometimes it is stable, but most of the time it is very unstable.

    Do you have any idea?

    Thanks a lot! Have a nice day!
     
  21. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    You need to add MainCamera and ARCameraManager gameobjects from EditorTestScene to your new scene.
     
  22. Bird_LYKKE

    Bird_LYKKE

    Joined:
    Jul 26, 2017
    Posts:
    15
    Can you help me with this message in Xcode?
    2017-08-04 20:03:49.815137+0700 xxxxx [1328:263105] [Technique] World tracking performance is being affected by resource constraints [1]
     
  23. r3000

    r3000

    Joined:
    Aug 2, 2017
    Posts:
    12
    OK, so I loaded the EditorTestScene, copied MainCamera and ARCameraManager, then loaded up the other scene again, pasted them in, and tried to get AR Remote working, but it's still not functioning. Any thoughts?

     
  24. MSFX

    MSFX

    Joined:
    Sep 3, 2009
    Posts:
    116
    ^ does the default "UnityARKitScene" test scene work? Have you tried that? Why don't you just add your waterfall to that scene?
     
  25. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
  26. mmortall

    mmortall

    Joined:
    Dec 28, 2010
    Posts:
    89
    Please move the DontDestroyOnLoad class into a namespace. It conflicts with my own code. Thanks.
     
  27. mmortall

    mmortall

    Joined:
    Dec 28, 2010
    Posts:
    89
    Is it possible to decrease the bitrate of the captured video in AR Remote mode? The iPhone heats up a lot and the video runs very slowly in Remote mode.
     
  28. r3000

    r3000

    Joined:
    Aug 2, 2017
    Posts:
    12
    Yes it works! There's a waterfall in my house! LOL
     
  29. pgeorges-dpt

    pgeorges-dpt

    Joined:
    Apr 7, 2016
    Posts:
    43
    Hi,

    I'm trying to get ARKit to work in my project, and for some reason the perspective is messed up. I run the examples built from my project and they work fine. When I copy the managers over to my own scene, I take care to make sure all the settings on the camera are correct and that the camera is linked where it must be. When running the scene, the objects I track feel like they are being seen by a camera with a really small field of view. I don't get what's happening. I added the box with the direction gizmos from the hit test example scene to my scene and it isn't showing like in the hit test example. Any ideas? Does a multi-camera setup screw with ARKit?

    Cheers,

    Paul
     
    Last edited: Aug 6, 2017
  30. janla

    janla

    Joined:
    May 8, 2013
    Posts:
    15
    @jimmya:
    I'm having the same problem (and can also confirm the spinning planes) Any news on this?

    Thanks!
     
  31. lourd

    lourd

    Joined:
    May 20, 2017
    Posts:
    15
    Hey Chris, thanks for the reply. The model is just using a Standard shader. Here's the relevant snippet that isn't working for me:

    GameObject obj = Instantiate (modelPrefab, atPosition, Quaternion.identity);
    MeshRenderer renderer = obj.GetComponent<MeshRenderer> ();
    renderer.material.color = Color.blue;

    The object is white, the color set for the Albedo value for the prefab in the inspector, instead of the expected blue.
     
  32. r3000

    r3000

    Joined:
    Aug 2, 2017
    Posts:
    12
    Has anyone got hit testing to work with the ARKit plugin (for faster iteration)?

    I can get Remote working but am unable to use a hit test to drop a plane/object in the iPhone (connected via ARKit Remote) viewport. The reason this would be valuable is that I'm trying to figure out how to get a snowing cloud to spawn and scale appropriately in the real-world environment when the screen is touched on the ARKit device.
    Thanks!
     
    Last edited: Aug 6, 2017
  33. marcipw

    marcipw

    Joined:
    Apr 18, 2013
    Posts:
    239
    I was planning on jumping into ARKIT sooner or later but a meeting with one of my clients on Friday spurred me on to get some compatible hardware, in the form of a shiny new iPad Pro! I am very excited to start working with this revolutionary AR milestone!
     
    jimmya likes this.
  34. John1515

    John1515

    Joined:
    Nov 29, 2012
    Posts:
    248
    Hi

    I'm trying to orient a compass in AR to the magnetic north in real life :)
    So far I have this code, but it is not always working:

    Code (CSharp):
    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;

    public class OrientNorthSimple : MonoBehaviour {

        void Start(){
            Input.compass.enabled = true;
            StartCoroutine(Example());
        }

        IEnumerator Example()
        {
            yield return new WaitForSeconds(3);
            transform.rotation = Quaternion.Euler(0, -Input.compass.magneticHeading, 0);
            Debug.Log (transform.eulerAngles + " " + Input.compass.magneticHeading);
        }
    }
    Right now it uses one measurement, which is tricky with the phone's compass as it could be wrong. I would like to monitor it for a while and use the average.
    Also, the camera's Y axis is not reliable when the phone is facing the floor...
    Any ideas? :)
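    One way to average several readings, as the post suggests: sum unit vectors instead of raw angles so the wrap-around at 0/360 degrees doesn't skew the result. The sample count and delays below are guesses to tune for your device.

```csharp
using System.Collections;
using UnityEngine;

public class OrientNorthAveraged : MonoBehaviour
{
    void Start()
    {
        Input.compass.enabled = true;
        StartCoroutine(OrientAfterAveraging(20));
    }

    IEnumerator OrientAfterAveraging(int samples)
    {
        yield return new WaitForSeconds(3); // let the compass warm up

        // Accumulate headings as unit vectors so 359° and 1° average
        // to ~0° instead of ~180°.
        Vector2 sum = Vector2.zero;
        for (int i = 0; i < samples; i++)
        {
            float rad = Input.compass.magneticHeading * Mathf.Deg2Rad;
            sum += new Vector2(Mathf.Sin(rad), Mathf.Cos(rad));
            yield return new WaitForSeconds(0.1f);
        }

        float average = Mathf.Atan2(sum.x, sum.y) * Mathf.Rad2Deg;
        transform.rotation = Quaternion.Euler(0, -average, 0);
    }
}
```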
     
  35. lourd

    lourd

    Joined:
    May 20, 2017
    Posts:
    15
    Have you tried doing the rotation setting in `Update` instead of in a coroutine? What's your reason for not updating it every frame?
     
  36. quitebuttery

    quitebuttery

    Joined:
    Mar 12, 2011
    Posts:
    329
    So is there a good way to pause or otherwise stop the plane generation but keep tracking the existing ones? I noticed there's a pause method in the plugin, but no resume. And I guess it's suggested here that there's no real concept of pausing ARKit. I'd like to stop plane generation once the game begins--essentially keeping all the existing anchors where they are.
     
  37. John1515

    John1515

    Joined:
    Nov 29, 2012
    Posts:
    248
    That's what I thought at first, but actually you only need to do it once, because afterwards ARKit's tracking does all the rotation.
    It is also possible to do it all the time, but because of the iPhone's internal compass it would change a bit all the time, and thus move. For my use case it is better if it is set once (it does not need to be super accurate) and then it should not rotate anymore.

    My script almost does that, but it is difficult when that one measurement is wrong..

    The coroutine, by the way, is because the compass doesn't necessarily return a value right away.. it needs to "warm up". Oh well.. maybe it is good enough now that I think about it :)
     
  38. phits

    phits

    Joined:
    Aug 31, 2010
    Posts:
    41
    I am getting the following error with the EditorTestScene: "Failed to connect to player". See the attached screenshot. I started with a new project, running Unity 2017.1.0f3 and Xcode 9.0 beta 4. Any suggestions on how to get it to work?
     

    Attached Files:

  39. John1515

    John1515

    Joined:
    Nov 29, 2012
    Posts:
    248
    I would really like to show (virtual) reflections on the (real) floor. I can't find any way to do this :(
    Something similar to the mobileARshadows shader..
     
  40. phits

    phits

    Joined:
    Aug 31, 2010
    Posts:
    41
    I figured it out. I didn't have the remote app running on the iPhone.
     
  41. GMM

    GMM

    Joined:
    Sep 24, 2012
    Posts:
    301
    Personally I think you should avoid reflections, since they are expensive and have a high probability of making things look a bit more unrealistic than intended, because you know nothing about the surface you are hitting. If you always know the type of surface you will be projecting onto (for a demo or usage in a controlled environment), then it could be a good idea to implement a feature like that.

    You could start to look at something like this and do the necessary changes:
    http://wiki.unity3d.com/index.php/MirrorReflection3

    The wiki site is down at the moment though, so use this link in the meanwhile:
    http://webcache.googleusercontent.c...p/MirrorReflection3+&cd=1&hl=en&ct=clnk&gl=dk
     
    John1515 likes this.
  42. Bird_LYKKE

    Bird_LYKKE

    Joined:
    Jul 26, 2017
    Posts:
    15
    If UnityARVideo cannot get the picture from the device camera, how do I detect that in code?
     
  43. quitebuttery

    quitebuttery

    Joined:
    Mar 12, 2011
    Posts:
    329
    It would be cool to build a cube map out of the video feed on the fly and use that for reflections. It won't be super accurate, but if the surface is rough enough it wouldn't really matter. It would be great for lighting as well, for IBL shaders.
     
  44. GMM

    GMM

    Joined:
    Sep 24, 2012
    Posts:
    301
    I have created a thing that uses the camera feed to create a pseudo-reflection that integrates with the standard Unity shading pipeline:


    I still think it's a bit too slow for mobile devices, and the effect can be a bit harsh on very reflective surfaces (as the reflection by nature is not correct, since it never has the full 360° picture), but it can absolutely be done.
     
    John1515 likes this.
  45. PdrMnts

    PdrMnts

    Joined:
    Dec 12, 2016
    Posts:
    13
    Is anyone having problems recording in landscape with the Screen Recorder? Since the last update of iOS 11, the saved video is frozen. If I record in portrait the Screen Recorder works fine, but my Unity app is in landscape, so ...
     
  46. shazamsoyeux

    shazamsoyeux

    Joined:
    Jun 14, 2016
    Posts:
    3
    I'm looking to switch from Vuforia to ARKit, but I'm having problems with the rotations when I switch. How did you manage to do it?
     
  47. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    I think this was answered before, but you can restart via RunWithConfigAndOptions with a config that doesn't do plane detection and run options that keep the existing anchors.
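    A sketch of that suggestion: re-run with plane detection off and run options that neither reset tracking nor remove anchors. The cast to an empty option set is my guess at "keep everything"; check the UnityARSessionRunOption flags in your plugin version.

```csharp
using UnityEngine.XR.iOS;

public static class FreezePlanes
{
    public static void StopPlaneDetectionKeepAnchors()
    {
        ARKitWorldTrackingSessionConfiguration config =
            new ARKitWorldTrackingSessionConfiguration();
        config.alignment = UnityARAlignment.UnityARAlignmentGravity;
        config.planeDetection = UnityARPlaneDetection.None;
        config.enableLightEstimation = true;

        // Passing neither ARSessionRunOptionResetTracking nor
        // ARSessionRunOptionRemoveExistingAnchors should leave the
        // session's anchors and tracking state intact.
        UnityARSessionNativeInterface.GetARSessionNativeInterface()
            .RunWithConfigAndOptions(config, (UnityARSessionRunOption)0);
    }
}
```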
     
    SourceKraut likes this.
  48. jimmya

    jimmya

    Joined:
    Nov 15, 2016
    Posts:
    793
    You might also want to look into the "GravityAndHeading" setting in the startup config. It currently seems to mess up the planes detected, but once you have established a starting orientation, you may be able to restart with the "Gravity" setting in the config.
     
    John1515 likes this.
  49. John1515

    John1515

    Joined:
    Nov 29, 2012
    Posts:
    248
    Hi, thanks. While they are indeed expensive, ARKit only runs on high-end devices anyway, which can handle these kinds of reflections quite well; the A9 and A10 chips are very powerful. The realism depends on what the developer does with it.

    I am looking for this effect (with the blurriness, which I know is really hardcore):


    I think Unity should pay attention to materials that are common in CGI packages in order to blend real-life footage and virtual scenes.
    In 3ds Max there is a material called Matte that does what mobileshadow and mobileocclusion do in Unity's ARKit plugin, but it also does a whole lot more.

    For developing great ARKit experiences, it would be awesome if Unity had an all-round "special effects" material available in an easy-to-use way. It should handle shadows, occlusion, possibly bump maps, reflections, and maybe some sort of transparency. I think it would even be possible to do refractions of the underlying video feed and distort the video feed with a normal map.


    Although not really ARKit related, I do experience problems with horizontal video having the wrong aspect ratio. Hopefully Apple will fix this later.
     
  50. DavidErosa

    DavidErosa

    Joined:
    Aug 30, 2012
    Posts:
    41
    Same here. Recording a video of my app in landscape creates a portrait video with the wrong aspect. I guess this is related to the latest iOS beta (4 right now), because I can't recall having this problem with beta 3.
     