
AR Foundation Editor Remote | Test and debug your AR project in the Editor

Discussion in 'Assets and Asset Store' started by KirillKuzyk, May 26, 2020.

  1. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    757

    AR Foundation Remote 2.0 is available now on the Asset Store!

    Read the blog post


    AR Foundation Editor Remote (v1.0)
    AR Foundation Editor Remote (v1.0) is not going anywhere and will remain an essential AR debugging tool for years to come. Existing customers will receive the same high-quality support as before and can upgrade to v2.0 anytime at a discount.

    Plugin blog posts: #1 #2 #3 #4 #5 #6 #7 #8 #9 #10 #11 #12


    Debugging any AR project is a nightmare. Currently, you're required to wait for a build after any minor code change. This is annoying and unproductive.

    I developed a plugin that enables you to stream a remote session from your mobile device to the Editor. There were several older implementations that worked with ARKit, but my solution is based on AR Foundation and works seamlessly with both ARKit and ARCore. Theoretically, the plugin should support meshing on Magic Leap and HoloLens too. I have no way to verify this without the actual devices, so if you have one, your help would be much appreciated (HoloLens, Magic Leap).



    Fast iterations are crucial for development. But currently, you're required to make a new build after any minor change. And builds take a looooong time even for small projects. Now you have the solution!

    AR Foundation Editor Remote is an Editor extension that allows you to transmit AR data from an AR device to the Unity Editor. Run and debug AR projects right in the Unity Editor!


    Current workflow with AR Foundation

    1. Make a change to your AR project.
    2. Build project to a real AR device.
    3. Wait for the build to complete.
    4. Wait.
    5. Wait a little bit more.
    6. Test your app on a real device using only Debug.Log().


    Improved workflow with AR Foundation Editor Remote

    1. Set up the AR Companion app once. The setup takes only a few minutes.
    2. Make a change to your AR project.
    3. Just press play! Run and debug your AR app with full access to scene hierarchy and all object properties right in the Editor!


    Features

    • Precisely replicates the behavior of a real AR device in Editor.
    • Supports all AR Foundation platforms. Extensively tested with ARKit and ARCore.
    • Plug-and-play: no additional scene setup is needed; just run your AR scene in the Editor with the AR Companion running. Extensively tested with scenes from the AR Foundation Samples repository.
    • Streams video from Editor to real AR device so you can see how your app looks on it without making a build (see Limitations).
    • Multi-touch input remoting: test multi-touch input or simulate touch using mouse in Editor (see Limitations).
    • Written in pure C# with no third-party libraries or native code. Adds no performance overhead in production. Full source code is available.
    • Connect any AR Device to Windows PC or macOS via Wi-Fi: iOS + Windows PC, Android + macOS... any variation you can imagine!
    • Supports wired connection on iOS + macOS.


    ⚡ Supported AR subsystems ⚡

    • Meshing (ARMeshManager): physical environment mesh generation, ARKit mesh classification support.
    • Occlusion (AROcclusionManager): ARKit depth/stencil human segmentation, ARKit/ARCore environment occlusion (see Limitations).
    • Face Tracking: face mesh, face pose, eye tracking, ARKit Blendshapes.
    • Body Tracking: ARKit 2D/3D body tracking, scale estimation.
    • Plane Tracking: horizontal and vertical plane detection, boundary vertices, raycast support.
    • Image Tracking: supports mutable image library and replacement of image library at runtime.
    • Depth Tracking (ARPointCloudManager): feature points, raycast support.
    • Camera: camera background video (see Limitations), camera position and rotation, facing direction, camera configurations.
    • Camera CPU images: ARCameraManager.TryAcquireLatestCpuImage(), XRCpuImage.Convert(), XRCpuImage.ConvertAsync() (see Limitations).
    • Anchors (ARAnchorsManager): add/remove anchors, attach anchors to detected planes.
    • Session subsystem: Pause/Resume, receive Tracking State, set Tracking Mode.
    • Light Estimation: Average Light Intensity, Brightness, and Color Temperature; Main Light Direction, Color, and Intensity; Exposure Duration and Offset; Ambient Spherical Harmonics.
    • Raycast subsystem: raycast against detected planes and cloud points (see Limitations).
    • Object Tracking: ARKit object detection after scanning with scanning app (see Limitations).
    • ARKit World Map: full support of ARWorldMap. Serialize current world map, deserialize saved world map and apply it to current session.


    Requirements

    • Unity >= 2019.2.
    • AR Device (iPhone with ARKit support, Android with ARCore support, etc.).
    • AR Device and Unity Editor should be on the same Wi-Fi network (a wired connection is supported on iOS + macOS).
    • AR Foundation >= 3.0.1.


    Limitations

    • Please check that your AR device supports the AR feature you want to test in Editor. For example, to test Meshing in Editor, your AR device should support Meshing.

    • Video streaming and occlusion textures:
    - Are supported only if Editor Graphics API is set to Direct3D11 or Metal.
    - Default resolution scale is 0.33. You can increase the resolution in the plugin's Settings, but this will result in higher latency and lower framerate.
    - Windows Unity Editor 2019.2: video and occlusion are not supported.

    • Raycast subsystem: ARRaycastManager is implemented on top of ARPlaneManager.Raycast() and ARPointCloudManager.Raycast(). Please add ARPlaneManager to your scene to raycast against detected planes and ARPointCloudManager to raycast against detected cloud points.
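    The workaround above can be sketched like this (a minimal example based on AR Foundation's public API; the component wiring and the screen-center ray are illustrative):

    ```csharp
    using Unity.Collections;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARSubsystems;

    // Hypothetical example: raycast against detected planes via ARPlaneManager.Raycast(),
    // which is what the plugin's ARRaycastManager support is implemented on top of.
    public class PlaneRaycastExample : MonoBehaviour
    {
        [SerializeField] ARPlaneManager planeManager;
        [SerializeField] Camera arCamera;

        void Update()
        {
            var ray = arCamera.ScreenPointToRay(new Vector2(Screen.width / 2f, Screen.height / 2f));
            // ARPlaneManager.Raycast() returns a NativeArray that must be disposed.
            using (var hits = planeManager.Raycast(ray, TrackableType.PlaneWithinPolygon, Allocator.Temp))
            {
                if (hits.IsCreated && hits.Length > 0)
                    Debug.Log("Hit plane at " + hits[0].pose.position);
            }
        }
    }
    ```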

    • Touch input remoting and simulation:
    - Only Input Manager is supported (UnityEngine.Input).
    - Unity 2019.2: please add this line at the top of every script that uses UnityEngine.Input:
    using Input = ARFoundationRemote.Input;
    - Unity 2019.2: UI system will not respond to touch events. Please use your mouse to test UI in Editor.
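    Assuming ARFoundationRemote.Input mirrors the UnityEngine.Input touch API (as the alias suggests), a script using the workaround could look like this:

    ```csharp
    using UnityEngine;
    // Unity 2019.2 workaround from above: route Input calls through the plugin's class.
    using Input = ARFoundationRemote.Input;

    public class TouchLogger : MonoBehaviour
    {
        void Update()
        {
            // With the alias, the same touch queries work on device and in the Editor.
            for (int i = 0; i < Input.touchCount; i++)
            {
                Touch touch = Input.GetTouch(i);
                if (touch.phase == TouchPhase.Began)
                    Debug.Log("Touch " + touch.fingerId + " began at " + touch.position);
            }
        }
    }
    ```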

    • ARKit Object Tracking:
    - Adding a new object reference library requires a new build of the AR Companion app.
    - Setting arSession.enabled = false after arSession.Reset() will on rare occasions crash the AR Companion app because of this bug.

    • Camera CPU images:
    - ARCameraManager.TryAcquireLatestCpuImage() is not synchronized with the latest camera position.
    - Only one XRCpuImage.ConvertAsync() conversion is supported at a time.
    - CPU image conversions complete successfully, but with a delay of several frames.
    - Occlusion CPU images (TryAcquireHumanStencilCpuImage, TryAcquireHumanDepthCpuImage, TryAcquireEnvironmentDepthCpuImage, TryAcquireEnvironmentDepthConfidenceCpuImage) are NOT supported. As an alternative, you can use Graphics.Blit() to copy textures and access them on CPU (see Texture2DSerializable.cs for example).
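    The Graphics.Blit() alternative can be sketched as follows (a hedged example using standard Unity APIs; see Texture2DSerializable.cs in the plugin for the actual implementation):

    ```csharp
    using UnityEngine;

    public static class TextureReadbackExample
    {
        // Copy any GPU texture (e.g. an occlusion texture) into a CPU-readable Texture2D.
        public static Texture2D CopyToCpu(Texture source)
        {
            var rt = RenderTexture.GetTemporary(source.width, source.height, 0);
            Graphics.Blit(source, rt);          // GPU-side copy into a render texture
            var previous = RenderTexture.active;
            RenderTexture.active = rt;
            var result = new Texture2D(source.width, source.height, TextureFormat.RGBA32, false);
            result.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0); // GPU -> CPU readback
            result.Apply();
            RenderTexture.active = previous;
            RenderTexture.ReleaseTemporary(rt);
            return result;
        }
    }
    ```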



    Video review from Dilmer Valecillos:
     
    Last edited: Sep 28, 2021
  2. Render_Man

    Render_Man

    Joined:
    May 29, 2019
    Posts:
    2
    Works well. No hassles in integration. This is an incredible time saver.

    AR is particularly difficult to iterate on and desperately needs a good, high-frequency remote. The update rate of this remote is very high and is suitable for any kind of AR work. I did not see significant latency either.

    Real-time video playback is a nice-to-have, but the plugin is totally functional without it.
     
  3. gandhars

    gandhars

    Joined:
    Apr 2, 2015
    Posts:
    3
    Hello All,
    this asset works like a charm and it's a big time saver for me. With this tool, you'll never need to build the whole application just to test AR.
    I had a small problem with Git (I didn't have it installed), and the plugin needs Git to download the package.
    The author's support is also great: I had a problem with my code (the raycasting part) and got a lot of advice on how to solve it.
    Radek Hart
     
  4. alesandramq

    alesandramq

    Joined:
    Aug 20, 2017
    Posts:
    1
    Installation was straightforward and I got it working on both Android and iOS, but I did come across a few bugs, or maybe features that have not yet been implemented.
    For instance, I had some mobile screen touch inputs that generate game objects; these were not working when running: I would tap on my phone screen and nothing would spawn. I changed them to PC controls (on click), and then it worked in the Unity Editor but not on the phone. Moreover, the camera was not displayed in the Unity Editor either, just a black background; it did, however, show the camera feed on the phone.
    Nevertheless, these were minor inconveniences. Once I changed the interaction to PC clicks, I was able to debug and program much more quickly and without hassle, so thank you for that, sir!
     
  5. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    757
    Huge thanks for the feedback!

    I'm sorry about the Git part; I didn't realize that the Unity Package Manager doesn't come with Git included.

    Input methods for the Editor and mobile are different, and you should handle them differently. My plugin handles only the remoting part. I'm glad you managed to fix the issue in the end.

    UPDATE:
    Touch input remoting is already available starting from version 3.5.1:
    https://forum.unity.com/threads/ar-...ject-in-the-editor.898433/page-2#post-5986280

    The camera background is not currently implemented. In my experience, the camera background is not a crucial part of Editor testing.

    UPDATE:
    Camera background is supported starting from version 3.8.9:
    https://forum.unity.com/threads/ar-...ject-in-the-editor.898433/page-2#post-6056636
     
    Last edited: Jul 9, 2020
    unnanego likes this.
  6. jgmakes

    jgmakes

    Joined:
    Jan 10, 2019
    Posts:
    62
    It seems like supporting the input methods that are already coded into our apps would be important. Is there a way to automatically capture the mobile touch events and spoof the equivalent desktop clicks? Otherwise, users of your remote will need to write a bunch of extra code to make their apps' interactions testable in the remote.
     
    multimediamarkers and soorya696 like this.
  7. soorya696

    soorya696

    Joined:
    Dec 13, 2018
    Posts:
    71
    +1
     
    multimediamarkers likes this.
  8. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    757
    This idea is great! But, unfortunately, there is no simple way to simulate touches in the Editor.

    1. Unity already translates Input.GetMouseButtonDown(0), Input.GetMouseButton(0) and Input.GetMouseButtonUp(0) into touch events on mobile. You can use these methods if your app does not handle multiple touches at once.

    2. There is no way to substitute Input.touches with another implementation in the Editor. While I could write an InputWrapper class that translates mouse events into touch gestures, users would be required to replace all usages of the Input class with InputWrapper. This is not a great solution because your code would then depend on my plugin. In addition, this method would still not be able to simulate multitouch.

    3. If your app needs to handle multitouch correctly on all platforms, there is a free TouchScript plugin. It works perfectly and can simulate multitouch in the Editor. If you look into its source code, you'll understand that touch simulation in Editor is far from an easy task :)

    To sum it all up, I wanted my plugin to solve one problem and solve it well.
    I will think about how to write the InputWrapper class so that it has the smallest possible impact on existing projects.
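    Point 1 above can be sketched as a single code path that works with the mouse in the Editor and with the first touch on a device (single-touch only):

    ```csharp
    using UnityEngine;

    public class SingleTouchExample : MonoBehaviour
    {
        void Update()
        {
            // On mobile, Unity maps the first touch to mouse button 0,
            // so this code works unchanged in the Editor and on device.
            if (Input.GetMouseButtonDown(0))
                Debug.Log("Press began at " + Input.mousePosition);
            if (Input.GetMouseButton(0))
                Debug.Log("Press held at " + Input.mousePosition);
            if (Input.GetMouseButtonUp(0))
                Debug.Log("Press ended at " + Input.mousePosition);
        }
    }
    ```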


    UPDATE:
    Touch input remoting is already available starting from version 3.5.1:
    https://forum.unity.com/threads/ar-...ject-in-the-editor.898433/page-2#post-5986280
     
    Last edited: Jul 9, 2020
  9. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    757
    Here is the script to handle touches in Editor. It simulates only one-finger gestures.
    I'll add it to the next version of the plugin.

    It is the cleanest solution that requires the least amount of changes to your codebase: you only need to add this line at the top of all scripts that use UnityEngine.Input:
    using Input = ARFoundationRemote.Input;
     

    Attached Files:

    Last edited: May 28, 2020
  10. makaka-org

    makaka-org

    Joined:
    Dec 1, 2013
    Posts:
    513
    Hi, What about Face Tracking?
     
  11. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    757
    Face tracking is on the roadmap. It seems like it's the most requested feature.
     
    ina likes this.
  12. olid

    olid

    Joined:
    Jan 10, 2019
    Posts:
    1
    Great, just what I needed – dunno why Unity dropped ARRemote in 2019. I'm just starting AR for a new project, and I think this is essential for the dev/build/test cycle.
     
  13. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    757
    PLUGIN UPDATE POST #1


    Hello again, AR remoters!
    The new AR subsystem has arrived!

    I spent the whole of last week developing Face Tracking remoting, and here it is in action:


    Supported features:
    • face mesh
    • face pose
    • eye tracking
    Please tell me what other subsystems you would like to see in the future :)


    All current users will receive the update with Face Tracking for free.


     
    Last edited: Jul 28, 2020
    petey, makaka-org and fherbst like this.
  14. makaka-org

    makaka-org

    Joined:
    Dec 1, 2013
    Posts:
    513
    Cool. And what about Android?
     
  15. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    757
    The short answer: Android is supported :)

    The long answer: the plugin is platform-agnostic so if your Android device supports face tracking, then Face Tracking will work in Unity Editor.
     
    makaka-org likes this.
  16. zulaman

    zulaman

    Joined:
    May 12, 2017
    Posts:
    26
    This is really awesome. Would it be possible to add the ARKit Blendshape feature? It's the only AR feature we really need, so if you add it, I'll definitely buy it.
     
  17. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    757
    zulaman, yes, iOS Blendshapes are possible! If everything goes fine, I'll release them tomorrow. They are already working, I just need to make minor refinements.
     
    fherbst likes this.
  18. jpvanmuijen

    jpvanmuijen

    Joined:
    Aug 23, 2012
    Posts:
    13
    I'm having trouble with raycasting too. Could you elaborate a bit on your case?
    Basically, I'm just trying to cast a ray from the center of the screen to a plane and place something at that location. Nothing major, I would assume.
    Code (CSharp):
    public GameObject placementIndicator;
    private ARRaycastManager _arRaycastManager;
    public Camera currentCam;
    private Vector2 screenCenter;

    static List<ARRaycastHit> hits = new List<ARRaycastHit>();

    private void Awake()
    {
        _arRaycastManager = GetComponent<ARRaycastManager>();
    }

    void Start()
    {
        screenCenter = currentCam.ViewportToScreenPoint(new Vector3(0.5f, 0.5f));
    }

    void Update()
    {
        if (_arRaycastManager.Raycast(screenCenter, hits, TrackableType.Planes))
        {
            var hitPose = hits[0].pose;
            placementIndicator.transform.SetPositionAndRotation(hitPose.position, hitPose.rotation);
        }
    }
    This works when I build the app in the usual manner, but I can't seem to receive the raycast back from my device. How would I go about this using this plugin?

    Thanks in advance!
     
  19. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    757
    @jpvanmuijen, ARRaycastManager is a manager for XRRaycastSubsystem, and XRRaycastSubsystem is not currently implemented in the plugin.

    You can find MultiTouchRaycastExample.cs included in the plugin, which shows how to raycast against tracked planes and cloud points with the help of ARPlaneManager.Raycast() and ARPointCloudManager.Raycast().

    Please tell me if this helps.


    UPDATE:
    The plugin already supports ARRaycastManager. Please find the updated example attached.
     

    Attached Files:

    Last edited: Jun 24, 2020
  20. jpvanmuijen

    jpvanmuijen

    Joined:
    Aug 23, 2012
    Posts:
    13
    Thanks for your quick reply!
    I had a look at your script before, but couldn't quite figure out how to shape it the way I wanted.
    But after your post, I tweaked it a bit here and there and now it's working like a charm. I mainly had to change the return type to a Pose and make the object rotate along with the camera y-axis.
    Here's my code, if anyone is interested. The lockUnlockObject method is to lock the object in its current position via a button.
    Suggestions are welcome by the way.

    Code (CSharp):
    [SerializeField] bool hitPlanes = true;

    public GameObject objectToPlace;
    public Camera currentCam;

    private ARPlaneManager _arPlaneManager;
    private Vector2 screenCenter;
    private bool objectLocked;

    private void Awake()
    {
        _arPlaneManager = GetComponent<ARPlaneManager>();
    }

    void Start()
    {
        screenCenter = currentCam.ViewportToScreenPoint(new Vector3(0.5f, 0.5f));
    }

    void Update()
    {
        var ray = currentCam.ScreenPointToRay(screenCenter);
        if (!objectLocked)
        {
            // Pose is a struct, so tryHitPlanes returns a nullable Pose
            // to make the "no hit" check below possible.
            var hitPose = tryHitPlanes(ray);
            if (hitPose != null)
            {
                objectToPlace.transform.SetPositionAndRotation(hitPose.Value.position, Quaternion.Euler(0, currentCam.transform.eulerAngles.y, 0));
            }
        }
    }

    Pose? tryHitPlanes(Ray ray)
    {
        if (hitPlanes && _arPlaneManager != null)
        {
            using (var hits = _arPlaneManager.Raycast(ray, TrackableType.Planes, Allocator.Temp))
            {
                if (hits.IsCreated && hits.Any())
                {
                    return hits.First().pose;
                }
            }
        }
        return null;
    }

    public void lockUnlockObject()
    {
        objectLocked = !objectLocked;
    }
    Thanks again, this saves me quite a bit of time!
     
    Last edited: Jun 4, 2020
    godril and WanSCAD like this.
  21. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    757
    DrSharky likes this.
  22. zulaman

    zulaman

    Joined:
    May 12, 2017
    Posts:
    26
    Excellent, I just got it from the Asset Store... looks like it's live now. Congrats.
     
  23. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    757
    Huge thanks for your support!
    Please take a little bit of your time and leave an honest review :)
     
  24. zulaman

    zulaman

    Joined:
    May 12, 2017
    Posts:
    26
    Absolutely. I do have a question.
    I've downloaded the plugin from the Asset Store on my PC and Mac, and it looks like they have different scenes.
    The Mac version has ARFoundationRemote.Sender / Receiver.
    The PC version has several scenes like FaceReceiver and FaceSender.
    I can't tell which one is the most recent and which one I should use.
    Thanks
     
  25. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    757
    The most recent one is the version with FaceReceiver and FaceSender; this is the version where face tracking was added.
    I'll add a version number in a future update to make things clear.
     
  26. leeprobert

    leeprobert

    Joined:
    Feb 12, 2015
    Posts:
    26
    Do you need to use your own plane prefabs? I get this error when running the app as 'sender' and testing the same scene as 'receiver' in the editor:
    Code (CSharp):
    NullReferenceException: Object reference not set to an instance of an object
    ARFoundationRemote.Editor.PlaneSubsystem+ARemotePlaneSubsystemProvider.GetChanges
     
  27. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    757
    leeprobert, you can use your own plane prefab if you want, or the default one (AR Default Plane).

    But please let me clarify the intended usage of the plugin:
    1. You build a sender scene (FaceSender or PlaneAndCloudPointsSender) once and keep it running on your AR device. Please do not modify the sender scenes; use them as they are.
    2. You develop your AR app in a different scene and use the sender scene as a data source for the Unity Editor. The sender scene will not reflect the changes you make in your project; it only feeds AR data to the Editor.
    3. When you're ready to test your AR app on a real device, you make a new build.

    I'll add this description to the Documentation in the next version of the plugin.
    Please tell me whether these instructions are clear.
     
  28. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    757
    Last edited: Jul 28, 2020
  29. hmkn

    hmkn

    Joined:
    Nov 7, 2019
    Posts:
    23
    I would like to send raycast info remotely.
    Is there a way to hook raycast events on the device?
     
  30. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    757
    Last edited: Jun 24, 2020
  31. zulaman

    zulaman

    Joined:
    May 12, 2017
    Posts:
    26
    I just tried the new blendshape example and it worked perfectly for me out of the box.
    I posted a 5 star review and thanks again for this awesome package.
     
  32. hmkn

    hmkn

    Joined:
    Nov 7, 2019
    Posts:
    23
    I would like to propagate the raycast data from the device to the Editor, since the screen tap happens on the device. Similarly, is there a way to synchronize tap events from the device to the Editor?
     
  33. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    757
    I'm glad to hear this! Thanks for the review!

    If you don't want to spend your time writing your own implementation, you can use this solution :)
    https://forum.unity.com/threads/ar-...ar-project-in-the-editor.898433/#post-5936363

    Yes, it's possible. I'll consider adding this feature in the future.
     
  34. multimediamarkers

    multimediamarkers

    Joined:
    Feb 17, 2016
    Posts:
    44
    Does AR Foundation Editor Remote also support image tracking with an ImageLibrary, or just planes/faces?
     
  35. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    757
    Unfortunately, Image Tracking is currently not implemented.

    Only these subsystems are supported:
    1. Face Tracking: face mesh, face pose, eye tracking, ARKit Blendshapes.
    2. Plane Tracking: horizontal and vertical plane detection, boundary vertices, raycast support.
    3. Depth Tracking: cloud points, raycast support.
    4. Camera position and rotation.
    5. Remote control over AR Session lifetime: create/destroy, enable/disable, receive tracking state.
     
  36. multimediamarkers

    multimediamarkers

    Joined:
    Feb 17, 2016
    Posts:
    44
    OK, thanks for the quick response... but is it on your backlog to implement in the future?
     
  37. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    757
    I do. Ideally, I want to implement all the subsystems out there. I can't give you an ETA on Image Tracking, but I will do my best.
     
  38. hkbook

    hkbook

    Joined:
    Sep 9, 2017
    Posts:
    3
    Hello, it always fails to connect.
     
  39. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    757
    Could you please check that your device is unlocked, that a Sender scene (FaceSender or PlaneAndCloudPointsSender) is running, and that a "Waiting for Editor connection..." message is shown?

    If you don't see your device in the Console, please try restarting the Sender scene on your device.
     
  40. SirLouLou

    SirLouLou

    Joined:
    Jan 27, 2020
    Posts:
    1
    Hi, I just bought/downloaded your plugin. I've got a project that I'm working on for my master's thesis, and I use ARCore's Image Tracking in AR Foundation. I read that Image Tracking isn't available in your plugin at the moment, but does that mean I cannot remotely access my console log when I start my scene? In other words, is the plugin useful for me at the moment?
     
  41. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    757
    Unfortunately, there is not much in it for you right now. I can issue a refund so you can buy the plugin again when Image Tracking is added.

    You can access device console logs in the Editor, but that's a built-in Unity feature: make a Development Build and connect to your device via the Editor Console.
     
  42. jipsen

    jipsen

    Joined:
    May 22, 2018
    Posts:
    35
    When I run Face Tracking on Android, it doesn't use the front-facing camera, only the back-facing one. Any ideas why?
     
  43. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    757
    Is there a chance you're using AR Foundation 4.0 but did not add an ARFOUNDATION_4_0_OR_NEWER define to
    Project Settings -> Player -> Scripting Define Symbols?
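    For context, a hedged sketch of why the define matters: AR Foundation 4.0 added ARCameraManager.requestedFacingDirection, and code guarded by the symbol is compiled out when the symbol is missing, so the front camera is never requested (the component wiring here is illustrative):

    ```csharp
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    public class FrontCameraExample : MonoBehaviour
    {
        [SerializeField] ARCameraManager cameraManager;

        void Start()
        {
    #if ARFOUNDATION_4_0_OR_NEWER
            // Only compiled when the define is set in Scripting Define Symbols.
            cameraManager.requestedFacingDirection = CameraFacingDirection.User;
    #endif
        }
    }
    ```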
     
    jipsen likes this.
  44. jipsen

    jipsen

    Joined:
    May 22, 2018
    Posts:
    35
    HAH yes... That would be it. Thanks. Awesome remote by the way!!!
     
  45. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    757
    Great! Please leave an honest review after you spend some time with the plugin.
     
    jipsen likes this.
  46. jbboro

    jbboro

    Joined:
    Jul 3, 2017
    Posts:
    3
    This works great! Are there any plans to add Image Tracking support?

    Thanks.

    Edit: I see in previous posts that you don't have an ETA for Image Tracking yet. So this is just a +1 vote for it.
     
  47. hkbook

    hkbook

    Joined:
    Sep 9, 2017
    Posts:
    3

    Can I see the device in the Editor? Do I need to install any software on my phone?
     
  48. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    757
    Yes, the idea of the plugin is that you need to build a Sender scene once and run it on your mobile phone. Then you connect the Editor to your phone to receive AR data from it.
    Yes, I'm working on it :)
     
  49. jforder

    jforder

    Joined:
    Jun 30, 2012
    Posts:
    22
    Hey there, would it be possible to run this plugin to an iOS device (ARKit) from a PC? Feel like I know what the answer might be but fingers crossed :D Thanks
     
  50. KirillKuzyk

    KirillKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    757
    You need to make an iOS build once, and then you'll be able to use the plugin as usual. But building for iOS from a PC is a non-trivial task; you can use a virtual machine or a Hackintosh to achieve it.
     
    Last edited: Jun 15, 2020