Discussion in 'Assets and Asset Store' started by KirillKuzyk, May 26, 2020.
Thank you for reporting the bug!
Please update to version 2.0.4 for the fix.
When a texture is unpacked into memory, its size can easily increase tenfold.
I've just increased the max network message size. Please update to version 2.0.4 and tell me if it works for you.
The newest version of the plugin now supports Occlusion CPU images.
This feature is available in both AR Foundation Editor Remote and AR Foundation Remote 2.0.
Your feedback is very much appreciated!
I have confirmed that the bug has been fixed in version 2.0.4.
Thank you for the quick response!
Hey, thanks a lot for the Session Recording feature. Is it possible to record a session on the phone without being connected to the Unity Editor? Say I go to a specific place, record the session on my phone to a file, and later play it back in the Unity Editor.
I have just installed the AR Remote asset, but I have a problem with the Location Services example. Although I enabled location on my device and also added ENABLE_AR_FOUNDATION_REMOTE_LOCATION_SERVICES
to the scripting define symbols, I get this error message:
No, recording sessions "in the field" is not supported yet. I have this feature on my roadmap, but I currently don't know how to make it right.
For example, ARKit and ARCore can record sessions without being connected to a computer. But their recording implementation covers only the simplest case: the session configuration is fixed (only planes and cloud points) and can't be changed during the recording process.
My implementation is far more versatile. With my plugin, you can record all supported features, and you can change the session configuration during the recording. For example, you can start with plane tracking on a world-facing camera, then switch to face tracking on a user-facing camera, then switch back to the world-facing camera and enable image tracking. To support this, two-way communication happens between the Unity Editor and the AR Companion app while you record a session. So to support "in the field" recording, I would have to simulate the behavior of the target AR app and the sequence of AR Foundation API calls it produces. There are two possible ways to achieve this:
First, make a recording of a user's app with an AR device connected to the computer to capture its behavior and session configuration changes. Then, use the recorded behavior "in the field". This method is the most flexible, but it would be very hard to explain to users.
Or let users choose from a pre-defined list of configurations and sacrifice the flexibility of the previous method.
I'm sorry if this explanation was overloaded with technical details; I just wanted to say that implementing in-the-field recording is hard, and I don't want to ship a half-baked feature until I come up with a good solution.
Please make sure you made a new AR Companion build after adding the ENABLE_AR_FOUNDATION_REMOTE_LOCATION_SERVICES define.
I had to gate the Location Services feature behind this define because it requires additional app permissions, and that is not suitable for all of my customers: not all apps use GPS.
Works after a new build!! Thanks!
Ok, so I filed a bug report and the Unity team has confirmed the issue with BinaryFormatter in Unity 2021.2.
This issue prevents the plugin from working on iOS in Unity 2021.2 and it's possible to track its status on the issue tracker.
Thanks for the tip. In some cases I get drift in the object position, like you see below (after scanning a large zone and coming back).
How can I prevent drift like this?
Do you encounter this behavior in a real build or in the Unity Editor?
Currently, my plugin doesn't synchronize the camera video with the camera pose. And because the camera pose is updated more frequently than the camera video, you may see a discrepancy. I'll soon add a setting so users can sacrifice the camera pose update frequency and keep it in sync with the video.
Hi, in a real build.
Please check that you place your augmented objects under the ARSessionOrigin object.
But it may also be a limitation of ARKit: it can lose precision if there are not enough feature points or the lighting is bad.
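As a quick sketch of what "under the ARSessionOrigin" means in code (the class and field names in this snippet are illustrative, not from the plugin):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical helper: spawn AR content as a child of the ARSessionOrigin
// so it stays registered with the session's tracking space.
public class SpawnUnderSessionOrigin : MonoBehaviour
{
    [SerializeField] ARSessionOrigin sessionOrigin;
    [SerializeField] GameObject contentPrefab;

    public void Spawn(Pose pose)
    {
        // Parent to the session origin instead of the scene root;
        // otherwise session-space adjustments won't move your content.
        Instantiate(contentPrefab, pose.position, pose.rotation, sessionOrigin.transform);
    }
}
```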
What I would like to do is use the AR Companion app with the device's GPS location. I'm not even sure if this is possible, TBH, but it looks like it is?
I have added the ENABLE_AR_FOUNDATION_REMOTE_LOCATION_SERVICES define symbol in both the Standalone and iOS (building on a Mac to iPhone) player settings and rebuilt the companion app but when running my scene a location of 0,0 is reported as my GPS coordinates. Also, permission to use location when first launching the companion app does not show which means to me that location services are not a part of the companion app build.
Is it possible to get the GPS location of the phone using the companion app? If so, what am I doing wrong here?
Does the LocationServicesExample.unity scene work for you?
GPS will ask for permission and start returning correct values only after a call to Input.location.Start().
Please also make sure you've added this using statement on top of your script:
using Input = ARFoundationRemote.Input;
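For reference, a minimal sketch of that flow, assuming ARFoundationRemote.Input mirrors UnityEngine.Input's location API (which the using alias suggests):

```csharp
using System.Collections;
using UnityEngine;
using Input = ARFoundationRemote.Input; // route location calls through the plugin

public class LocationExample : MonoBehaviour
{
    IEnumerator Start()
    {
        Input.location.Start(); // triggers the permission prompt on the device

        // Wait until the service finishes initializing
        while (Input.location.status == LocationServiceStatus.Initializing)
            yield return null;

        if (Input.location.status == LocationServiceStatus.Running)
        {
            var data = Input.location.lastData;
            Debug.Log($"GPS: {data.latitude}, {data.longitude}");
        }
        else
        {
            Debug.LogWarning("Location Services failed to start or were denied.");
        }
    }
}
```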
It does work. I didn't see that example scene there. Doh!
Thanks for the help!
I followed the steps in the tutorial video "Unity AR Foundation Editor Remote Plugin Now Available!".
But the pose of my sloth is frozen in the Scene view of the BlendshapeExample scene.
And when I switch to the Game view again, it looks like its motion speeds up for a while.
Could anybody tell me how to keep tracking in the Scene view, just like in the tutorial video?
Please align the Scene and Game windows side-by-side, so both windows are visible. The plugin can only send updates if the Game window is visible.
Which version of AR Foundation Remote would work with this combination of packages in Unity 2019.4:
Is this possible at all?
I would like to try rebuilding a copy of my project in 2019.4 in pursuit of a solution for my segmentation fault issue, and I am bound to these early versions by another package. I cannot update AR Foundation and ARCore to 3.0.1 as demanded by the latest version of AR Foundation Remote.
Unfortunately, the 3.0.1 requirement was always there, even in the first version of the plugin. There were so many breaking changes between AR Foundation 2.1 and 3.0 that I decided from day one it wasn't worth supporting AR Foundation versions older than 3.0. I would like to add that 3.0.1 is only the minimum requirement; any newer version will work with the plugin.
Regarding the googlesamples/arcore-depth-lab GitHub repo ("ARCore Depth Lab is a set of Depth API samples that provides assets using depth for advanced geometry-aware features in AR interaction and rendering", UIST 2020):
Does the remoting tool support the full ARCore depth feature set / samples listed in the repo, including those not yet available in the AR Foundation SDK? Thanks!
A more correct question would be "can googlesamples/arcore-depth-lab repo work in the Editor?" And the answer is "partially".
My plugin does support the Depth API provided by the AR Foundation, including Depth CPU images as of version 1.4.12-release.0. But the arcore-depth-lab repo uses Android-specific shaders that can't work in the Editor. So only OrientedReticle and AvatarLocomotion scenes work in the Editor with the help of AR Foundation Remote.
Also, there are a few minor fixes you need to apply for arcore-depth-lab to work in the Editor:
Remove the #if !UNITY_EDITOR guard from DepthDataSourceConfig.cs.
Delete or comment out the whole Recorder.cs file because the ARCore Recording API is not supported by my plugin.
Two additional questions:
Why does the AR Foundation Remote restrict the number of XRCpuImage instances existing at any given time to only 1? I can build and run a project with several XRCpuImages living at the same time. So where does the limitation in the Remote come from?
Is there a way to wait for the arrival of an XRCpuImage from the companion app before calling code that relies on the image to be ready? I.e. some bool that I can use for WaitUntil in a coroutine – or something along those lines? Higher resolution images (720p and up) take too long to arrive, which blocks me from testing with them in the Remote.
1. This limitation comes from the network delay and the fact that the plugin acquires the CPU image lazily. The first time you request a CPU image by calling TryAcquireCpuImage(), the plugin tells the AR Companion app to start sending CPU images. Only after a round trip to the companion app will TryAcquireCpuImage() start to return true.
To remove this limitation, I would have to block the Editor's main thread and wait for the CPU image to be downloaded, but this would be a horrible user experience and would make the Unity Editor's framerate unbearable. So I decided to introduce the limitation of one XRCpuImage at a time.
2. I assume you're speaking about XRCpuImage.Convert(), right? There is currently no way to check whether the plugin has finished downloading the CPU image before making a synchronous image conversion. But in a real build, it's also not possible to tell in advance with 100% confidence that a call to TryAcquireCpuImage() followed by XRCpuImage.Convert() will succeed: TryAcquireCpuImage() can return false at any time, and XRCpuImage.Convert() can throw an exception at any time.
One possible solution is to use the XRCpuImage.ConvertAsync() method (coroutine or callback version). With async conversion, you'll be notified as soon as the image is ready.
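A sketch of the coroutine version, based on AR Foundation's documented XRCpuImage.ConvertAsync pattern (acquisition method names such as TryAcquireLatestCpuImage vary slightly between AR Foundation versions):

```csharp
using System.Collections;
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class CpuImageReader : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager;

    public IEnumerator ReadImage()
    {
        // May return false until the first image has arrived from the companion app
        if (!cameraManager.TryAcquireLatestCpuImage(out XRCpuImage image))
            yield break;

        using (image)
        {
            var conversionParams = new XRCpuImage.ConversionParams(image, TextureFormat.RGBA32);
            using (var request = image.ConvertAsync(conversionParams))
            {
                // Yield until the image has been downloaded and converted
                while (!request.status.IsDone())
                    yield return null;

                if (request.status == XRCpuImage.AsyncConversionStatus.Ready)
                {
                    NativeArray<byte> data = request.GetData<byte>();
                    Debug.Log($"Converted image: {data.Length} bytes");
                }
            }
        }
    }
}
```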
After installing everything, I got an error:
AssertionException: xrGeneralSettings != null
Assertion failure. Value was Null
Expected: Value was not Null
UnityEngine.Assertions.Assert.Fail (System.String message, System.String userMessage) (at /Users/bokken/buildslave/unity/build/Runtime/Export/Assertions/Assert/AssertBase.cs:29)
UnityEngine.Assertions.Assert.IsNotNull (UnityEngine.Object value, System.String message) (at /Users/bokken/buildslave/unity/build/Runtime/Export/Assertions/Assert/AssertNull.cs:58)
UnityEngine.Assertions.Assert.IsNotNull[T] (T value, System.String message) (at /Users/bokken/buildslave/unity/build/Runtime/Export/Assertions/Assert/AssertNull.cs:46)
ARFoundationRemote.Runtime.Global.isPluginEnabledInXRManagementWindow () (at Library/PackageCache/com.kyrylokuzyk.arfoundationremote@20c19dfb4070-1631522281000/Runtime/Utils/Global.cs:32)
ARFoundationRemote.Runtime.Global.IsPluginEnabled () (at Library/PackageCache/com.kyrylokuzyk.arfoundationremote@20c19dfb4070-1631522281000/Runtime/Utils/Global.cs:25)
ARFoundationRemote.Editor.AnchorSubsystem.RegisterDescriptor () (at Library/PackageCache/com.kyrylokuzyk.arfoundationremote@20c19dfb4070-1631522281000/Editor/Subsystems/AnchorSubsystem.cs:12)
Thanks! The plugin is really awesome!
This error can appear if the 'XR Plugin Management' window has never been opened.
Please make sure to enable the 'AR Foundation Remote' provider:
I'll fix the error in the next version and display a tutorial message. Thank you for reporting the bug!