
AR Foundation Remote | Test and debug your AR project in the Editor

Discussion in 'Assets and Asset Store' started by KyryloKuzyk, May 26, 2020.

  1. Bersaelor

    Bersaelor

    Joined:
    Oct 8, 2016
    Posts:
    110
    It seems that after I updated to 2.0.27 the issue went away.
    I do have another ongoing issue though: when I start my application with AR Remote attached, I get infrequent Unity Editor freezes, as if there is some kind of infinite loop. It doesn't happen when I run the app natively, or when I run it in the Editor without AR Remote attached.
    I tried attaching Visual Studio to the app running in debug mode so I could pause and inspect where exactly the freeze is happening, but I didn't get good results. @PraetorBlue from the Unity Discord helped me figure out how to do that.
     
  2. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,070
    Have you figured out the exact cause of the problem yet? Does your project build fine without the plugin?
     
  3. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,070
    Do the freezes happen in some particular scenes? Or do they happen with any AR setup, even with the most basic one?

    The reason for small freezes may be the way the plugin works in some cases. The majority of the AR Foundation API is event-based, but some APIs are synchronous. Examples of synchronous APIs are:
    - ARCameraManager.currentConfiguration
    - ARCameraManager.GetConfigurations()
    - ARAnchorManager.AddAnchor()
    - ARAnchorManager.TryRemoveAnchor()
    - ARRaycastManager.Raycast()

    To execute a synchronous call, the plugin blocks the main Unity thread, makes a network request to the AR Companion app, and unblocks the main thread after receiving the response. In general, these freezes are unnoticeable, but if you call synchronous APIs too often, you may start noticing them. It's impossible to debug the freeze with an external debugger because, from Unity's point of view, nothing is happening. Could this explain what you're experiencing?
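    To illustrate the point above, here is a minimal, hypothetical sketch (not the plugin's code; the class name and interval value are my own) of throttling one of the listed synchronous APIs, ARRaycastManager.Raycast(), so it isn't called on every frame:

```csharp
// Hypothetical illustration: throttle a synchronous AR Foundation call so
// the Editor is not blocked on a network round-trip every frame while remoting.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class ThrottledRaycaster : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager;
    [SerializeField] float interval = 0.2f; // seconds between synchronous calls

    readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();
    float nextRaycastTime;

    void Update()
    {
        if (Time.unscaledTime < nextRaycastTime)
            return; // reuse the previous 'hits' instead of blocking again

        nextRaycastTime = Time.unscaledTime + interval;
        var screenCenter = new Vector2(Screen.width / 2f, Screen.height / 2f);
        if (raycastManager.Raycast(screenCenter, hits, TrackableType.PlaneWithinPolygon))
        {
            // hits[0].pose is the closest hit; cache it for the skipped frames.
        }
    }
}
```

    The idea is simply to reuse the last result for a few frames instead of paying a network round-trip on each call.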
     
  4. ironcladlou

    ironcladlou

    Joined:
    Feb 3, 2018
    Posts:
    1
    Experienced software developer new to iOS, Unity, 3D and AR space, sorry if this is a silly question and thanks for any advice...

    My goal is to capture interior spaces to produce reasonably detailed fixed-point (e.g. Matterport/Street View-style) simulations or VR walkthroughs, onto which I'd like to interactively project 3D models for the design phase of simple custom construction projects in the acoustical engineering space. I don't need detailed, watertight, super-accurate meshes and CAD models for acoustical analysis at this point; I just need a way to rapidly iterate on the placement of e.g. acoustic sound panels on walls and ceilings, with realistic textures and representations of the space, for planning and aesthetic purposes and to produce renders.

    With my iPhone 14 Pro, Matterport produces photorealistic models that look good enough for me including decent plane detection, but of course doing anything with those models is a nightmare unless you pay them for point cloud/mesh exports, etc.

    DIY photogrammetry is an option (e.g. RealityCapture, Metashape, Apple's Object Capture API), but I haven't figured out a usable workflow/technique for that yet with the equipment I have.

    Point clouds are getting nearly good enough for me using the iPhone LiDAR and EveryPoint, Scaniverse, SiteScape, PolyCam, etc, especially when texture mapping onto them works out, which is very iffy. Not as good as I'd like yet, but an interesting approach.

    AR is something I recently started exploring by accident as a possible solution here, although my use case seems to be outside the norm. I was playing with ARKit and learned about private iOS APIs that enable recording mp4 video feeds with time-synced sensor metadata, which can be used for 100% accurate playback simulations via Xcode. However, the file format is private, the APIs to record and play back the replay files are private, etc. (see https://www.ittybittyapps.com/blog/2018-09-24-recording-arkit-sessions/ for experiments on this topic). It's all coupled to an interactive Xcode workflow and can't ever be published to the App Store. You can't control the playback in any way.

    What I was trying to figure out is whether it would in theory be possible to capture the full AR session from the iPhone (including the camera frames), replay it, and then explore the resulting state in various ways.

    So then I came across this Unity plugin, which seems to be using its own technique to capture the ARKit output (including the video feed?) and provide replays... What I'm now wondering is whether this plugin and Unity can enable something like what I'm envisioning. For example, walk through an interior (a conference room, a residential office space, a studio space), capture the space with a walkthrough, load that into a Unity scene, replay the session to the end so that the full world state is available, explore the space while sensor input is paused (as there is no more to replay), and use the plane detection (etc) to interact with the environment (e.g. place rectangular panel on a wall or ceiling). So in that final state, the sensor input would be effectively static, but perhaps it would somehow be possible to scrub back and forth through the video feed to visualize in realtime.

    Does that make any sense? Would the "movement"/interactive camera end up fixed to frames of the camera feed that was recorded? If so, no big deal — walk through and scan to map the surfaces, then back up to some predetermined perspectives to provide wide shots for renders. That sort of thing.

    Stupid? Impossible? Any advice or thoughts appreciated. It just struck me as unfortunate that the AR tooling gives the photorealism of a camera feed with all the sensor data but all the use cases assume real-time interactive applications and don't seem to enable any offline use cases.
     
  5. Mufifnman

    Mufifnman

    Joined:
    Oct 28, 2015
    Posts:
    1
    Hi all,

    Has anyone been able to get this working with Windows and Android? I don't see any tutorials with either of these devices at all.

    I am getting the companion app building, deploying, and connecting fine, but it doesn't seem to be sending basic tracking data back to the Editor. I see neither the AR Camera moving around in the scene as I move the phone, nor the image from my Android camera (to be clear, both of these happen in the built and deployed ARCore Android app). I notice that I am getting some d3d11 errors:

    d3d11: Creating a default shader resource view with dxgi-fmt=0 for a texture that uses dxgi-fmt=28
    d3d11: failed to create 2D texture shader resource view id=217 [D3D error was 80070057]


    These don't show up if I disable the 'Enable Video' checkbox in the ARFoundationRemote options:
    upload_2022-11-1_16-25-1.png
    So it seems these have something to do with this camera, though my DirectX debugging is rusty and I haven't had any luck tracking this down. Any tips would be appreciated!

    More info on setup for context:
    upload_2022-11-1_16-27-40.png
     


  6. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,070
    Windows + Android setup is supported. The setup is very similar for both iOS and Android, so the iOS tutorial is applicable to Android.

    Your issue is very similar to this bug report and may be connected to Unity 2022.1. Unfortunately, I currently don't have a Windows PC to debug this problem. If you have any additional details on the issue, please share them here or directly under the bug report.
     
  7. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,070
    My plugin does support session recordings. What it doesn't support is scrubbing the playback back and forth. With the plugin, it's only possible to pause or play the recording forward.

    I'm not totally sure what you're trying to achieve, but the recording will not contain all the data from the phone's sensors; instead, it will only contain the data received from the AR Foundation. For example, if your app wants to track planes, the session recording will contain the data to reproduce the scanned planes on the next scene launch. If your app scans meshes with the LiDAR sensor, the plugin will record all mesh changes produced by the ARMeshManager, and will replicate these changes on the next scene play.

    In other words, the Session Recording feature doesn't store an 'environmental snapshot'. This feature is designed to debug AR apps in a reproducible environment right in the Editor. You can pause the recording, fly with the Editor camera through your scene and debug all your scene objects. But no other additional features to what you already can achieve with the AR Foundation.
     
  8. jonkeee

    jonkeee

    Joined:
    Apr 8, 2014
    Posts:
    7
    I just installed the new plugin 2.0 and changed my input system to Input System (New).
    I'm using the XR Interaction Toolkit plugin to interact with trackables on AR planes.

    If I use the AR Companion app, I can't interact with these trackables anymore:
    not via mouse (Editor), and not via touch on the Android device.
    The UI is usable via mouse (Editor) but not via touch (Android device).

    If I build the app onto the device, all interaction works.

    How can I get the touch simulation working?

    Unity: 2021.3.10 on Windows 10
    ARTestDevice Android Tab Galaxy S7 Packages.jpg
     
  9. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,070
    1. Did you build a new AR Companion app after changing the Input Handling to Input System (New)? The AR Companion should be built with the same settings as your current AR project uses (AR Foundation packages, Input Handling, Color Space, Unity version, etc.).
    2. Do you have any errors or warnings in the Console?
     
  10. jonkeee

    jonkeee

    Joined:
    Apr 8, 2014
    Posts:
    7
    Thanks for your reply!
    I reviewed all the warnings:
    AR Foundation Remote: please set Editor View resolution to match AR device's resolution: (1600, 2560). Otherwise, UI and other screen-size dependent features may be displayed incorrectly. Current Editor View resolution: (1440, 2560)

    I fixed it by setting the Game View resolution to the device resolution. Now touch simulation via the device works fine!
     
    KyryloKuzyk likes this.
  11. lasse5720

    lasse5720

    Joined:
    Nov 4, 2021
    Posts:
    3
    Hello

    I am using AR Foundation Remote 2.0 on Android, and used the following tutorial:

    When I load any scene, whether it's an empty AR scene or one of the examples given by the asset, my phone shows a black screen and gives this error: d3d11: failed to create 2D texture shader resource view id=1614 [D3D error was 80070057]
    I have tried reinstalling everything, adjusting settings, and turning the AR Camera Background component on/off.
    To be more specific: the camera renders for a second, where everything works as it is supposed to, but then it turns to a black screen on both my phone and in Unity. The camera can still capture and render AR-related objects, but the world itself is just black.
     
  12. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,070
    Can you please try installing the recommended Unity version and the verified AR Foundation packages, and tell me if that fixes the issue?
    If the issue still persists, please file a detailed bug report here:
    https://github.com/KirillKuzyk/AR-Foundation-Remote-support/issues/new/choose
     
    lasse5720 likes this.
  13. lasse5720

    lasse5720

    Joined:
    Nov 4, 2021
    Posts:
    3
    I installed 2021.3.6f1 and everything debug-related seems to be working now.
    Thank you :)
     
    KyryloKuzyk likes this.
  14. andreabefuen

    andreabefuen

    Joined:
    Jan 28, 2018
    Posts:
    1
    Hello,

    I also have a similar problem to the one lasse5720 described. I tried to view the examples of AR Foundation Remote 2.0, but a white screen appears. It works for one second and then I get this white screen. I followed Dilmer's tutorial.
    I am using a MacBook Pro M1 and a Samsung Galaxy S22 Ultra.
    Unity version 2021.3.13f1 (Silicon LTS)

    Can you help me? Thanks in advance!
     


  15. Zulks

    Zulks

    Joined:
    Oct 31, 2022
    Posts:
    2
    Hi guys, is there any way to get a bigger image from TryAcquireLatestCpuImage(out XRCpuImage image) on an Android device? The biggest image that I can get for now is 640x480. =/
     
  16. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,070
    The largest available camera CPU image depends on the device model. You can find the specifications here; the relevant column is called 'Supports multiple GPU texture resolutions':
    https://developers.google.com/ar/devices

    You can also choose the desired camera resolution by using this method:
    https://docs.unity3d.com/Packages/c...etConfigurations_Unity_Collections_Allocator_

    Usage examples:
    https://forum.unity.com/threads/ar-foundation-camera-resolution.866743/#post-6374229
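    As a rough sketch of that approach (the class name below is made up for illustration), you could enumerate the configurations the device reports and select the one with the largest pixel count, which also affects the size of the CPU image:

```csharp
// Rough sketch: pick the camera configuration with the largest resolution.
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class MaxResolutionSelector : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager;

    void Start()
    {
        // GetConfigurations returns a NativeArray that must be disposed.
        using (var configs = cameraManager.GetConfigurations(Allocator.Temp))
        {
            if (configs.Length == 0)
                return;

            var best = configs[0];
            foreach (var config in configs)
            {
                if (config.width * config.height > best.width * best.height)
                    best = config;
            }
            // Applying a configuration may briefly restart the camera feed.
            cameraManager.currentConfiguration = best;
        }
    }
}
```

    Note that the maximum resolution listed is still capped by the device model, per the ARCore device list linked above.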
     
  17. Fangh

    Fangh

    Joined:
    Apr 19, 2013
    Posts:
    248
    If I want to build MyProject with Cloud Build, putting "AR_COMPANION" in the define symbols in Cloud Build will:
    1) Build the companion app with the name "MyProjectCompanionApp"?
    2) Build the companion app with the name "MyProject"?
    3) Build my app with the name "MyProject"?
    4) Build my app with the name "MyProjectCompanionApp"?

    Which one is true? Because I get behavior (4), which is weird.

    upload_2023-1-23_14-38-27.png
     
  18. newguy123

    newguy123

    Joined:
    Aug 22, 2018
    Posts:
    1,234
    Version 1.4.28

    I uninstalled 1.4.27 from my project, did some changes, then decided to install it again. I noticed the new version and installed that.
    Enabled the remote in XR settings:
    upload_2023-2-3_15-27-12.png

    Then installed it to the device using this button:
    upload_2023-2-3_15-27-33.png

    It installed on my Android device and gave me the IP to enter.
    I entered this IP in the settings, then ran.
    However, I'm getting these errors:
    upload_2023-2-3_15-29-7.png

    I'm on the same WiFi as before, when it was working.

    Any ideas?

    Unity 2021.3.18f1, URP, AR Foundation 5.0.3 (when it was working before, the project was on 2021.3.16f1)
     
  19. WayneVenter

    WayneVenter

    Joined:
    May 8, 2019
    Posts:
    56
    @KirillKuzyk your latest update, version 2.0.28-release.0, broke plane tracking.
    When remoting now, the feathered plane is "stuck" on the display while you move the device around (the AR Camera does not move).
    It worked perfectly in the previous release, with the exception of the touch screen input bug.
    Now we can't do any remote dev using plane tracking.
    Can you provide a link to the previous build, as this 2.0.28-release.0 is broken?

    The AR Camera does not move in the scene.

    See GIFs below.
    AR Foundation 5.0.3
    Unity 2021.3.17f1

    I'm also getting this warning, so all is still good with the device:
    Using session configuration 0x1
    Requested Features: World Facing Camera, Rotation and Orientation, Plane Tracking, Auto-Focus, Raycast
    Supported Features: World Facing Camera, Rotation and Orientation, Plane Tracking, Auto-Focus, Raycast
    Requested features not satisfied: (None)
     
    Last edited: Feb 4, 2023
  20. WayneVenter

    WayneVenter

    Joined:
    May 8, 2019
    Posts:
    56
    This is a video of the previous 2.0.27 release; you can see the camera moving in the scene. (Two separate apps, so I know it's working.)
     

    Attached Files:

    • s.gif
      s.gif
      File size:
      4.3 MB
      Views:
      55
    Last edited: Feb 4, 2023
  21. WayneVenter

    WayneVenter

    Joined:
    May 8, 2019
    Posts:
    56
    And here is the result after upgrading to 2.0.28: the AR Camera is "frozen", no movement.
     

    Attached Files:

    • z.gif
      z.gif
      File size:
      5.5 MB
      Views:
      61
  22. WayneVenter

    WayneVenter

    Joined:
    May 8, 2019
    Posts:
    56
     

    Attached Files:

    • q.gif
      q.gif
      File size:
      5.4 MB
      Views:
      58
  23. shi_meikou

    shi_meikou

    Joined:
    Aug 5, 2022
    Posts:
    1
    Nice to meet you.

    I'm using AR Foundation Remote, but how should I display the license information in my app?
     
  24. makaka-org

    makaka-org

    Joined:
    Dec 1, 2013
    Posts:
    873
    By default, when you are purchasing an asset on the Unity Asset Store, you don't need to display license info. If the target Unity Asset has a separate license file in the root folder, read it and follow the rules.
     
  25. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,070
    Hi, I'm very sorry for the late reply.
    Did you manage to solve the issue with the frozen camera?
    The 'Using session configuration' is not a warning; it's a normal log from the AR Foundation that describes the supported session configurations.

    I see your bug report here. If you have anything to add to help find the issue, please do so.
    https://github.com/KirillKuzyk/AR-Foundation-Remote-support/issues/53
     
  26. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,070
    Behavior (4) is the expected one: the plugin appends the suffix to distinguish your app from the AR Companion app. But it should build the companion app, of course, and not your app.
    Please check that you've added ARCompanion.unity to the scene list in the UCB configuration.
    Here is the full list of steps needed to build the companion app with UCB:
     
  27. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,070
    You don't have to display the plugin's license info in your app. AR Foundation Remote is an Editor extension tool, so just make sure you've purchased a license for every developer on your team that uses the plugin, and you're good to go.
     
  28. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,070
    Do you still have this issue? There is a 'synchronous call failed' exception in your console that happens before the connection error; it may be the cause of the failing connection.
     
  29. newguy123

    newguy123

    Joined:
    Aug 22, 2018
    Posts:
    1,234
    No, it's sorted, thanks. I didn't notice my mobile's WiFi was off.
     
    KyryloKuzyk likes this.
  30. makaka-org

    makaka-org

    Joined:
    Dec 1, 2013
    Posts:
    873
  31. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,070
    Unfortunately, over the past 2.5 years there has been little interest in HoloLens and Magic Leap support. Several people were able to get the plugin somewhat working with HoloLens after a lot of trial and error, but it wasn't of much use.

    Out of all the features of these devices, the plugin only supports Device Tracking (position/orientation) and Meshing (scene reconstruction). To add proper support, I need access to actual devices. But because the interest is so low, I'm unsure whether spending so much on devices is worth it.

    Besides that, Microsoft already has the Holographic Remoting app. I've seen many people complaining about it, but at least it's free. It would be doubly hard to compete with a free official product.

    Given all that, it's unlikely I'll be adding support for the mentioned devices.
     
    makaka-org likes this.
  32. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,070
    PLUGIN UPDATE POST #15
    Hi, forum!

    Today I released new versions 1.4.30 and 2.0.30 that address the long-standing issue with Unity Editor freezes on Apple Silicon chips.

    It turned out the cause of the issue is the call to Texture2D.GetNativeTexturePtr(). Please vote for this bug report so we can get an official fix soon.

    The plugin uses GetNativeTexturePtr() when receiving camera video from the AR Companion app. Eliminating this call completely is not possible because the AR Foundation API requires the native texture pointer.

    In this new update, I introduced caching, so Texture2D.GetNativeTexturePtr() is called as rarely as possible. Theoretically, the freeze can still happen because the problematic method call is still there, but in practice, I couldn't reproduce the freeze during a few days of stress testing.

    If you have an M1 or M2 processor, please go ahead and update to the latest version. There are also several small quality-of-life improvements in this release.

    Happy coding!
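    The caching idea described in the update can be sketched roughly like this (a minimal illustration under my own assumptions, not the plugin's actual code):

```csharp
using System;
using UnityEngine;

// Minimal illustration: query the native pointer only when the Texture2D
// instance changes, instead of calling GetNativeTexturePtr() every frame
// (that call is what stalls the Editor on Apple Silicon).
public class NativeTexturePtrCache
{
    Texture2D cachedTexture;
    IntPtr cachedPtr = IntPtr.Zero;

    public IntPtr GetPtr(Texture2D texture)
    {
        // Only re-query when the texture object was recreated.
        if (!ReferenceEquals(texture, cachedTexture))
        {
            cachedTexture = texture;
            cachedPtr = texture != null ? texture.GetNativeTexturePtr() : IntPtr.Zero;
        }
        return cachedPtr;
    }
}
```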
     
    Last edited: May 11, 2023
    makaka-org likes this.
  33. schmosef

    schmosef

    Joined:
    Mar 6, 2012
    Posts:
    851
    Hi, I bought this asset in December but I'm just getting around to playing with it now.

    Is there a link to the documentation somewhere? Does it get installed with the asset? I'm not seeing it.

    Edit: Never mind. I found the docs right after I posted my question.
     
    makaka-org and KyryloKuzyk like this.
  34. petey

    petey

    Joined:
    May 20, 2009
    Posts:
    1,771
    Hiya,
    I just installed the latest asset version and the available IP addresses are being cut off by the notch; also, it won't rotate to the other landscape mode. Is there another way to get that detail?

    Edit - Screenshot did the trick :)
     
    Last edited: Jun 20, 2023
    KyryloKuzyk likes this.
  35. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,070
    Can you please provide a photo of the notch covering the IP addresses? I couldn't replicate the issue on my iPhone.
    The IP address can also be found in the WiFi settings of your phone.
     
  36. TonyHoyle

    TonyHoyle

    Joined:
    May 19, 2022
    Posts:
    2
    I just installed this morning. I finally got the app built, but I'm struggling to work out how to set this up.

    The AR app runs in the Editor, and indeed allows me to look around. Weirdly, objects from the normal 3D scene are there (probably how things work; I've never done Unity AR before), but I can move the iPad around and have it reflected on the screen. Great.

    But there's no camera at all, it's just a black background. I've tried fiddling with the camera settings but don't really know what I'm doing.

    Every time it starts up there's an error:

    Background camera material doesn't contain property with name _textureY. Please ensure AR Companion app is running the same build target and same render pipeline as Editor.

    This says to me there's a config setting I'm missing, but I've drawn a blank working out what. As far as I know (the project was set up by someone who has since left), the cameras are all default Unity cameras, so I'm not sure what a background camera material is or how I'd add that property.
     
  37. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,070
    @TonyHoyle AR Foundation chooses the appropriate camera background material based on the target platform. If you're using an iOS device, then switch the Editor build target to iOS in the Build Settings window. Or switch to Android if you're using Android.
    This should fix the black camera background.
    upload_2023-6-26_17-9-12.png
     
  38. Panda29

    Panda29

    Joined:
    Jul 4, 2017
    Posts:
    7
    I got this error in the log for ARCore Cloud Anchors:

    AR Foundation Remote: only the current camera position is supported as parameter to EstimateFeatureMapQualityForHosting(). Please contact the AR Foundation Remote support if you need feature the map quality estimation from an arbitrary pose.

    Does anyone know how to deal with this?
     
  39. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,070
    Hi,

    The current plugin version displays this error if the ARSessionOrigin is not present in the scene or if ARSessionOrigin.camera is null. Please check that you have those in your scene. Please also note that ARCore Cloud Anchors are designed to work only with AR Foundation 4.x.

    If you're using AR Foundation 4.x and ARSessionOrigin.camera is set up correctly, but you still see the error, it means that you're not passing the current camera position to EstimateFeatureMapQualityForHosting(). I can address this limitation in the next plugin version if you can describe your use case.

    I'll improve this error message in the next plugin version because the current one may be misleading.
     
    schmosef likes this.
  40. Panda29

    Panda29

    Joined:
    Jul 4, 2017
    Posts:
    7
    Hi,

    It took me a while to double-check everything to make sure the issue could be addressed quickly. It turns out the issue comes from the camera Pose I passed to EstimateFeatureMapQualityForHosting(), which is built from the local position and local rotation of the camera. Just to make sure, I changed my code to pass the world position and world rotation, and the error no longer shows up. I had originally used the local pose because my ARSessionOrigin transform is not zero in every value, so passing the camera's world position and world rotation to the estimate method makes the map quality stay insufficient forever. Then I found out that ARCoreCloudAnchorsReceiver.EstimateFeatureMapQualityForHosting() compares the provided Pose argument with the camera's world position and rotation, which in my case will always fail because the camera's local transform no longer has the same values as its world transform.

    Secondly, I found another limitation. When resolving a cloud anchor (correct me if I am wrong), once it's successful it seems like ARFoundationRemote only returns the cloudAnchorId, but I also need the details of the ARCloudAnchor object. When I tried to get the ARCloudAnchor position and rotation, it returned only zero values, but I checked the ARCloudAnchor in the Inspector, and the transform was not all zeros. FYI, this is not an issue in the build.

    Hopefully, I provided good details for improving those limitations, if you need more please let me know. Or if you prefer me to write bug reports, I am happy to do so.

    Thanks for your support! Have a nice day!
     
    KyryloKuzyk likes this.
  41. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,070
    @Panda29 You're right, ARAnchorManagerExtensions.EstimateFeatureMapQualityForHosting() treats the pose relative to the ARSessionOrigin. But AR Foundation Remote compares the pose with the camera's world position, which is wrong.
    I fixed the issue in version 2.0.31-release.2. Please update and tell me if it fixes the issue for you. Thanks for the bug report!

    Hmm, the position/rotation values reported in the code should match the values you see in the Inspector. In fact, ARCloudAnchor.localPosition/localRotation should match the ARCloudAnchor.pose because these properties are updated in sync. I just checked it myself, and it worked as expected. Could you please show me the code you're using to get the position?
    Fixed, thank you!
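    Based on the discussion above, here is a minimal, heavily hedged sketch of passing a session-space camera pose to the quality estimator. It assumes AR Foundation 4.x with the ARCore Cloud Anchors package; the class name is made up, and the namespace of the extension method is an assumption:

```csharp
// Hedged sketch: build the pose from the camera's local position/rotation
// (session space), since the extension treats the pose relative to the
// ARSessionOrigin.
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using Google.XR.ARCoreExtensions; // assumed namespace for the extension method

public class HostingQualityChecker : MonoBehaviour
{
    [SerializeField] ARSessionOrigin sessionOrigin;
    [SerializeField] ARAnchorManager anchorManager;

    void Update()
    {
        // Assumes the AR Camera is a direct child of the ARSessionOrigin,
        // so its local transform is the session-space pose.
        var cam = sessionOrigin.camera.transform;
        var sessionPose = new Pose(cam.localPosition, cam.localRotation);
        var quality = anchorManager.EstimateFeatureMapQualityForHosting(sessionPose);
        Debug.Log($"Feature map quality: {quality}");
    }
}
```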
     
    Last edited: Aug 1, 2023
    Panda29 likes this.
  42. Panda29

    Panda29

    Joined:
    Jul 4, 2017
    Posts:
    7
    Awesome! I confirmed that the new version has fixed the camera pose issue.

    About the cloud anchor resolving issue, for your convenience in debugging: the code I am using is similar to the PersistentCloudAnchors sample, which uses the new ResolveCloudAnchorAsync() and then waits for the Result from the Promise. FYI, I did not use the persistent feature and only kept the cloud anchor alive for one day. I tested the sample and tried to print out the ARCloudAnchor pose and also the world position/rotation after successfully resolving. In the build, it resolved the cloud anchor correctly and printed out the correct values. However, when running with ARFoundationRemote, the good news is the cloud anchor still resolved precisely, but the values printed out for the pose were all defaults and didn't match what is shown in the Inspector.
     
    KyryloKuzyk likes this.
  43. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,070
    @Panda29 I was able to replicate the issue you described. The issue was that the ARCloudAnchor's pose was updated with a delay, so the values were wrong (default) right after the cloud anchor was resolved. This explains why you saw default values right after the cloud anchor resolution; but because the delay is small, the values in the Inspector were already correct by the time you looked at them.

    I fixed the issue in version 2.0.31-release.3. Now the cloud anchor's state is updated in sync with its pose, and the correct position and rotation are reported immediately. Huge thanks for reporting the issue!
     
    Panda29 likes this.
  44. Panda29

    Panda29

    Joined:
    Jul 4, 2017
    Posts:
    7
    KyryloKuzyk likes this.
  45. lilzubr

    lilzubr

    Joined:
    Oct 3, 2023
    Posts:
    3
    Hi! I'm using:

    Unity 2022.3.10f1
    AR Foundation 5.0.7
    URP 14.0.8
    AR Foundation Remote 2.0.33-release.0

    I am getting the black screen issue instead of the camera feed on play in the Editor/iPad. Weirdly, on the first run of the companion app it does work, but as soon as you stop and press play again, it doesn't. You have to uninstall the plugin, reinstall, and then build the companion app for it to work once. I have followed all the instructions on setting up URP. I just wondered if there's any way around this yet? If not, what are the recommended versions of Unity/URP/AR Foundation I should be using?

    Thanks!
     
  46. KyryloKuzyk

    KyryloKuzyk

    Joined:
    Nov 4, 2013
    Posts:
    1,070
    I just tested your setup myself, and it seems I was able to reproduce an issue similar to yours. If the 'Reload Domain' setting is disabled, then all subsequent scene launches after the last domain reload will produce this error, and the camera will not work:
    Code (CSharp):
    ArgumentNullException: Value cannot be null.
    Parameter name: mesh
    UnityEngine.Rendering.CommandBuffer.DrawMesh (UnityEngine.Mesh mesh, UnityEngine.Matrix4x4 matrix, UnityEngine.Material material, System.Int32 submeshIndex, System.Int32 shaderPass, UnityEngine.MaterialPropertyBlock properties) (at /Users/bokken/build/output/unity/unity/Runtime/Export/Graphics/RenderingCommandBuffer.cs:464)
    UnityEngine.Rendering.CommandBuffer.DrawMesh (UnityEngine.Mesh mesh, UnityEngine.Matrix4x4 matrix, UnityEngine.Material material, System.Int32 submeshIndex, System.Int32 shaderPass) (at /Users/bokken/build/output/unity/unity/Runtime/Export/Graphics/RenderingCommandBuffer.cs:469)
    UnityEngine.Rendering.CommandBuffer.DrawMesh (UnityEngine.Mesh mesh, UnityEngine.Matrix4x4 matrix, UnityEngine.Material material, System.Int32 submeshIndex) (at /Users/bokken/build/output/unity/unity/Runtime/Export/Graphics/RenderingCommandBuffer.cs:475)
    UnityEngine.Rendering.CommandBuffer.DrawMesh (UnityEngine.Mesh mesh, UnityEngine.Matrix4x4 matrix, UnityEngine.Material material) (at /Users/bokken/build/output/unity/unity/Runtime/Export/Graphics/RenderingCommandBuffer.cs:481)
    UnityEngine.XR.ARFoundation.ARBackgroundRendererFeature+ARCameraBackgroundRenderPass.Execute (UnityEngine.Rendering.ScriptableRenderContext context, UnityEngine.Rendering.Universal.RenderingData& renderingData) (at ./Library/PackageCache/com.unity.xr.arfoundation@5.0.7/Runtime/ARFoundation/ARBackgroundRendererFeature.cs:162)
    UnityEngine.Rendering.Universal.ScriptableRenderer.ExecuteRenderPass (UnityEngine.Rendering.ScriptableRenderContext context, UnityEngine.Rendering.Universal.ScriptableRenderPass renderPass, UnityEngine.Rendering.Universal.RenderingData& renderingData) (at ./Library/PackageCache/com.unity.render-pipelines.universal@14.0.8/Runtime/ScriptableRenderer.cs:1490)
    UnityEngine.Rendering.Universal.ScriptableRenderer.ExecuteBlock (System.Int32 blockIndex, UnityEngine.Rendering.Universal.ScriptableRenderer+RenderBlocks& renderBlocks, UnityEngine.Rendering.ScriptableRenderContext context, UnityEngine.Rendering.Universal.RenderingData& renderingData, System.Boolean submit) (at ./Library/PackageCache/com.unity.render-pipelines.universal@14.0.8/Runtime/ScriptableRenderer.cs:1446)
    UnityEngine.Rendering.Universal.ScriptableRenderer.Execute (UnityEngine.Rendering.ScriptableRenderContext context, UnityEngine.Rendering.Universal.RenderingData& renderingData) (at ./Library/PackageCache/com.unity.render-pipelines.universal@14.0.8/Runtime/ScriptableRenderer.cs:1222)
    UnityEngine.Rendering.Universal.UniversalRenderPipeline.RenderSingleCamera (UnityEngine.Rendering.ScriptableRenderContext context, UnityEngine.Rendering.Universal.CameraData& cameraData, System.Boolean anyPostProcessingEnabled) (at ./Library/PackageCache/com.unity.render-pipelines.universal@14.0.8/Runtime/UniversalRenderPipeline.cs:658)
    UnityEngine.Rendering.Universal.UniversalRenderPipeline.RenderCameraStack (UnityEngine.Rendering.ScriptableRenderContext context, UnityEngine.Camera baseCamera) (at ./Library/PackageCache/com.unity.render-pipelines.universal@14.0.8/Runtime/UniversalRenderPipeline.cs:824)
    UnityEngine.Rendering.Universal.UniversalRenderPipeline.Render (UnityEngine.Rendering.ScriptableRenderContext renderContext, System.Collections.Generic.List`1[T] cameras) (at ./Library/PackageCache/com.unity.render-pipelines.universal@14.0.8/Runtime/UniversalRenderPipeline.cs:369)
    UnityEngine.Rendering.RenderPipeline.InternalRender (UnityEngine.Rendering.ScriptableRenderContext context, System.Collections.Generic.List`1[T] cameras) (at /Users/bokken/build/output/unity/unity/Runtime/Export/RenderPipeline/RenderPipeline.cs:52)
    14. UnityEngine.Rendering.RenderPipeline.InternalRender (UnityEngine.Rendering.ScriptableRenderContext context, System.Collections.Generic.List`1[T] cameras) (at /Users/bokken/build/output/unity/unity/Runtime/Export/RenderPipeline/RenderPipeline.cs:52)
    15. UnityEngine.Rendering.RenderPipelineManager.DoRenderLoop_Internal (UnityEngine.Rendering.RenderPipelineAsset pipe, System.IntPtr loopPtr, UnityEngine.Object renderRequest, Unity.Collections.LowLevel.Unsafe.AtomicSafetyHandle safety) (at /Users/bokken/build/output/unity/unity/Runtime/Export/RenderPipeline/RenderPipelineManager.cs:132)
    16. UnityEngine.GUIUtility:ProcessEvent(Int32, IntPtr, Boolean&) (at /Users/bokken/build/output/unity/unity/Modules/IMGUI/GUIUtility.cs:190)
    This error originates in ARBackgroundRendererFeature.cs and first appeared in AR Foundation 5. The current workaround is to enable the 'Reload Domain' setting (Edit > Project Settings > Editor > Enter Play Mode Settings).
    Using the latest recommended versions is typically the safest choice, so you're doing everything right.
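
    If you want to guard against this setting getting switched off again by accident, a small editor script can re-enable domain reload automatically. This is only a sketch built on Unity's EditorSettings API; the class name and log message are my own and not part of the plugin:
    Code (CSharp):
    // Editor-only sketch: keep 'Reload Domain' enabled, since disabling it
    // triggers the null-mesh error in ARBackgroundRendererFeature.
    // Place this file inside an "Editor" folder.
    using UnityEditor;
    using UnityEngine;

    static class DomainReloadGuard // hypothetical helper, not part of the plugin
    {
        [InitializeOnLoadMethod]
        static void EnsureDomainReload()
        {
            // 'Reload Domain' is off when Enter Play Mode Options are enabled
            // and the DisableDomainReload flag is set.
            if (EditorSettings.enterPlayModeOptionsEnabled &&
                EditorSettings.enterPlayModeOptions.HasFlag(EnterPlayModeOptions.DisableDomainReload))
            {
                EditorSettings.enterPlayModeOptions &= ~EnterPlayModeOptions.DisableDomainReload;
                Debug.Log("Re-enabled 'Reload Domain' to avoid the AR camera black screen issue.");
            }
        }
    }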

    Please also check that you've completed all these URP setup steps:
    https://forum.unity.com/threads/arf...k-screen-and-no-tracking.915527/#post-6374280
     
    lilzubr likes this.
  47. lilzubr

    lilzubr

    Joined:
    Oct 3, 2023
    Posts:
    3

    Hi,
    Thanks for the reply. I have triple-checked all my URP settings and they're all set up correctly (AR scenes work when built to the device). I didn't, however, have 'Reload Domain' enabled. Unfortunately, enabling it didn't help, and I'm still getting the blank camera issue. I have tried everything and am testing with your examples.

    The error I am consistently getting is:
    Code (CSharp):
    AssertionException: Assertion failure. Value was Null
    Expected: Value was not Null
    UnityEngine.Assertions.Assert.Fail (System.String message, System.String userMessage) (at /Users/bokken/build/output/unity/unity/Runtime/Export/Assertions/Assert/AssertBase.cs:29)
    UnityEngine.Assertions.Assert.IsNotNull[T] (T value, System.String message) (at /Users/bokken/build/output/unity/unity/Runtime/Export/Assertions/Assert/AssertNull.cs:50)
    UnityEngine.Assertions.Assert.IsNotNull[T] (T value) (at /Users/bokken/build/output/unity/unity/Runtime/Export/Assertions/Assert/AssertNull.cs:38)
    ARFoundationRemote.RuntimeEditor.MeshingReceiverNative.OnEnable () (at ./Library/PackageCache/com.kyrylokuzyk.arfoundationremote@ee1d1994ed58/RuntimeEditor/Meshing/MeshingReceiverNative.cs:46)
    UnityEngine.GameObject:AddComponent()
    ARFoundationRemote.Runtime.Utils:CreatePersistentObject() (at ./Library/PackageCache/com.kyrylokuzyk.arfoundationremote@ee1d1994ed58/Runtime/Utils/Utils.cs:35)
    ARFoundationRemote.RuntimeEditor.MeshingReceiverNative:initOnLoad() (at ./Library/PackageCache/com.kyrylokuzyk.arfoundationremote@ee1d1994ed58/RuntimeEditor/Meshing/MeshingReceiverNative.cs:38)
    I am also going to post my URP/editor settings just in case that helps.

    Screenshot at Oct 18 11-35-06.png URP_Graphics.png URP_Pipeline Asset.png URP_Quality.png URP_Render Data.png

    If there's anything else you can suggest trying, I would be very grateful. This could save me so much time in the future! :)

    Thanks.
     
  48. Araziel666

    Araziel666

    Joined:
    Aug 5, 2023
    Posts:
    1
    Does anyone recognize this error message?
    Code (CSharp):
    AR Foundation Remote Companion App: Assert: OPENGL NATIVE PLUG-IN ERROR: GL_OUT_OF_MEMORY: Not enough memory left to execute command
    When I start my game in the Editor, the whole screen of my app fills with this message, and the app also freezes most of the time.
    I have a Google Pixel 6 and have also done a factory reset.