Discussion in 'Handheld AR' started by mdurand, May 15, 2019.
The first prefab appears again instead of the second prefab.
Is there a predicted date for support of 3D object tracking in ARCore?
Appreciate the effort but please put more resources into getting these basics working asap! It would be nice to not have to waste so much time building constantly due to the lack of sufficient simulation tools.
Are there tongueOut and eye-gaze tracking examples available for ARFoundation? They were included in the ARKit Plugin (see link).
Especially when 8th wall has it working!
@Unity: have you noticed that the sloth example is missing eyeSquint_R? There's only eyeSquint_L
I cannot stress this enough. I can't work with ARFoundation + ARKit. I have a very specific user flow which worked fine with Unity's Bitbucket ARKit repo but doesn't with ARFoundation. I have wasted 3 weeks on this. I can see some potential when it comes to scaling, but I have to say the UX is horrible. Too many things have been simplified (IMO) which never should have been. On top of that, I don't see Unity helping either. I have posted several questions on the Unity forum with no reply.
I have no choice but to stick with Unity's ARKit Bitbucket repo and work with just those API callbacks.
It's a great job the Unity team is doing, but the benefit of that great work quickly disappears if it takes tons of unnecessary hours to develop AR with Unity. Yes tons, really. Currently the lack of a remote is the biggest hurdle to developing AR with Unity; hopefully this gets the focus it deserves with the many surveys going around.
just use 8th wall if you don't need object tracking. My boss doesn't want to use it for some reason, so I have to cope(((
Issue: For some reason the package "email@example.com" is not added automatically to manifest.json. This leads to issues when using assembly definition files because the asmdef of the above package cannot be referenced (needed at least for TrackableType in ARRaycastManager::Raycast). I checked with the ARFoundation samples and their manifest also doesn't contain it.
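The missing-dependency issue above can be worked around by declaring the package manually. The package name in the post was garbled, but since `TrackableType` lives in the `UnityEngine.XR.ARSubsystems` namespace, the sketch below assumes the package in question is `com.unity.xr.arsubsystems` (version numbers are illustrative) and that your scripts live in an asmdef named `MyARScripts`:

```json
// Packages/manifest.json — declare the dependency explicitly
{
  "dependencies": {
    "com.unity.xr.arfoundation": "2.1.0-preview.3",
    "com.unity.xr.arsubsystems": "2.1.0-preview.1"
  }
}
```

With the package resolved, the asmdef can reference its assembly by name:

```json
// MyARScripts.asmdef
{
  "name": "MyARScripts",
  "references": ["Unity.XR.ARFoundation", "Unity.XR.ARSubsystems"]
}
```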
Apart from that I also hope for the remote feature as soon as possible, would be my #1 prio.
Can someone point me to a way of using EnvironmentalHDRWithReflections from the latest ARCore release? Light estimation works on iOS in ARFoundation 2.1 but doesn't work on Android, despite being present in the ARCore Unity samples. Can I somehow access the native ARCore session, bypassing ARFoundation, to receive the cubemap?
Cubemap cubemap = m_NativeSession.LightEstimateApi.GetReflectionCubemap(/* … */);
Just wondering, I'm working on a small app utilizing image tracking on an AR Core device, and tracking seems to work fine, but my prefab jitters all over the place whenever I start tracking. After looking away, and trying again a few times, it eventually seems to calm down, but I'm not sure what I can do to fix that crazy jitter. I'm not sure if it's my tracking image, or not, as I can't seem to find any sort of value representing how well it can track, like I can with just the AR Core system. Anyone have any advice or suggestions?
Yeah, don't do that. Image tracking is very unstable -- use the image to get a position and rotation, and send your object there with code, but once your object is there it shouldn't be updated by image tracking again. DO NOT parent your objects to your markers. Vuforia started the practice when SLAM tracking wasn't available and it's been carried into every AR image target demo since, but it's straight up bad practice.
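The "place once, don't parent" advice above can be sketched roughly like this against the ARFoundation 2.x `ARTrackedImageManager` API (class and field names here are mine, not from the thread):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class PlaceOnceOnImage : MonoBehaviour
{
    [SerializeField] ARTrackedImageManager m_ImageManager; // assign in the Inspector
    [SerializeField] GameObject m_Content;                 // the object to place

    // Remember which reference images we have already used.
    readonly HashSet<string> m_Placed = new HashSet<string>();

    void OnEnable()  { m_ImageManager.trackedImagesChanged += OnChanged; }
    void OnDisable() { m_ImageManager.trackedImagesChanged -= OnChanged; }

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (var image in args.updated)
        {
            // Take the pose from the marker exactly once, then let SLAM
            // hold the object in place instead of the jittery image pose.
            if (m_Placed.Contains(image.referenceImage.name))
                continue;

            m_Content.transform.SetPositionAndRotation(
                image.transform.position, image.transform.rotation);
            m_Placed.Add(image.referenceImage.name);
        }
    }
}
```

Because the content is never parented to the tracked image, later jitter in the image pose no longer moves it.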
8th wall's pricing structure is near predatory. Vuforia has object tracking, should be better anyways.
it's free for Unity; Vuforia, for its part, is crazy expensive and doesn't have a remote for ARKit/ARCore
Hello guys. I am testing image tracking with ARKit.
I used the following:
- ARFoundation 2.1.0
- ARKit 2.1.0
- Unity 2019.1.9
The tracking is not working.
I tested with ARCore and it is working pretty well.
I don't get any errors.
Try upgrading your ARKit XR Plugin package to 2.1.0-preview.3, AR Subsystems to 2.1.0-preview.1, and ARFoundation to 2.1.0-preview.3. Image tracking was added there according to the changelog. Beware that upgrading to 2.2 may lead to build failures with an older ARFoundation package.
Actually I solved it myself. It was a setting in Xcode; I re-linked the image database there. For some reason it didn't load the file.
I dipped my toe into ARFoundation recently, as the iOS ARKit repo was deprecated. I very much appreciate the work going into the cross-platform solution. However, I feel that while there are some benefits to being able to publish on both Android and iOS, there's an "AR arms race" going on here, and having a separate solution per platform does still carry some merit; as both platforms continue to outgun each other, new and existing features are withheld for compatibility in ARFoundation.
With separate solutions we benefited from a feature-complete base solution for each platform.
On your ARFoundation documentation page you list what it can do... plane detection, face detection, etc. If these are not available on both ARKit and ARCore (talking about object recognition, ARKit only), please say so!!!! I now look like a complete idiot for suggesting a solution that won't work. You've listed it as ARKit-only in this thread, which is helpful, but everything links to the documentation from the Package Manager, so I never saw this.
also 100% agree with hawken above; love the work going into a single cross-platform solution, but the speed of development from the platform holders is outpacing you guys and we're losing features
I prefer this cross-platform solution. The last few projects I had to develop for both ARKit and ARCore simultaneously and it was a right nightmare juggling both platforms together, and getting them to work and look identical. Thank heavens for ARFoundation.
I would like to host an image library for 2D image tracking on a remote server and pull it during runtime. Is something like this possible? Or is there a way to add more image targets without rebuilding the project?
In case you haven't looked into Addressables? https://docs.unity3d.com/Packagesfirstname.lastname@example.org/manual/index.html
I haven't heard of this yet, thanks. Does this mean that I can make the ReferenceImageLibrary an addressable and load it at runtime?
Also, will the size of the Reference Image Library affect the speed / reliability of the image recognition? For example, if I have an image library of 10 images vs one with 1000 images. Thanks!
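Beyond Addressables, later ARFoundation releases (3.x and up) added a mutable runtime image library that can answer the original question directly. A rough sketch, assuming a hypothetical marker URL and that the platform's provider supports mutable libraries:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class RuntimeImageLibraryLoader : MonoBehaviour
{
    [SerializeField] ARTrackedImageManager m_Manager;

    IEnumerator Start()
    {
        // Download a marker image from a server (URL is hypothetical).
        using (var request = UnityWebRequestTexture.GetTexture("https://example.com/marker.png"))
        {
            yield return request.SendWebRequest();
            if (request.isNetworkError || request.isHttpError)
                yield break;

            var texture = DownloadHandlerTexture.GetContent(request);

            // Create a library we can add to at runtime; not every provider
            // supports this, so check the concrete type first.
            var library = m_Manager.CreateRuntimeLibrary();
            if (library is MutableRuntimeReferenceImageLibrary mutable)
            {
                // Name and physical width (meters) are illustrative values.
                mutable.ScheduleAddImageJob(texture, "server-marker", 0.1f);
                m_Manager.referenceLibrary = library;
            }
        }
    }
}
```

This avoids rebuilding the project to add image targets, at the cost of requiring a newer ARFoundation than the 2.1 versions discussed in this thread.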
Hi guys, I want to know exactly what the ReferencePoint does in ARFoundation, because we don't need it to bind objects in AR space, as shown in the PlaceOnPlane.cs script.
So what does it do? Why and when should it be used? Thank you.
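For context: a reference point (later renamed "anchor") asks the session to keep a pose pinned to the real world as tracking refines, whereas a plain transform can drift. A minimal sketch against the ARFoundation 2.x API (class and method names below other than `ARReferencePointManager.AddReferencePoint` are mine):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class AnchorOnTap : MonoBehaviour
{
    [SerializeField] ARReferencePointManager m_ReferencePointManager;
    [SerializeField] GameObject m_Prefab;

    // Call with a pose from a successful raycast (e.g. ARRaycastManager.Raycast).
    public void PlaceAnchored(Pose hitPose)
    {
        // The session updates the reference point's transform over time
        // to keep it locked to the same real-world spot.
        ARReferencePoint point = m_ReferencePointManager.AddReferencePoint(hitPose);
        if (point != null)
            Instantiate(m_Prefab, point.transform);
    }
}
```

For short-lived content PlaceOnPlane.cs gets away without one; anchoring matters when objects must stay put while the device's understanding of the world is corrected.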
Hi, @tdmowrer ,
Is there any alternative way of loading images and reference objects dynamically at runtime using native code?
In UnityARKitPlugin 1.5 and 2.0 there was native code for this, but for ARKit 3 it seems it is not possible.
Yes, everything about the new APIs has been made ridiculously complicated.
The hello world "Simple AR" uses 10 components. Are you serious?
No remote. No in editor testing. But hey, at least we can load a world map on iOS only. FFS.
Who cares about how long it actually takes developers to make stuff, as long as you keep checking off features on some milestone schedule somewhere.
Man I wish the old ARInterface still worked, simple, straightforward, extremely fast dev cycle.
Even simple things like seeing an in-editor version of your plane object have been broken. It's all calculated at runtime now, because apparently they didn't think it would be useful to preview things in the editor?
There's zero thought put into dev workflow with this new API. One of the most overwrought and least productive APIs I've ever seen. Boilerplate everywhere, 60 different components, and a total lack of decent documentation. Sounds fun.
Like, you guys do realize this is a plugin for UNITY, right? An engine that takes like 30s-1m to deploy onto a device? vs native, which is like 5 seconds?
Do you guys also realize that debugging Android with Unity is a slow and cumbersome process?
Maybe you should actually care about the fact that round-tripping is expensive, and quick debugging important, and therefore it's extremely high priority that we can test things on the desktop and in editor.
Despite "hearing" me a few months ago, it appears no effort at all has been made in this area. Thanks for "listening".
Also, you guys might want to stop ignoring your community and actually get AR Remote working again.
When can we expect the next release?
After updating to ARFoundation 2.1 (or the 2.2 preview), the camera view is all black. Tracking works and feature points do appear, but the background is all black.